Robotics: Science and Systems 2017 Workshop

Morality and Social Trust in Autonomous Robots

July 16, 2017, Massachusetts Institute of Technology, Cambridge, MA, USA
An affiliated workshop of RSS 2017
Submission deadline May 26, 2017
Notification June 02, 2017
Camera ready July 07, 2017
Workshop July 16, 2017

Welcome

We invite researchers attending RSS 2017, and others, to participate in this workshop on morality and trust in autonomy. The workshop will include a mix of invited talks, panel discussions, and contributed poster presentations. Poster submissions are encouraged from researchers and students on any topic within the theme of the workshop. The main goal of the workshop is to shed light on the little-understood notions of morality, ethics, and trust in autonomous robots from various perspectives. We have therefore invited a number of experts from academia and industry whose work focuses on the intersection of technology with sociology, philosophy, ethics, or logic to share their views through talks and panel discussions.

Morality, Ethics, and Trust in Autonomy - Why?

Robots are becoming members of our society. Increasingly sophisticated algorithms have given robots rising levels of autonomy, enabling them to leave their traditional workplaces in factories and enter a society with complex social rules, relationships, and expectations. Driverless cars, home assistive robots, and unmanned aerial vehicles are just a few examples. As such systems become more involved in our daily lives, their decisions affect us more directly, and we instinctively expect robots to behave morally and make ethical decisions. For instance, we expect a firefighting robot to follow ethical principles when it must choose between saving one person's life and another's during a rescue mission, and we expect an eldercare robot to take a moral stance when its owner's instructions conflict with the interests of others (unlike the robot in the movie "Robot & Frank").

Such expectations give rise to the notion of trust in human-robot relationships and to questions such as "How can I trust a driverless car to take my child to school?" and "How can I trust a robot to help my elderly parent?" To design algorithms that generate morally aware, ethical decisions, and hence to create trustworthy robots, we need to understand the conceptual theory of morality in machine autonomy as well as to understand, formalize, and express trust itself. This is a tremendously challenging yet necessary task, because it spans philosophy, sociology, psychology, cognitive reasoning, logic, and computation. In this workshop, we continue the discussion initiated in our RSS 2016 workshop on "Social Trust in Autonomous Robots," adding the theme of ethics and morality, to shed light on these multifaceted concepts from various perspectives through a series of talks and panel discussions.

Workshop Flyer

Please use the following two flyers to advertise the workshop: flyer 1 (preferred) and flyer 2.


Contact Information

Email the organizers with any comments or questions.

This workshop is supported by the EPSRC.
