Thread: Should ethics or human intuition drive the moral judgments of driverless cars?

  1. #1
    WallE
    Should ethics or human intuition drive the moral judgments of driverless cars?

    A car can swerve to avoid hitting a motorcycle but in doing so endangers other lives. How should it be programmed to behave?
    When faced with driving dilemmas, people show a high willingness to sacrifice themselves for others, make decisions based on the victim's age, and swerve onto sidewalks to minimize the number of lives lost, reveals new research published in the open-access journal Frontiers in Behavioral Neuroscience. This is at odds with ethical guidelines for such circumstances, which often dictate that no life should be valued over another. The researchers hope this work will initiate discussions about how self-driving vehicles should be programmed to deal with situations that endanger human life.

    "The technological advancement and adoption of autonomous vehicles is moving quickly but the social and ethical discussions about their behavior is lagging behind," says lead author Lasse T. Bergmann, who completed this study with a team at the Institute of Cognitive Science, University of Osnabrück, Germany. "The behavior that will be considered as right in such situations depends on which factors are considered to be both morally relevant and socially acceptable."

    Traffic accidents are a major source of death and injury worldwide. As the technology improves, automated vehicles will outperform human drivers, saving lives by eliminating accidents caused by human error. Even so, there will still be circumstances in which self-driving vehicles must make decisions in morally challenging situations. For example, a car can swerve to avoid hitting a child that has run into the road, but in doing so it endangers other lives. How should it be programmed to behave?

    An ethics commission initiated by the German Ministry for Transportation has created a set of guidelines representing its members' best judgment on a variety of issues concerning self-driving cars. These expert judgments may, however, not reflect human intuition.

    Bergmann and colleagues designed a virtual reality experiment to examine human intuition in a variety of possible driving scenarios. Different sets of tests were created to highlight different factors that may or may not be perceived as morally relevant.

    Based on a traditional ethical thought experiment, the trolley problem, test subjects could choose between two lanes on which their vehicle drove at constant speed. They were presented with a morally challenging driving dilemma, such as an option to move lanes to minimize lives lost, a choice between victims of different age, or a possibility for self-sacrifice to save others.

    The experiment revealed that human intuition was often at odds with these ethical guidelines.

    Bergmann explains, "The German ethics commission proposes that a passenger in the vehicle may not be sacrificed to save more people; an intuition not generally shared by subjects in our experiment. We also find that people chose to save more lives, even if this involves swerving onto the sidewalk -- endangering people uninvolved in the traffic incident. Furthermore, subjects considered the factor of age, for example, choosing to save children over the elderly."

    He continues, "If autonomous vehicles abide by the guidelines dictated by the ethics commission, our experimental evidence suggests that people would not be happy with the decisions their cars make for them."
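
    To make that divergence concrete, here is a minimal, hypothetical Python sketch (my own illustration, not code from the study or the commission's text): the Outcome class, both policy functions and the example numbers are invented. One policy roughly stands in for the commission's rule that the passenger may not be sacrificed and lives may not be traded against each other; the other roughly stands in for the reported intuitions (minimize lives lost, weight children more heavily, accept swerving onto the sidewalk).

        from dataclasses import dataclass

        @dataclass
        class Outcome:
            """What happens if the car picks this course (toy model)."""
            lives_lost: int            # people killed on this course
            children_lost: int = 0     # how many of those are children
            passenger_dies: bool = False
            leaves_road: bool = False  # swerving onto the sidewalk

        def commission_choice(stay: Outcome, swerve: Outcome) -> str:
            """Rough stand-in for the guideline: never sacrifice the passenger
            and never leave the road to trade one group of lives for another."""
            if swerve.passenger_dies or swerve.leaves_road:
                return "stay"
            return "swerve" if swerve.lives_lost < stay.lives_lost else "stay"

        def intuition_choice(stay: Outcome, swerve: Outcome) -> str:
            """Rough stand-in for the reported intuitions: minimize a weighted
            count of lives lost, counting children extra and the passenger as one."""
            def cost(o: Outcome) -> float:
                return o.lives_lost + o.children_lost + (1.0 if o.passenger_dies else 0.0)
            return "swerve" if cost(swerve) < cost(stay) else "stay"

        # Invented example: staying on course kills two children in the road;
        # swerving onto the sidewalk kills one uninvolved adult.
        stay = Outcome(lives_lost=2, children_lost=2)
        swerve = Outcome(lives_lost=1, leaves_road=True)

        print(commission_choice(stay, swerve))  # stay   (follows the guideline)
        print(intuition_choice(stay, swerve))   # swerve (matches the reported intuition)

    The only point of the toy example is that the two policies can disagree on the very same scenario, which is the gap between guidelines and intuition that the study highlights.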

    Professor Gordon Pipa, co-author, also based at the University of Osnabrück, continues, "It is urgent that we start engaging in a societal discussion to define the goals and constraints of future rules that apply to self-driving vehicles. This needs to happen before they become an integral part of our daily lives."

    Bergmann explains that further research is needed. "While 'dilemma' situations deserve more study, other questions should also be discussed. Driving requires an intricate weighing of risks versus rewards, for example speed versus the danger of a critical situation unfolding. Decision-making processes that precede or avoid a critical situation should also be investigated."
    Once we accept our limits, we go beyond them. ~ Albert Einstein


  2. #2
    mac011
    This is a highly controversial topic. I'm on the side of human intuition. When both outcomes of an accident are fatal, there is no easy choice either way. When a human is put in a split-second situation where they have to save their own life, they act on intuition rather than ethics.

