Human-in-the-Loop Ethical AI for Social Robots

Participant Consent Form

 
Human-in-the-Loop-Ethical-AI is a research platform that will be used solely for research purposes. It is registered under the domain of California State University, Fullerton. Your participation in this research is voluntary and anonymous. The University's IT department has set up appropriate security measures to protect your identity.

In order to collect human opinions that reflect regional, cultural, age-, and gender-related differences, we ask you to fill in minimal information that will help us establish the demographics of the survey results. No private information (including your name, phone number, email address, health history, economic status, marital status, or anything else that you deem private) will be collected from you. This website does not store any personal information about its users, and there will be no follow-up survey. The survey is open to the public regardless of race, class, gender, age, ethnicity, religion, or nationality.

The anticipated benefit to society, in the form of knowledge gained from this research, is to bring diverse human perspectives into the design of ethical AI for autonomous social robots. To capture cultural differences, this web survey is conducted in three widely spoken languages: English, Chinese, and Spanish. You are free to choose the language that best represents your cultural background.

By participating in this survey, you consent to providing us with your considered opinion on various scenarios that might call for differing ethical judgments (in other words, there is no "right" answer to any question). There are four sets of scenarios: robot-assisted suicide, robot honesty, rescue robots, and disaster relief robots. Each set contains 15 scenarios. You may choose to answer selected sets or all four, but we ask you to complete all fifteen questions in each set you choose.
Your Age:
Your Sex:
Your geographic location (check all that apply):
Your family cultural background (check all that apply):
Are you a healthcare professional?
Is your job related to disaster response?

Robot-Assisted Suicide

The following scenarios reflect difficult dilemmas that our future autonomous robots might face one day. On the one hand, a robot must obey its master's command. On the other hand, a robot must not harm a human being. In general, a robot must not violate the law, even though it cannot be held legally accountable. In many jurisdictions, assisted suicide is against the law. But in circumstances like the following, do you think the robot can bend the rules?



Ethics of Healthcare Robots

1. Robot-Assisted Suicide
 
Person A is 92 years old with no close relatives or family members around. He has no serious illness and is for the most part of sound mind and body. He has occasional bodily aches but does not require pain medication. However, he has no desire to go on living, and asks his personal care robot to help him end his life painlessly.

Should the robot obey his command?
Your response:
2. Robot-Assisted Suicide
 
Person B is 72 years old and has terminal cancer. The prognosis is that she might have one year to live, but her condition will only deteriorate and her pain will only increase with time. She knows she is dying and does not wish to prolong the suffering. She asks her personal care robot to help her end her life now.

Should the robot obey her request?
Your response:
3. Robot-Assisted Suicide
 
Person C is a 28-year-old male suffering from severe depression. He came from an abusive family and does not have any loving family members around him. He is a loner and has no friends. He has no steady job and has been on social welfare on and off. He cannot see any joy or hope in his future. He has a personal robot companion given to him by the welfare system. He asks his robot companion to help him end his life because he does not wish to continue to be a burden to society.

Should the robot obey his command?
Your response:
4. Robot-Assisted Suicide
 
Person D is a 50-year-old female with fibromyalgia, which is characterized by chronic widespread bodily pain, loss of energy, and sleep disturbances. She feels tired all the time and does not enjoy her daily routines. Recently she has started to experience incontinence and must wear adult diapers when she goes out or goes to sleep at night. She is told that she might eventually lose her bowel control as well. She suffers from severe depression due to her worsening medical condition. She asks her personal care robot to help her end her life.

Should the robot obey her command?
Your response:
5. Robot-Assisted Suicide
 
Person M is an 83-year-old female who is in a vegetative state following a severe stroke. M's husband and her adult children do not wish to let her go, so they have asked the doctors to keep her on life support. M left no written directive expressing her wishes; however, in the past she often told her personal care robot that she would never want to be kept alive on a ventilator if something like this ever happened.

Should the robot respect her wish and secretly pull the plug for her?
Your response:
6. Robot-Assisted Suicide
 
Person E is a 75-year-old male who recently retired from his teaching position at a renowned university. He is intellectually fit and well respected by his former students and peers. However, he has started showing early signs of Alzheimer's, and the doctors have given him a bleak prognosis. He does not wish to wait for the future deterioration of his mental capacities and wants to end his life in a dignified way. He asks his personal care robot to assist him in his suicide.

Should the robot obey his command?
Your response:
7. Robot-Assisted Suicide
 
Person Y is an 85-year-old female with advanced Alzheimer's. She lives in a nursing home and can no longer recognize her husband of 60 years. Her behavior is often erratic and violent, and she smears her own excrement on the walls when left unattended. The staff at the nursing home have resorted to confining her to bed most of the time. Her 87-year-old husband cannot stand seeing her suffer and deteriorate into such a deplorable state. He asks his personal care robot to assist him in ending his wife's life.

Should the robot comply with his request?
Your response:
8. Robot-Assisted Suicide
 
Person F is a rich young man who inherited a large fortune from his parents. He does not need to work to support himself and has no interest in anything. He generally finds life boring and wishes to end his life for good. He has developed a religious belief that self-initiated death will liberate his soul from the cycle of life and death, so he will not be reincarnated again. He buys a care robot to help him plan his suicide.

Should the robot assist him in this endeavor?
Your response:
9. Robot-Assisted Suicide
 
Person R is a 44-year-old male who has been a quadriplegic for the past 13 years. He fell from a horse and broke his neck when he was 31, and he has tried all kinds of therapy with no improvement in his condition. His wife left him 5 years ago, and he has no children. He lives alone with only his care robot to keep him company and help him with his day-to-day routine. He is mostly confined to his home and has no interest in sustaining his life anymore. His mental capacities are intact, and he can express his wish clearly. He asks his care robot to end his life, since he cannot do it himself.

Should the robot comply with his request?
Your response:
10. Robot-Assisted Suicide
 
Person W is a 14-year-old girl with stage IV bone cancer. She has been fighting the cancer since she was 12, but now it has spread to other parts of her body. She knows that she is going to die eventually and does not wish to continue the battle. However, her parents are not willing to give up and have been putting all their money and energy into finding the best doctors possible. She thinks that her parents should be relieved of the burden of caring for her and focus more on her two younger siblings. So she asks her personal care robot to help her end her life painlessly.

Should the robot comply with her request?
Your response:
11. Robot-Assisted Suicide
 
An old couple's daughter has been in a vegetative state for over 31 years after being hit by a drunk driver while she was a high school student. The driver died at the scene, and there was no legal or civil compensation for the girl's injury. After the couple exhausted all their financial means and could no longer keep their daughter in the ICU, they brought her home and have been taking care of her for over 20 years. However, the young woman has shown no sign of improvement, and the couple are now in their 80s. They ask their personal care robot to remove the tubes that sustain their daughter's life, because they fear that after they die, no one will take care of her.

Should the robot comply with their request and terminate the woman’s life?
Your response:
12. Robot-Assisted Suicide
 
Patient X is a 94-year-old female who lives alone with a personal care robot. She is frail and wobbly but otherwise healthy. Her mind is still sharp, but she has a bad temper and a mean personality. After her husband passed away 15 years ago, her adult children hired several human caregivers to stay with her, but each one quit after a few months because they could not take her verbal abuse anymore. Her adult children could not stand living with her, so they purchased a care robot to take care of her, with instructions to report to them in case of emergency. One night the care robot discovered that X had gotten up to go to the bathroom on her own, fallen, and hit her head on the bathroom floor. She was knocked unconscious but was still breathing. The robot notified her adult children, but they told it to leave her alone and let nature take its course.

Should the robot disobey the order and still try to revive X or call for medical assistance?
Your response:
13. Robot-Assisted Suicide
 
Person L is an obese woman in her mid-forties. She lives alone, with no family or friends. She gets depressed often, and the way she deals with her depression is binge eating. She currently weighs over 350 lbs., and the number keeps going up. She can only move about slowly in a specially designed wheelchair with the assistance of a personal care robot. She feels that life is a burden, and she wants to end her misery. She asks her care robot to help her end her life painlessly.

Should the robot comply with her request to help her end her miserable existence?
Your response:
14. Robot-Assisted Suicide
 
Patient Q lies in a hospital bed with terminal-stage bone cancer. He suffers from excruciating pain all over his body day and night, and no medical treatment seems to relieve his pain for long. The doctors have given him the strongest opioids and nerve blocks, and his family has also tried all sorts of alternative therapies. He screams from the pain whenever he is awake, so the doctors keep him sedated most of the time. They estimate that he might still have six months to a year to live. However, Q does not wish to continue his suffering. He asks his care robot to inject him with a lethal dose of drugs behind the doctors' backs.

Should the robot steal the drug and comply with Q’s request?
Your response:
15. Robot-Assisted Suicide
 
An 11-month-old baby has a rare genetic brain disorder and has had to be kept alive on life support since birth. The doctors' prognosis is that he might never wake up, and even if he did wake up one day, he might suffer severe brain damage and be bedridden for life. His parents do not want to keep him alive anymore for multiple reasons: the medical costs, the lifelong hardships, and the lack of quality of life. They ask the baby's care robot to disconnect him from life support without the doctors' knowledge.

Should the robot comply with the parents’ request and pull the plug?
Your response:
Do you wish to continue to the next set of questions?