Whom to Save: Ethics for Self-driving Cars
A software engineer is assigned to program self-driving cars, but he realizes that an ethical framework should be established before technical programming begins. Should a car protect the driver at the cost of other road users? Who is liable for damage done in an accident: the driver, the programmer, or the government?
Based on field research; 5 pages.
Follow the 'handle' link to access the Case Study on RePub.
For EUR staff members: the Teaching Note is available on request; contact us at rsm.nl/cdc/contact/
For external users: follow the link to purchase the Case Study and the Teaching Note.
Tom Anderson is head of the software department at TrustOurSoftware, a German company that makes software for self-driving cars. The board of directors has asked him to recommend strategies for how to programme the cars. After recently experiencing a near accident involving pedestrians, Tom realized that the ethical framework for programming the cars should be established before technical programming begins. He organizes a stakeholder meeting a few days before the board meeting, hoping to gather enough input there for sound ethical advice on the matter. This case deals with ethical questions concerning self-driving cars. Should a car protect the driver at the cost of other road users? Should the government step in and define ethical boundaries? What does that mean for other stakeholders, such as insurance companies and marketing managers for self-driving cars? And who is liable for damage done in an accident: the driver, the programmer, or the government?