Ethics for robot cars

The robot car of tomorrow might be programmed to hit you. Imagine an autonomous car, a robot car programmed to drive itself. It can collect and process more information, and do so far faster, than a human driver can. Now suppose that car is in a situation in which a collision is unavoidable, and its only options are to hit a motorcyclist wearing a helmet or a motorcyclist without one. Which option should it be programmed to take? What would rational, ethical “crash optimization” require?
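To make the worry concrete, here is a minimal sketch of what a naive “crash optimization” rule might look like if it simply minimized expected harm. The scenario, the probabilities, and the function names are all hypothetical, invented for illustration; the point is that a purely harm-minimizing rule would steer toward the helmeted rider, effectively penalizing the person who took the safety precaution.

```python
# Hypothetical sketch of a naive "crash optimization" rule.
# The targets, probabilities, and harm scores below are invented for illustration.

from dataclasses import dataclass


@dataclass
class Target:
    name: str
    prob_fatality: float  # assumed probability that the collision is fatal


def expected_harm(target: Target) -> float:
    """Score an unavoidable-collision option by its expected harm."""
    return target.prob_fatality


def choose_collision_target(options: list[Target]) -> Target:
    """A purely harm-minimizing rule: pick the option with the lowest expected harm."""
    return min(options, key=expected_harm)


if __name__ == "__main__":
    options = [
        Target("motorcyclist with helmet", prob_fatality=0.3),     # invented numbers
        Target("motorcyclist without helmet", prob_fatality=0.8),  # invented numbers
    ]
    choice = choose_collision_target(options)
    # A naive harm minimizer targets the helmeted rider, the person who took
    # the safety precaution, which is exactly the ethical worry raised above.
    print(f"Naive crash optimization targets: {choice.name}")
```

Whether such a rule is ethically defensible, and who should decide, is the question the passage above is asking; the sketch only shows that “optimize the crash” is not a morally neutral instruction.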

Who should get to study philosophy?

In “Why teach Plato to plumbers?,” Scott Samuelson writes: “Once, when I told a guy on a plane that I taught philosophy at a community college, he responded, ‘So you teach Plato to plumbers?’ Yes, indeed. But I also teach Plato to nurses’ aides, soldiers, ex-cons, preschool music teachers, janitors, Sudanese refugees, prospective wind-turbine technicians, and any number of other students who feel like they need a diploma as an entry ticket to our economic carnival.” So why teach them Plato? “My answer is that we should strive to be a society of free people, not simply one of well-compensated managers and employees.”