The robot car of tomorrow might be programmed to hit you.

Imagine an autonomous car — a robot car that has been programmed to drive itself. It can collect and process more information, and do so much faster, than a human driver can. Now suppose that car is in a situation in which a collision is unavoidable. The only options are for it to collide with a motorcyclist wearing a helmet or a motorcyclist without a helmet. Which option should it be programmed to take? What would rational, ethical “crash optimization” require?