Monday, July 11, 2016

Drivers Prefer Autonomous Cars That Don't Kill Them

It's an interesting question that I don't really have an answer for: how do we program morality into AI? Normally we think about it in terms of robots and such, but really with any autonomous machine, the maker has to decide what the ideal outcome is: save the buyer, or save the greatest number of people? What interests me is that most people don't want the government to decide, which means there won't be an industry standard. Perhaps that becomes an advertising thing? "This car will save you at all costs!" "Our competitor's car is programmed to kill twenty-four babies if necessary."

1 comment:

  1. http://www.smbc-comics.com/comic/self-driving-car-ethics