Driverless cars seem to be an inevitable part of our future. Google's prototype has already tackled the hairpin turns of San Francisco's famous Lombard Street, the cliffside meander of the Pacific Coast Highway, and a loop around Lake Tahoe. Beyond California, states like Florida, Nevada, and Michigan, along with D.C., have passed laws allowing autonomous autos on their streets in some capacity.
It doesn't take long to dream up reasons why self-driving cars would be nice. On long commutes, people could get work done, read a book, or chat with their passengers without worrying about distraction. Senior citizens who are no longer able to drive could regain their independence. Nobody would have to worry about drunk driving or texting behind the wheel. In general, driverless cars are expected to be much, much safer than human-driven ones, since human error causes the overwhelming majority of crashes.
Unfortunately, the moment we start cataloging all the ways autonomous vehicles are great, hard questions follow. When they crash, which they will, who is at fault? If a crash is unavoidable, how will the car decide what to do, and why? There are no easy answers to these questions, but given the direction things are moving, these ethical dilemmas seem to be a price we're willing to pay.
Learn more about self-driving cars:
Stanford Law: Human Error as a Cause for Vehicle Crashes
Wired: The Surprising Ethics of Robot Cars