If you follow automotive news at all, then you know self-driving cars are the next big step for the industry. Self-driving cars are certainly the future, and we have the latest news and cool concepts for you right here.
Obviously, the designers and programmers of self-driving cars must program them to operate safely under any circumstance, avoiding other cars, erratic drivers, and even pedestrians. As if that weren't enough, the approaching reality of this new type of car has unleashed a flurry of legal and ethical debates and challenges to overcome.
According to Bigthink.com, researchers at the Toulouse School of Economics are studying what type of ethics should be programmed into a self-driving car. For example, should a self-driving car sacrifice its driver if that is necessary to avoid hitting and killing five other people? Or should the driver take higher priority than everyone else, with the car taking any and all steps necessary to protect its "master" above all?
These are tough ethical questions without easy answers. They will not go away soon either, and will likely remain part of the discussion around this emerging technology for years. Robots and AI in general have yet to carve out their place and role in society in a meaningful way. It will be interesting to see whether such devices or robots are given a type of pseudo-person status under the law, similar to what corporate entities have, or whether they will be treated as a sort of second-class citizen, subject to their own special set of laws.
Tracing and understanding the legal issues is even more complicated and thorny. If a self-driving car does hit another car or a person, who is at fault? Robots are unlikely to be able to insure themselves anytime in the near future, so it is likely a person or company will be held liable; the law has yet to define whom.
Google keeps making headlines with its self-driving experiments, some of the most interesting indicating how the company is unexpectedly having its car act more "human." Google discovered that its car was making left turns strangely, stopping farther back from other cars than most human drivers would. This made it difficult for the car to blend in and angered or confused fellow drivers, since it did not adhere to the accepted behavior that everyone else on the road followed.
Google has also been working to have its cars better understand and recognize children, recently having children test the car's software at its headquarters. The hope is that the software will recognize children and understand that they are unpredictable, prompting the car to act more cautiously in response.
Self-driving cars are still a long way from being commonplace and commercially viable. However, the technology remains a frequent news item and a hotbed for competition as "traditional" automakers enter the field. It remains to be seen which company will crack the code and release the first publicly available autonomous car, as well as how the legal and ethical issues now being voiced will be handled.