Auto insurance agencies know a lot about traffic: how it works, what causes it, and how best to insure drivers based on factors like age, college GPA, income, and vehicle color. Traffic engineers know well that no matter what they do to direct and control traffic, human error is hard to account for. Consider the classic dilemma of when to merge ahead of a road closure or construction zone: do you move over as soon as you see the signs, or do you wait until your lane is cut off?
In his landmark work on traffic engineering, Traffic: Why We Drive the Way We Do (and What It Says About Us), Tom Vanderbilt helps readers understand the nuances of why drivers behave the way they do and how city planners, construction crews, navigation apps, and drivers themselves try to account for this in their day-to-day lives. From an unexpected demonstration blocking the road to a driver distracted by a song on the radio, human error is hard to account for when making transportation plans.
What if we take the human driver out of the equation, though? For the most part, auto insurance covers the damage drivers cause to their own cars and to the people, vehicles, or other property involved in an accident. But self-driving cars have proven they are not immune to accidents: in 2016, a Tesla operating in self-driving mode was involved in a fatal crash in Florida. Who, then, bears the responsibility?
Warren Buffett, whose Berkshire Hathaway owns Geico insurance, believes that the onus of protection will shift from drivers to the companies that manufacture and program self-driving cars. Michigan has already passed laws requiring automakers to assume responsibility for accidents caused by a self-driving car.
Since it’s common knowledge that most of the traffic problems that plague our day-to-day lives are the result of human error, many believe that automation will reduce the incidence of accidents. However, if automakers are required to bear the responsibility for insuring autonomous vehicles, they may be disincentivized from producing them, since costs would swell while the technology’s bugs get worked out. Development would then plateau, and human-caused traffic accidents would continue to occur.
There are a lot more questions than answers right now about what will happen as cars are no longer controlled by fallible people. Insurance companies, automakers, and legislators will continue to debate where the burden belongs, but in the meantime, auto insurance will continue to cover mostly human error.