Everyone knows the story of thalidomide and the babies born with severe birth defects after the drug was approved in European countries. Approval was delayed in the United States: the FDA reviewer in charge of the application was worried about nerve damage that had not yet been proven to be a real problem, and she became a hero for keeping the drug off the US market once the birth defects were linked to it. Fewer people know about the deaths caused by the FDA's delay in approving beta-blockers to prevent a second heart attack. Despite trials showing their effectiveness in the mid-1970s and approval in Europe, it was not until 1981 that timolol was approved for preventing a second heart attack. In its press release, the FDA announced that the drug could save 17,000 lives a year. By the agency's own numbers, then, around 100,000 people died while it dragged its feet.
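The "around 100,000" figure is just back-of-envelope arithmetic, and it can be made explicit. A minimal sketch, assuming the FDA's own estimate of 17,000 lives per year and a delay of roughly six years between the mid-1970s trial results and the 1981 approval (the exact span is an assumption; the trials and European approvals did not happen on a single date):

```python
# Back-of-envelope estimate of deaths during the approval delay.
# Assumptions: the FDA's own figure of 17,000 lives/year, and a
# delay of roughly six years (~1975 to 1981) -- both taken from
# the narrative above, not from a precise timeline.
lives_saved_per_year = 17_000
years_of_delay = 6

estimated_excess_deaths = lives_saved_per_year * years_of_delay
print(estimated_excess_deaths)  # 102000, i.e. "around 100,000"
```

Change the assumed delay by a year in either direction and the total moves by 17,000, so the order of magnitude is robust even if the exact span is debatable.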
Now, medicine is complicated. There are many examples of delays of useful treatments, many more ineffective treatments that were properly blocked, and some treatments that should never have been approved. Recently, a study found that guidelines recommending beta-blockers in non-cardiac surgery are causing a significant excess of strokes and deaths. The problem is that deaths caused by action are taken far more seriously than deaths caused by inaction and delay. Our moral calculus does not seem to recognize that an obvious missed opportunity to save a life is almost as bad as any other cause of death, even in cases like the one above, where many people who had already had one heart attack died of a second because the FDA had yet to approve beta-blockers as a preventive treatment.
That brings us to self-driving cars. Every year in the past decade, over 30,000 people in the United States have died in crashes on public roads. To the extent that many of these accidents are caused by driver error, replacing fallible human drivers with an electronic system has the potential to prevent many of these deaths. Recently an Uber self-driving test vehicle killed a pedestrian in Arizona, and the company suspended tests in all cities. To the extent that Uber is suspending testing to fix an obvious bug that caused this mishap, that is what we should expect. But to the extent that Uber or any other company has to suspend operations after an accident purely for political optics, we should recognize that the knee-jerk political response is pushing back the day when commuters will be free to devote their many hours of driving time to other activities, and prolonging the period in which tens of thousands of lives are lost each year because human drivers still dominate the roads. And that is just counting the deaths that occur within the United States.
We should not expect new technology to start out an order of magnitude better than the old technology. When human lives are on the line we must have higher standards, but those standards should not be unrealistically higher than the status quo. If self-driving cars are even as safe as current human-driven systems, then a legal and regulatory framework under which they can operate will allow for real-world improvement and iteration. Once self-driving cars are ubiquitous, they will save tens of thousands of lives and give billions of hours of free time back to commuters.
Be careful about demanding too much perfection too soon. Demanding perfection now causes delays, and our moral calculus is too ready to ignore the tens of thousands of lives lost in every year by which effective automated driving is delayed.