We're All Just Bugs
By JOEL HANS, Associate Editor, Industrial Maintenance & Plant Operation (IMPO)
A number of news sources — including the Associated Press — have recently reported that Google has been quietly developing hardware and software toward a lofty goal: autonomous cars. According to a post on Google’s corporate blog, the self-driving vehicles have rolled over 140,000 miles on California roads with drivers sitting behind the wheel, ready to intervene in case of a malfunction. Apparently, they have been required to do so on only a handful of occasions, such as when a small-brained cyclist (this, coming from someone who tries to bike to work more than he drives) blew through a stop sign.
All of this is interesting news, but not particularly revolutionary. As Sebastian Thrun, the post’s author, explains, there are already a number of competitions for autonomous vehicles, including the government-run DARPA Challenges. One of the engineers Google hired to spearhead this effort previously built a “modified Prius that delivered pizza without a person inside.”
What was truly intriguing about the announcement was the conversation around a statement Google’s CEO, Eric Schmidt, made back in September about the relationship between cars and their human operators: “Your car should drive itself. It just makes sense … It's a bug that cars were invented before computers.”
To paraphrase: he just called humans the equivalent of computer “bugs.”
In a lot of ways, Schmidt’s statement is kind of offensive. I’m a responsible driver! I use my turn signals and I’ve never been in an accident that was my fault (sorry, jaywalker, but I hope that ticket teaches you a lesson). Most of the drivers I see on a daily basis are the same way; we just want to get from point A to point B with as little hassle as possible. All in all, I’d like to think I’m something better than a “bug” in my city’s roadway infrastructure.
But when you think it over a little harder, that’s exactly what we are. Put aside even the most offensive of drivers — those who drive drunk, clock speeds around 55 in a 35-MPH zone, text on the interstate, and cut off “slow moving” traffic — and it’s easy to see how human error is a common thread in a large portion of roadway incidents. Even the best of drivers can have a momentary lapse in attention, fall asleep behind the wheel, or be distracted by kids in the backseat. There are too many wild variables. To help prevent these types of accidents, Volvo has been developing its automatic braking system for a while now (for better or worse, based on the result of this test), and other car manufacturers are doing similar work.
The issue is similar to safety within a manufacturing facility. The world’s best engineers could combine their intellectual forces and design the “safest” manufacturing facility known to man, but it’s all bunk when an operator neglects all the warnings and sticks his hand inside a machine because “it’s easier that way.” Industrial accidents arising from true mechanical failure can be monitored and prevented with enough instrumentation, but there’s nothing — short of a 1984-esque surveillance system — that will prevent a plant floor employee from making a mistake.
Would we tangibly benefit from these engineering improvements, from autonomous cars? Undoubtedly, but it’s not a simple conclusion. Eliminating the human bugs on the road will simply trade them for software bugs that bog down these autonomous systems. And at the same time, we know that artificial intelligence — whether in the form of talking robots or cars filled with radar and cameras — can’t cope with certain scenarios. But I do think it’s a move in the right direction.
The key to eliminating the most “bugs” from any system — whether they be human- or software-based — will be striking the proper balance between the two. A car that helps a distracted driver stop before hitting a pedestrian is, on every level, a successful and beneficial innovation. So is a car that keeps itself from drifting across the double yellow lines after a driver has nodded off to sleep. But a car that prevents a coherent driver from making an evasive maneuver to avoid something in the road because they haven’t yet flicked on their turn signal (and use your turn signals, people!) is perhaps overbearing.
I won’t give up the ability to control my car for just anything. But when a company — be it Google, Volvo, or anyone else — develops a safety system that properly addresses my inherent weaknesses as a person, while understanding that I always have “veto” power, I’ll be ready to sign on the dotted line. That is, of course, if I can manage to pay off my current “bug-filled” car in time.
What do you think? Are people the computer bugs of the roads? Or would having automated cars make getting from point A to point B even more treacherous? Let me know at firstname.lastname@example.org.