The dangers of partially automated cars
The first generation of partially “self-driving” cars is being touted nationally as the answer to America’s growing traffic fatality rate. But the reality is there is nothing safe about partial automation, and in the rosy glow of what could be, these unproven technologies are being allowed on city streets, using real people as stand-ins for crash-test dummies.
The current generation of partial automation is not part of a drive to safety; it’s a drive to get to market first with little-tested technologies.
A case in point: Recently, a colleague took a test ride in such a car, developed by Uber and Volvo, in San Francisco. In the course of a 15-minute ride, the safety driver had to correct the car some half dozen times. It cut across bike lanes illegally and did not know what to do when it encountered a pedestrian in the crosswalk.
To be clear, this was a publicity test drive, to tout the advanced state of the technology. Worse, eyewitness video recently captured a similar “self-driving” Uber vehicle running a red light at speed while a pedestrian waited to cross the street. Based on numerous accounts from test drives in San Francisco, Pittsburgh, and other cities, it is clear that the complexity of urban streets—with pedestrians, cyclists, truck deliveries, and people exiting parked cars—is simply not something the vehicles are ready to handle.
The danger of partial automation is that it lulls people into a false sense of security. “Drivers” think they can tune out and the car will do the work. The recent death of a Tesla driver in Florida testifies to the problem. An alert motorist would likely not have missed the tractor-trailer turning in front of him, but the technology did.
Even the specially trained safety engineers operating Uber’s self-driving vehicles don’t always react fast enough. As reported in October, witnesses in Pittsburgh saw a partially automated Uber, with one of these operators at the wheel, turn the wrong way down a one-way street.
For policymakers, the danger of partial automation is that it looks like the future we dream of – one in which human error is replaced by robotic precision – but hidden under the hood is the imperfect and potentially deadly present-day technology.
Officials at the California Department of Motor Vehicles, the Mayor of San Francisco, the State Attorney General and others have all called for Uber to halt the operation of its two-ton semi-automated vehicles on San Francisco streets. That Uber is not complying is disturbing.
NACTO, the National Association of City Transportation Officials, strongly supports using technology for real safety improvements and recently released a policy on autonomous vehicles that offers a clear path to harnessing these innovations for the reliability and sustainability of our city transportation systems.
Existing technology can easily curb unnecessary and excessive speeding, reduce lane departures, and improve braking, helping people to avoid or mitigate crashes. Fully automated vehicles—not available yet—won’t drink and drive, and they can be programmed to follow the rules of the road. In addition, information collected by networked vehicles, with clear data-sharing protocols, has the potential to help cities make sound transportation planning decisions.
Full automation, done correctly, could be a boon to safety in our cities. Partial automation, however, is specifically cited in that policy as one of the most dangerous paths the country can take, because it increases the possibility of distraction and can magnify driver error.
Technology companies are racing ahead to cash in on recently developed technologies. States and cities must move at the same pace, requiring that technology and e-hail companies put forward only vehicles they can confidently submit to testing as fully automated.
Policymakers must use their regulatory and enforcement powers to restrain bad actors on our streets. Anything less puts Americans’ lives at risk.
Linda Bailey is executive director of the National Association of City Transportation Officials.
The views expressed by authors are their own and not the views of The Hill.