Driverless cars represent the next technology landmark in humanity’s highly computerized world. Hundreds of automakers, technology and services providers, and technology start-ups, including big-name companies like Ford, GM, Toyota, and Tesla, have invested millions of dollars in autonomous vehicle (AV) research and development. With Congress stalling on federal laws for self-driving vehicles, state governments and policy experts have taken the lead on integrating AVs into American roadways.
What You Need to Know about Driverless Cars
To date, self-driving test cars have been in the spotlight due to vehicle accidents and deaths on public roads. Other concerns lurk below the surface. As the number of driverless cars grows, computer glitches, cyber attacks, and issues related to integration with existing vehicles are a cause for concern.
Knowledge is power, so it’s important to understand the issues involved with autonomous vehicles – a catchall term used to describe vehicles where human occupants are simply passengers and need never be involved in driving.
Autonomous Vehicle Definitions
The federal National Highway Traffic Safety Administration (NHTSA) and driverless car developers have adopted SAE International’s definitions for levels of driving automation. This system outlines a methodical increase in automation, culminating in the fully driverless car.
Level 0 — The human driver does all the driving; your typical car.
Level 1 — A vehicle’s advanced driver assistance system (ADAS) can assist the human driver with steering or braking/accelerating.
Level 2 — A vehicle’s ADAS can control both steering and braking/accelerating under some circumstances. The human driver must continue to monitor the driving environment at all times and perform other tasks associated with driving.
Level 3 — A vehicle’s automated driving system (ADS) can perform all driving tasks under some circumstances. At those times, the human driver must be ready to take back control at the request of the ADS. Most of the time, though, a human driver is literally and figuratively behind the wheel.
Level 4 — A vehicle’s ADS can perform all driving tasks and monitor the driving environment – essentially, do all the driving – in certain circumstances. The human driver doesn’t have to pay attention during those times.
Level 5 — A vehicle’s ADS can do all the driving in all circumstances. The human occupants are just passengers and never need to be involved in driving.
Note that Levels 0, 1, and 2 require an active and engaged driver. At Levels 3, 4, and 5 the car does the driving when the system is engaged, though at Level 3 the human must still be ready to take back control on request; only Levels 4 and 5 dispense with human supervision entirely.
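For readers who think in code, the taxonomy above can be condensed into a small lookup table. This is an illustrative sketch, not an official encoding; the level names follow SAE’s published terminology, and the supervision flags follow the descriptions above (note that at Level 3 a human must still stand ready to take over):

```python
# SAE driving-automation levels, as described above.
# Each entry records who performs the driving task when the system is
# engaged, and whether a human must supervise or stand ready to take over.
SAE_LEVELS = {
    0: {"name": "No Automation",          "driver": "human",  "human_supervision": True},
    1: {"name": "Driver Assistance",      "driver": "human",  "human_supervision": True},
    2: {"name": "Partial Automation",     "driver": "system", "human_supervision": True},
    3: {"name": "Conditional Automation", "driver": "system", "human_supervision": True},   # must take over on request
    4: {"name": "High Automation",        "driver": "system", "human_supervision": False},  # only in certain circumstances
    5: {"name": "Full Automation",        "driver": "system", "human_supervision": False},
}

def needs_attentive_human(level: int) -> bool:
    """True if a human must monitor the drive or stand ready to take over."""
    return SAE_LEVELS[level]["human_supervision"]
```

The table makes the key dividing line visible at a glance: the supervision flag does not flip until Level 4, which is why Level 3’s “ready to take over” requirement is so legally contentious.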
The evolution of driverless cars is steadily accelerating. In September 2023, Mercedes-Benz announced its first Level 3 system approved for sale in U.S. markets, making it the first and, so far, only automaker to reach Level 3 there. Most other big-name car manufacturers trail behind at Level 2. Ford’s vice president in charge of software recently reported that Ford would not have Level 3 vehicles out until at least 2025. According to “The Street,” Elon Musk’s aversion to LiDAR sensors is keeping Tesla at Level 2.
The Inner Workings of Driverless Cars
At Level 5, sensors, GPS, and computers replace the human brain. The sensors feed continually updated information about the driving environment to the vehicle’s computerized ADS, whose decisions in turn direct the car. In other words, you could sleep in the back seat while traveling from New York to Miami, presumably with a gentle voice awakening you at programmed pit stops.
According to a report from Synopsys, AVs currently use four types of sensors to obtain information:
Ultrasonic sensors use sound waves to detect obstacles in their immediate vicinity. Typically used for parking, these sensors are considered mature, with little room left for improvement.
Image sensors use multiple cameras, sometimes combined to build 3-D views, to capture the environment. Image sensors can read signs and can serve as a backup if another sensor fails.
Radar sensors are located around the perimeter of a self-driving car and send out electromagnetic waves that bounce back from obstacles. These sensors can track the speed of other vehicles in real time, but refinements are needed to move from 2-D to 3-D sensors.
LiDAR sensors emit harmless laser beams to map the car’s environment. The most expensive of the sensors, LiDAR could be improved by flash-based designs for greater accuracy.
Lastly, a computer inside the car uses complex algorithms and machine learning software to piece together a picture of the road and instruct the car on its next actions.
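As a rough sketch of that sense-plan-act loop (all class names, thresholds, and the toy braking rule here are hypothetical, for illustration only; real ADS software is vastly more sophisticated):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One object reported by a sensor: type, distance (m), speed (m/s)."""
    kind: str
    distance_m: float
    speed_mps: float

def fuse(ultrasonic, camera, radar, lidar):
    """Naively merge detections from the four sensor types into one world model.
    Real systems weight each sensor by confidence and cross-check disagreements."""
    return ultrasonic + camera + radar + lidar

def decide(world):
    """Toy policy: brake for anything closer than 10 m, otherwise cruise."""
    if any(d.distance_m < 10.0 for d in world):
        return "brake"
    return "cruise"

# One tick of the loop: sensors -> fused world model -> driving command.
world = fuse(
    ultrasonic=[],
    camera=[Detection("pedestrian", 8.0, 1.2)],
    radar=[Detection("car", 40.0, 25.0)],
    lidar=[],
)
command = decide(world)  # pedestrian at 8 m triggers "brake"
```

The point of the sketch is the architecture, not the rules: each sensor contributes partial, fallible observations, and the computer’s job is to reconcile them into one model of the road before choosing an action.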
Major Traffic Safety Concerns About Autonomous Cars
These are the top safety concerns for driverless cars:
Computer error, causing the ADS to fail. Maybe bad weather caused one of the sensors to mislabel a pedestrian as a trash can. Perhaps the GPS sent outdated information. Or it could be a garden-variety, who-knows-the-reason computer error. Unlike an error on your laptop, an ADS error could cause a fatal accident.
The blending of driver-controlled and driverless cars as autonomous vehicles face increasingly unusual and challenging situations. The Governors Highway Safety Association’s 2023 Policies and Priorities report calls for more research on “the intersection of autonomous vehicles and traffic safety” to ensure that other drivers on the road remain unaffected as driverless cars roll out. Shared responsibility between the driver and the vehicle itself poses behavioral risks that extend to other drivers. While humans can signal to each other through body language and eye contact, it is nearly impossible for humans to communicate with driverless cars.
Cyber-attacks, data breaches that steal or destroy information via access to computer systems. Even government officials aren’t safe, as U.S. agencies are often targeted for their sensitive data. So far in 2024, hospitals, churches, and schools have suffered such attacks. Some regard autonomous cars as the future’s next cybersecurity headache. With corrupted technology at the wheel, the damage can be fatal. A thriller screenwriter could easily outline six scary scenarios over lunch.
Ambiguous language for Level 4 driverless cars. Currently, Level 4 represents the realistic target for manufacturers of autonomous cars. However, the ambiguous language in Level 4’s description will likely create liability issues that will ultimately be resolved by the courts, since no Level 4 vehicle was available to consumers as of early 2024.
In 2018, Lemberg Law filed one of the first lawsuits over an accident involving an autonomous vehicle, Nilsson v. General Motors. There, a self-driving Chevrolet Bolt suddenly veered into another driver’s lane, causing severe injuries. Managing Attorney Sergei Lemberg reports that the suit was resolved to the client’s liking. Since then, there have been hundreds of lawsuits involving autonomous vehicles, and they continue at a rapid pace. The problem, Reuters reports, is that issues with driverless cars are outpacing the liability laws that govern them. The report notes that autonomous vehicle technology “is so new that there’s no rulebook.”
Unanswered Questions
Level 4’s description reads: “A vehicle’s ADS can perform all driving tasks and monitor the driving environment – essentially, do all the driving – in certain circumstances. The human driver doesn’t have to pay attention during those times.”
What does “essentially” mean? What are the “certain circumstances”? How does the human know what to do, and what not to do? And what are the consequences? Those questions remain unanswered, as does the question of how insurers will handle coverage and claims arising from autonomous vehicles.
NPR’s “All Tech Considered” summarized the opinions of insurance and legal authorities: “The introduction of autonomous vehicles raises novel new issues regarding fault and liability which will be resolved by the courts on a case-by-case basis. This will create the data necessary for the insurance companies to set premiums which may, or may not, fall entirely on the manufacturers.”
Existing Driverless Car Regulations
Currently, there are not many driverless car regulations, but many interested parties, including manufacturers and developers, are prodding Congress to enact national standards. They fear that, in the absence of national public policy, states will pass their own laws and create an impossible patchwork of differing standards.
Federal rules and regulations have been stalled in Congress for over five years. The House passed the Safely Ensuring Lives Future Deployment and Research In Vehicle Evolution Act, thankfully better known as the SELF DRIVE Act, in September 2017. However, it stalled in the Senate and has still not been passed. The provisions of the bill include:
States cannot pass their own laws regulating the performance of driverless cars
Privacy protection provided for users
Cybersecurity protection and notification requirements
Exemptions for current Federal Motor Vehicle Safety Standards, which apply to conventional cars
Notably, the Senate bill does not apply to autonomous trucks.
State regulations, meanwhile, vary widely. According to “Automotive World,” Michigan, California, and Arizona are the most receptive to AV technology on the roads and in testing, while other states have remained more cautious in their lawmaking.
Note that the National Conference of State Legislatures (NCSL) maintains an NHTSA-supported database of all AV legislation, available for public access and organized by state and topic.
Pros, Cons, Cautions, and Controversies Surrounding Autonomous Cars
Generally, the technology and finance sectors love driverless cars. Consumers, on the other hand, are still split. For a complete and straightforward analysis of the future of autonomous vehicles, read “Autonomous Vehicle Implementation Predictions” released in December 2023 by the Victoria Transport Policy Institute. The Institute is an independent research organization dedicated to developing innovative and practical solutions to transportation problems.
The report notes, “This analysis indicates that Level 5 autonomous vehicles, able to operate without a driver, may be commercially available and legal to use in some jurisdictions by the late 2020s, but will initially have high costs and limited performance. Some benefits, such as independent mobility for affluent non-drivers, may begin in the 2030s but most impacts, including reduced traffic and parking congestion, independent mobility for low-income people (and therefore reduced need for public transit), increased safety, energy conservation and pollution reductions, will only be significant when autonomous vehicles become common and affordable, probably in the 2040s to 2060s.”
However, after many years of AV development, consumers are starting to perceive driverless cars more positively. According to a 2023 report by McKinsey, a global consulting firm, just over half of car buyers could envision themselves switching to a fully autonomous vehicle (SAE Levels 4 and 5) in the future. The report also indicates that consumers would be willing to pay thousands of dollars, and even switch manufacturers, to get the ADAS features they want in a car.
According to the World Health Organization, about 1.2 million people die in traffic accidents each year, and NHTSA research attributes the critical reason for roughly 94% of U.S. crashes to driver error. Autonomous cars may be a step in the right direction, and early data suggests as much: the National Highway Traffic Safety Administration reported that, during a four-month period in 2022, only 11 crash deaths involved an autonomous car. While the technology is far from perfect, it has shown promising signs in safety.
Conclusion
Driverless cars are complex. Luckily, the world’s smartest engineers, computer scientists, and policy experts are working to develop autonomous vehicles and examine the impacts of putting them on the road. Though it may be years before fully autonomous vehicles are common on public roads, they have shown promising signs in safety and reliability, and they have raised valid concerns about integration with conventional vehicles. While many believe driving will become an obsolete skill in the coming decades, for now the world will wait and watch as transportation changes forever.
About the Author:
Sergei Lemberg is an attorney focusing on consumer law, class actions related to automotive issues, and personal injury litigation. With nearly two decades of experience, his areas of practice include Lemon Law (vehicle defects), Debt Collection Harassment, TCPA (illegal robocalls and texts), Fair Credit Reporting Act, Overtime claims, Personal Injury cases, and Class Actions.
He has consistently been recognized as the nation's "most active consumer attorney." In 2020, Mr. Lemberg represented Noah Duguid before the United States Supreme Court in the landmark case Duguid v. Facebook. He is also the author of "Defanging Debt Collectors," a guide that empowers consumers to fight back against debt collectors and prevail, as well as "Lemon Law 101: The Laws That Lemon Dealers Don't Want You to Know."