Nighttime Driving in Uber Autonomous Vehicles
I’m Ed Smith, a self-driving car accident lawyer in Sacramento. On March 18, a woman was killed while crossing the road in Tempe, Arizona, struck by a self-driving car that had a backup driver on board. After investigating the accident, Uber recently released a report saying it now believes the crash may have been caused by a software problem. The software in question decides whether or not the vehicle avoids an obstacle in its path.
Software Details
According to reports by TechCrunch, the problem involved how the system was tuned for obstacle detection and avoidance. The cameras on Uber’s vehicles detect objects in the car’s path, and the software then decides whether to avoid an object or to ignore it. Uber has said the system on the car involved in the Arizona accident was set with a high threshold for disregarding detections. The system allegedly saw the pedestrian but decided against reacting swiftly.
Mobileye Comments
Mobileye, Intel’s collision-avoidance unit, supplies part of the software that Aptiv PLC uses in Volvos. After testing its unit against video from the accident, the company said its sensors could detect the pedestrian one second before impact. Even with functioning sensors, one second would not have given the vehicle time to stop.
How Driverless Cars Detect Objects
Driverless cars use a system called LIDAR, which stands for light detection and ranging. It works by sending out roughly 150,000 pulses of light per second at objects in the road or around the vehicle. A sensor measures how long each pulse takes to bounce back, which tells the autonomous system how far the object is from the vehicle. The same detection principle is already used in radar and in sonar systems such as those on submarines. For autonomous technology, light waves work better than sound or radio waves: the approach is reliable and comparatively inexpensive.
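The time-of-flight principle described above can be sketched in a few lines. This is purely illustrative, not Uber’s or any vendor’s actual code: the pulse travels to the object and back, so the one-way distance is half the round-trip path.

```python
# Illustrative sketch of LIDAR ranging (not any vendor's real code):
# estimate distance from a light pulse's round-trip time.
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def distance_from_return_time(round_trip_seconds: float) -> float:
    """Distance to an object given the pulse's round-trip time.

    The pulse travels out to the object and back, so the one-way
    distance is half of speed * time.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after 200 nanoseconds hit an object about 30 m away.
print(round(distance_from_return_time(200e-9), 1))  # -> 30.0
```

At 150,000 pulses per second, the unit repeats this measurement constantly across its field of view, which is what builds the 3-D picture of the surroundings.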
LIDAR Works in the Dark
LIDAR works in the absence of light and can navigate a vehicle in ways a human cannot. Ford has tested its driverless vehicles on stretches of Arizona road in total darkness. The range of LIDAR may be reduced in heavy snow or fog; however, it remains accurate for several hundred feet. In absolute darkness, someone dressed in black would still register as a reason for the vehicle to slow down. Each autonomous system has its own built-in technology. Take a look at a test of nighttime driving on Ford’s system.
How Do Uber Vehicles “See” Objects?
Uber has several detection systems that work together to detect objects:
- LIDAR: Mounted on the roof, this unit provides a 3-D picture of the environment.
- Cameras: There are forward, rear and side-facing cameras. The forward camera looks for pedestrians, vehicles, signs and traffic lights. The side and rear cameras look at everything around and behind the vehicle.
- Radar: Radar units give the car a 360-degree picture and confirm what the LIDAR picks up. A pedestrian in the vehicle’s path may not be crisp in terms of “picture” quality, but the radar return confirms the LIDAR’s detection of someone crossing in front of the car.
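The cross-checking idea in the list above, where a radar return confirms a LIDAR detection, can be sketched as a simple matching rule. This is a much-simplified illustration with invented names; a real perception stack fuses sensor data in far more sophisticated ways.

```python
# Hypothetical sketch of sensor cross-confirmation (names invented,
# not Uber's actual system): a LIDAR detection counts as confirmed
# when a radar return lies within a small bearing/range tolerance.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str          # "lidar" or "radar"
    bearing_deg: float   # angle relative to the car's heading
    range_m: float       # distance from the car in meters

def radar_confirms(lidar_hit: Detection, radar_hits: list,
                   bearing_tol: float = 5.0, range_tol: float = 2.0) -> bool:
    """Return True if any radar return roughly matches the LIDAR hit."""
    return any(
        abs(r.bearing_deg - lidar_hit.bearing_deg) <= bearing_tol
        and abs(r.range_m - lidar_hit.range_m) <= range_tol
        for r in radar_hits
    )

pedestrian = Detection("lidar", bearing_deg=0.0, range_m=25.0)
radar_returns = [Detection("radar", bearing_deg=1.2, range_m=24.5)]
print(radar_confirms(pedestrian, radar_returns))  # -> True
```

The design point is redundancy: no single sensor is trusted alone, so a blurry radar return still adds confidence that the sharper LIDAR picture is real.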
Stopping Distance for a Vehicle
There are three components to stopping a vehicle:
- Thinking distance: This is the distance a vehicle goes between the time a human driver realizes he or she should stop and actually brakes.
- Braking distance: This is the distance the vehicle travels once the brakes are applied.
- Stopping distance: The total distance the vehicle travels before it stops. It is the sum of the thinking and braking distances.
At 40 mph, the thinking distance would be 39 feet, with a braking distance of 80 feet. Adding these two components together means the vehicle would stop in 119 feet. If the speed limit on this stretch of road had been 20 mph, the vehicle could have stopped in about 40 feet. The lower the speed of a motor vehicle, the more likely a pedestrian is to survive: at 20 mph, about 93 percent of pedestrians struck would live. The speed limit on the roadway where the Uber accident happened is 45 mph, as reported by the New York Times.
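The arithmetic above can be checked with a short sketch. The constants are fitted to the article’s own 40 mph example (39 feet thinking, 80 feet braking), using the standard assumption that thinking distance grows linearly with speed while braking distance grows with its square.

```python
# Sketch of the article's stopping-distance arithmetic.
# Constants are fitted to the 40 mph example: 39 ft thinking,
# 80 ft braking. Thinking distance scales linearly with speed;
# braking distance scales with the square of speed.

def thinking_distance_ft(mph: float) -> float:
    return mph * 39.0 / 40.0           # 39 ft at 40 mph

def braking_distance_ft(mph: float) -> float:
    return (mph ** 2) * 80.0 / 1600.0  # 80 ft at 40 mph

def stopping_distance_ft(mph: float) -> float:
    return thinking_distance_ft(mph) + braking_distance_ft(mph)

print(stopping_distance_ft(40))  # -> 119.0
print(stopping_distance_ft(20))  # -> 39.5 (the article rounds to 40 ft)
```

Note how halving the speed cuts the stopping distance to roughly a third, because the braking component falls with the square of speed.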
Decision-Making Process Failure
The way an autonomous vehicle sees the roadway and obstacles in it is equal to or beyond human capability. The LIDAR system is able to see in the dark, and object recognition programs can track numerous vehicles, pedestrians and other objects at one time. However, just as with humans, all this data is useless without the ability to use it meaningfully and make correct decisions about what to do next. In the Arizona accident, the data indicated there was something or someone in the car’s path, but the system decided to ignore it.
Negligence
Because the vehicle was allowed to operate on public roads even though its software, as configured, was not capable of making the right decision, the car’s manufacturers could be seen as behaving negligently. If this had occurred in California, where product liability lawsuits do not require proving that negligence occurred, this fault could be cited as a design defect in a product liability lawsuit.
Self-Driving Car Accident Lawyer in Sacramento
I’m Ed Smith, a self-driving car accident lawyer in Sacramento. Although the technology for driverless vehicles is growing quickly, these cars and trucks are still in the testing phase. If you have been injured by a self-driving vehicle, it is important to turn to an experienced legal representative. You can reach me at (916) 921-6400 or (800) 404-5400, or fill out our contact form.
I have helped Sacramento residents and those in Northern California for 36 years with serious injuries such as amputation and traumatic brain injuries as well as pedestrian accidents.
I belong to the Million Dollar Advocates Forum, a nationwide group of trial lawyers who have won in excess of $1 million in a settlement or verdict for a client.
Go to the following to learn more about my practice:
Photo Attribution: https://pixabay.com/en/car-street-expressway-federal-street-1275930/