Why Tesla's Autopilot Crashes Spurred the Feds to Investigate Driver-Assist Technologies

And what that means for the future of self-driving cars

Hayder Radha
Published: Thursday, July 7, 2022 - 12:03

It's hard to miss the flashing lights of fire engines, ambulances, and police cars ahead of you when you're driving down the road. But in at least 11 cases from January 2018 to July 2021, Tesla's Autopilot advanced driver-assistance system did just that. This led to 11 accidents in which Teslas crashed into emergency vehicles or other vehicles at those scenes, resulting in 17 injuries and one death.

In August 2021, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into Tesla's Autopilot system in response to the crashes. The incidents took place in Arizona, California, Connecticut, Florida, Indiana, Massachusetts, Michigan, North Carolina, and Texas. It's also not the first time the federal government has investigated Tesla's Autopilot.

The NHTSA announced on June 9, 2022, that it had broadened its investigation of Tesla's Autopilot to cover 830,000 of the Model S, X, 3, and Y cars sold in the United States from 2014 to 2021, virtually every car the company has made since 2014. There have also been three additional incidents of Tesla cars crashing into first-responder vehicles since the August 2021 report.

On June 15, 2022, the administration released data about crashes from July 1, 2021, through May 15, 2022, involving cars equipped with advanced driver-assist systems from all carmakers. The data, collected in response to the investigation of Tesla cars, showed 367 crashes in cars with driver-assist technology in use during that 10-month period, including six deaths and five serious injuries.

As a researcher who studies autonomous vehicles, I believe the investigation will put pressure on Tesla to reevaluate the technologies the company uses in Autopilot, and it could influence the future of driver-assistance systems and autonomous vehicles.

How Tesla's Autopilot works

Tesla's Autopilot uses cameras, radar, and ultrasonic sensors to support two major features: Traffic-Aware Cruise Control and Autosteer. Traffic-Aware Cruise Control, also known as adaptive cruise control, maintains a safe distance between the car and the vehicles driving ahead of it. The technology primarily uses cameras in conjunction with artificial intelligence algorithms to detect surrounding objects such as vehicles, pedestrians, and cyclists, and to estimate their distances. Autosteer uses cameras to detect clearly marked lane lines to keep the vehicle within its lane.
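To make the gap-keeping idea concrete, here is a minimal sketch of a time-gap controller, the textbook pattern behind generic adaptive cruise control. This is not Tesla's algorithm; the function name, gains, and limits below are hypothetical values chosen for readability.

```python
# Toy adaptive cruise control: hold a speed-dependent gap to the car ahead.
# Illustrative only; the gains and limits are made-up values, not any
# production system's parameters.

def acc_acceleration(own_speed_mps: float,
                     gap_m: float,
                     lead_speed_mps: float,
                     time_gap_s: float = 1.5,
                     min_gap_m: float = 5.0,
                     k_gap: float = 0.3,
                     k_speed: float = 0.5,
                     max_accel_mps2: float = 2.0,
                     max_brake_mps2: float = -5.0) -> float:
    """Return a commanded acceleration in m/s^2 for the following vehicle."""
    # The desired gap grows with speed: a standstill margin plus a fixed time gap.
    desired_gap_m = min_gap_m + time_gap_s * own_speed_mps
    # Two proportional terms: close the gap error and match the lead car's speed.
    accel = (k_gap * (gap_m - desired_gap_m)
             + k_speed * (lead_speed_mps - own_speed_mps))
    # Clamp between gentle acceleration and hard braking.
    return max(max_brake_mps2, min(max_accel_mps2, accel))

# Example: at 25 m/s (about 56 mph), 30 m behind a car doing 22 m/s, the
# desired gap is 42.5 m, so the controller commands braking.
print(acc_acceleration(own_speed_mps=25.0, gap_m=30.0, lead_speed_mps=22.0))
```

In a real system, the gap and the lead vehicle's speed would come from the perception stack, cameras plus AI in Tesla's case, which is why the reliability of those distance estimates matters so much.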
In addition to its Autopilot capabilities, Tesla has been offering what it calls "full self-driving" features that include autopark and auto lane change. Since its first offering of the Autopilot system and other self-driving features, Tesla has consistently warned users that these technologies require active driver supervision and that they do not make the vehicle autonomous.

Tesla is beefing up the AI technology that underpins Autopilot. The company announced on Aug. 19, 2021, that it is building a supercomputer using custom chips. The supercomputer will help train Tesla's AI system to recognize objects seen in video feeds collected by cameras in the company's cars.

[Image: Tesla's Autopilot display shows the driver where the car thinks it is in relation to the road and other vehicles. Rosenfeld Media/Flickr, CC BY.]

Autopilot does not equal autonomous

Advanced driver-assistance systems have been available on a wide range of vehicles for many decades. The Society of Automotive Engineers (SAE) divides the degree of a vehicle's automation into six levels, from Level 0, with no automated driving features, to Level 5, which represents fully autonomous driving with no need for human intervention.

Within these six levels of autonomy, there is a clear divide between Level 2 and Level 3. In principle, at Levels 0, 1, and 2, the vehicle is primarily controlled by a human driver, with some assistance from driver-assistance systems. At Levels 3, 4, and 5, the vehicle's AI components and related driver-assistance technologies are the primary controller of the vehicle. For example, Waymo's self-driving taxis, which operate in the Phoenix area, are Level 4: They operate without human drivers, but only under certain weather and traffic conditions.

Tesla Autopilot is considered a Level 2 system, so the primary controller of the vehicle should be a human driver. This provides a partial explanation for the incidents cited by the federal investigation. Though Tesla says it expects drivers to be alert at all times when using Autopilot features, some drivers treat Autopilot as having autonomous driving capability with little or no need for human monitoring or intervention. This discrepancy between Tesla's instructions and driver behavior seems to be a factor in the incidents under investigation.

Another possible factor is how Tesla ensures that drivers are paying attention. Earlier versions of Tesla's Autopilot were ineffective at monitoring driver attention and engagement while the system was on. The company instead relied on requiring drivers to periodically move the steering wheel, which can be done without watching the road. Tesla announced in 2021 that it had begun using internal cameras to monitor drivers' attention and to alert drivers when they are inattentive.

Another, equally important factor contributing to Tesla's vehicle crashes is the company's choice of sensor technologies. Tesla has consistently avoided the use of lidar. In simple terms, lidar is like radar but with lasers instead of radio waves, and it is capable of precisely detecting objects and estimating their distances. Virtually all other major companies working on autonomous vehicles, including Waymo, Cruise, Volvo, Mercedes, Ford, and GM, use lidar as an essential technology for enabling automated vehicles to perceive their environments.

By relying on cameras, Tesla's Autopilot is prone to failures caused by challenging lighting conditions such as glare and darkness. In its announcement of the Tesla investigation, the NHTSA reported that most of the incidents occurred after dark in the presence of flashing emergency vehicle lights, flares, or other lights. Lidar, in contrast, can operate under any lighting conditions and can "see" in the dark.

[Video: News coverage of a Tesla driving in Autopilot mode that crashed into the back of a stationary police car.]

Fallout from the investigation

The investigation could eventually lead to changes in future versions of Tesla's Autopilot and its other self-driving systems. It might also indirectly affect the deployment of future autonomous vehicles more broadly. In particular, the investigation may reinforce the need for lidar.

Although reports in May 2021 indicated that Tesla was testing lidar sensors, it is not clear whether the company was quietly considering the technology or using it to validate its existing sensor systems. Tesla CEO Elon Musk called lidar "a fool's errand" in 2019, saying it is expensive and unnecessary. However, just as Tesla is revisiting the systems that monitor driver attention, the NHTSA investigation could push the company to consider adding lidar or similar technologies to future vehicles.
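To see why lidar can measure distance precisely regardless of ambient light, consider the time-of-flight arithmetic at its core: the sensor emits a laser pulse and times the reflection. The sketch below is a generic textbook calculation with an assumed round-trip time, not any vendor's interface.

```python
# Generic time-of-flight ranging, the principle behind lidar (and radar):
# the pulse travels out and back, so distance = speed_of_light * time / 2.
# The round-trip time below is an assumed, illustrative value.

SPEED_OF_LIGHT_MPS = 299_792_458  # meters per second

def range_from_round_trip(round_trip_s: float) -> float:
    """Distance in meters to the object that reflected the pulse."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2.0

# A pulse that returns after 200 nanoseconds bounced off an object about 30 m away.
print(f"{range_from_round_trip(200e-9):.2f} m")  # prints 29.98 m
```

Because the measurement comes from timing a laser pulse rather than interpreting an image, darkness and glare have little effect on it, which is the crux of the camera-versus-lidar debate above.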
This is an updated version of an article originally published on August 23, 2021.

This article is republished from The Conversation under a Creative Commons license. Read the original article.
About The Author
Hayder Radha is a Michigan State University Foundation professor of electrical and computer engineering, and an international leader in the broad fields of multidimensional signal processing and visual analysis.
© 2023 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
"Quality Digest" is a trademark owned by Quality Circle Institute, Inc.
Comments
Driver Assist Technology
It will be interesting to see whether it is established that driver-assist technology reduces the frequency or cost of car accidents. We see some spectacular failures, but who's to say whether many, perhaps equally significant, crashes were avoided.
The big difference? With auto-pilot at the wheel, who do you sue?