DETROIT – (AP) – The U.S. government has opened a formal investigation into Tesla's partially automated Autopilot driving system after a series of collisions with parked emergency vehicles.
The investigation covers 765,000 vehicles, nearly everything Tesla has sold in the U.S. since the start of the 2014 model year. Of the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed.
NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or traffic-aware cruise control have hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards. The agency announced the action Monday in a posting on its website.
The investigation is another sign that NHTSA under President Joe Biden is taking a tougher stance on automated vehicle safety than under previous administrations. The agency had been reluctant to regulate the new technology for fear of hampering adoption of the potentially life-saving systems.
The investigation covers Tesla's entire current model lineup: the Models Y, X, S and 3 from the 2014 through 2021 model years.
The National Transportation Safety Board, which has also investigated some of the Tesla crashes dating back to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not acted on either recommendation. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.
“Today’s action by NHTSA is a positive step forward for safety,” said NTSB Chair Jennifer L. Homendy in a statement on Monday. “As we move into the emerging world of advanced driver assistance systems, it is important that NHTSA has insight into what these vehicles can and cannot do.”
Last year, the NTSB blamed Tesla, drivers, and lax regulation by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes for failing to make sure automakers put safeguards in place to limit use of electronic driving systems.
The agency reached its conclusions after investigating a 2019 crash in Delray Beach, Florida, that killed the 50-year-old driver of a Tesla Model 3. The car was on Autopilot when neither the driver nor the system braked or tried to avoid a tractor-trailer crossing in its path.
“We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will foreseeably be misused in ways that lead to crashes, injuries and deaths,” said Jason Levine, executive director of the nonprofit Center for Auto Safety, an advocacy group. “If anything, this probe needs to go far beyond crashes involving first responder vehicles, because the danger is to all drivers, passengers and pedestrians when Autopilot is engaged.”
Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California freeway.
A message was left seeking comment from Tesla, which has disbanded its media relations office. Shares of Tesla Inc., based in Palo Alto, California, fell 4.3% on Monday.
NHTSA has sent investigation teams to 31 crashes involving partially automated driver-assist systems since June 2016. Such systems can keep a vehicle centered in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot, and 10 deaths were reported, according to the agency.
Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to the crossing tractor-trailers, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.
The NHTSA investigation is long overdue, said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University who studies automated vehicles.
Tesla's failure to effectively monitor drivers to make sure they are paying attention should be a top priority of the probe, Rajkumar said. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.
“It’s very easy to get around the steering pressure thing,” said Rajkumar. “It’s been going on since 2014. We’ve been discussing this for a long time.”
The emergency vehicle crashes cited by NHTSA began on January 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a parked fire truck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.
Since then, the agency said, there have been crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.
“The investigation will assess the technologies and methods used to monitor, assist and enforce the driver's engagement with the dynamic driving task during Autopilot operation,” NHTSA said in its investigation documents.
In addition, the probe will cover the system's object and event detection, as well as where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes, as well as similar crashes.
The investigation could lead to a recall or other enforcement action by NHTSA.
“NHTSA reminds the public that no commercially available vehicles today are capable of driving themselves,” the agency said in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for the operation of their vehicles.”
The agency said it has “robust enforcement tools” to protect the public and investigate potential safety problems, and that it will act when it finds evidence of “noncompliance or an unreasonable risk to safety.”
In June, NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver-assist systems.
Tesla uses a camera-based system, heavy computing power, and at times radar to spot obstacles, determine what they are, and then decide what the vehicles should do. But Carnegie Mellon's Rajkumar said the company's radar was plagued by “false positives” and would stop cars after determining that overpasses were obstacles.
Now Tesla has eliminated radar in favor of cameras and thousands of images that the computer neural network uses to determine whether there are objects in the way. The system, he said, does a good job with most objects that would be seen in the real world, but it has had trouble with parked emergency vehicles and trucks turned perpendicular in its path.
“It can only find patterns that it has been trained on,” Rajkumar said. “Clearly the inputs that the neural network was trained on do not contain enough images. They're only as good as the inputs and the training. Almost by definition, the training will never be good enough.”
Tesla is also letting selected owners test what it calls a “Full Self-Driving” system. Rajkumar said that should be investigated as well.
Copyright 2021 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed in any way without permission.