Federal safety regulator is investigating Tesla’s Full Self-Driving software

The top U.S. auto safety regulator has opened a new investigation into Tesla’s “Full Self-Driving (Supervised)” software after four reported crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation announced on Friday that it is examining the driver-assistance system to determine whether it can “detect and respond appropriately to low visibility conditions on the road,” such as “sun glare, fog or airborne dust.” The agency also wants to know whether any other crashes beyond those reported have occurred under such conditions.

The investigation comes just a week after Tesla CEO Elon Musk unveiled his company’s “CyberCab” prototype, a two-seat vehicle he says will operate as a robotaxi after years of unfulfilled promises. During the event, Musk also said that Tesla’s Model 3 sedan and Model Y SUV would be able to operate unsupervised in California and Texas at some point in 2025, though he offered no details on how that would happen.

In April, NHTSA closed a nearly three-year investigation into Autopilot, Tesla’s less capable driver-assistance software, after examining almost 500 crashes in which the system was active. The agency determined that 13 of those crashes were fatal. At the same time it closed that investigation, NHTSA opened a new one into the recall fix Tesla had issued to address Autopilot’s problems.

Tesla’s software faces other legal threats as well. The Department of Justice is investigating Tesla’s claims about its driver-assistance features, and the California Department of Motor Vehicles has accused the company of overstating the software’s capabilities.

The company is also facing a number of lawsuits related to Autopilot crashes. It settled one of the most prominent cases, which had been set to go to trial earlier this year. Tesla has said previously that it makes drivers aware that they must continuously monitor Full Self-Driving and Autopilot and be ready to take control at a moment’s notice.

The new investigation announced on Friday specifically identifies four crashes in which Full Self-Driving (Supervised) was active, all of which occurred between November 2023 and May 2024.

The November 2023 crash occurred in Rimrock, Arizona, where a Model Y struck and killed a pedestrian. Another crash took place in January 2024 in Nipton, California, where a Model 3 collided with another vehicle on a highway during a dust storm. In March 2024, a Model 3 collided with another vehicle on a highway in Red Mills, Virginia, in cloudy weather. In May 2024, a Model 3 crashed into a stationary object on a rural road in Collinsville, Ohio, in fog. NHTSA noted that someone was injured in the May 2024 crash.

NHTSA’s Office of Defects Investigation conducts its inquiries at four levels: defect petition, preliminary evaluation, recall query, and engineering analysis. The agency classified this new investigation as a preliminary evaluation. NHTSA typically aims to complete that type of investigation within eight months.

This article was originally published on techcrunch.com.
