After a Tesla crash killed three people in Newport Beach, the federal government is investigating the role of Autopilot


The federal government is investigating whether a Tesla involved in a crash that killed three people and injured three others last week in Newport Beach had its Autopilot system activated at the time of the collision.

A special crash investigation team was dispatched for the May 12 incident on Pacific Coast Highway, the National Highway Traffic Safety Administration said Wednesday.

In that accident, Newport Beach police were called at approximately 12:45 p.m. to the 3000 block of Pacific Coast Highway, where they learned that a 2022 Tesla Model S sedan had run off the road and struck construction equipment.

Three people were found dead in the Tesla; they were identified last week as Crystal McCallum, 34, of Texas; Andrew James Chaves, 32, of Arizona; and Wayne Walter Swanson Jr., 40, of Newport Beach, according to the Orange County Sheriff's Department.

Three construction workers suffered non-life-threatening injuries, police said, adding that the department's major accident investigation team had been called in.

Tesla, which dissolved its media relations department, did not respond Wednesday to a request for comment from The Times regarding NHTSA's investigation into the Orange County crash.

The federal investigation is part of the agency's broader inquiry into crashes involving advanced driver-assistance systems such as Tesla's Autopilot. Investigators have been dispatched to 34 crashes since 2016 in which the systems were in use or suspected to have been in use; 28 of those crashes involved Teslas, according to an NHTSA document released Wednesday.

In those 34 crashes, 15 people were killed and at least 15 others were injured; all of the deaths occurred in the Tesla-related crashes, according to the document.

The NHTSA told The Times on Wednesday night that it would not comment on the open investigations.

In addition to those crashes, NHTSA is investigating several incidents in which Teslas on Autopilot collided with emergency vehicles parked along highways despite flashing lights or warning cones, as well as a number of complaints that the Autopilot system engaged in high-speed "phantom braking" for no apparent reason.

NHTSA is also investigating two crashes involving Volvos, one involving a Navya shuttle, two involving Cadillacs, one in a Lexus and one in a Hyundai, dating back to March 2018.

In Los Angeles County, the District Attorney's Office in January filed what appears to be the first felony prosecution in the United States of a driver accused of causing deaths while using a partially automated driver-assistance system.

The charges were filed two years after the accident in Gardena.

Kevin George Aziz Riad, 27, was driving a 2016 Tesla Model S on Autopilot on Dec. 29, 2019, when he exited a freeway, ran a red light and crashed into a Honda Civic.

The driver of the Civic, Gilberto Alcázar López, and his passenger, María Guadalupe Nieves López, died instantly.

Riad faces two counts of vehicular manslaughter.

Tesla has warned drivers who use Autopilot, as well as its so-called Full Self-Driving system, that the cars cannot drive themselves and that drivers must be prepared to intervene at all times.

Last June, NHTSA ordered dozens of auto and technology companies to report crash data to better monitor the safety of the systems.

No commercially available motor vehicle is capable of driving itself, the agency said.

"Whether or not an automated driving system [Level 2] is in use, every vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicle," according to an NHTSA spokesperson. "Some advanced driver-assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all motor vehicle technologies and equipment, drivers must use them correctly and responsibly."

Many legal experts say that responsibility for Level 2 systems such as Autopilot lies entirely with the driver, not with the companies marketing technologies that could lead consumers to believe the features are more capable than they are.

But the California Department of Motor Vehicles is grappling with confusion over Tesla's Full Self-Driving feature, an advanced version of Autopilot meant to do precisely what its name suggests: provide fully autonomous driving, to the point where no human driver is needed.

While other self-driving car developers, such as Waymo and Argo, use trained drivers who adhere to strict safety protocols, Tesla conducts its tests with its own customers, charging car owners $12,000 for the privilege.

Other self-driving companies are required to report system failures and crashes to the DMV as part of its testing permit program, but the agency has allowed Tesla to opt out of those regulations.

After pressure from state lawmakers, sparked by alarming videos on YouTube and Twitter highlighting the poor performance of Full Self-Driving, the DMV said in January it was "reviewing" its stance on Tesla's technology.

The agency is also conducting a review to determine whether Tesla is violating another DMV regulation with its Full Self-Driving system, one that prohibits companies from marketing their cars as autonomous vehicles when they are not.

Times staff writer Russ Mitchell and The Associated Press contributed to this report.

This story originally appeared in the Los Angeles Times.
