
Detroit - The U.S. government has opened a formal investigation into Tesla's Autopilot, a partially automated driving system, after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles, almost everything Tesla has sold in the United States since the start of the 2014 model year. Among the crashes identified by the National Highway Traffic Safety Administration as part of the investigation, 17 people were injured and one was killed.

NHTSA said that since 2018 it has identified 11 crashes in which Teslas on Autopilot or Traffic Aware Cruise Control hit vehicles at scenes where emergency personnel used flashing lights, flares, an illuminated arrow board or cones to warn of hazards. The agency announced the action Monday in a posting on its website.

The investigation is another sign that NHTSA under President Joe Biden is taking a tougher stance on automated vehicle safety than previous administrations did. Previously, the agency was reluctant to regulate the new technology for fear of hampering the adoption of systems that could potentially save lives.

The investigation covers Tesla's entire current model lineup, the Models Y, X, S and 3, from the 2014 through 2021 model years.

The National Transportation Safety Board, which has also investigated some of the Tesla crashes dating back to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not acted on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

Last year, the National Transportation Safety Board blamed Tesla, drivers and lax oversight by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes by failing to make sure automakers put safeguards in place to limit the use of electronic driving systems.

The agency made its determination after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing its path.

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while the car traveled down a California highway.

A message was left early Monday seeking comment from Tesla, which has disbanded its media relations office.

Since June 2016, NHTSA has sent investigation teams to 31 crashes involving partially automated driver-assist systems. Such systems can keep a vehicle centered in its lane and maintain a safe distance from the vehicle in front of it. Of those crashes, 25 involved Tesla Autopilot, and 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using these systems must be ready to intervene at all times. In addition to crossing semis, Teslas using Autopilot have crashed into parked emergency vehicles and roadway barriers.

Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University who studies self-driving cars, said that the NHTSA investigation is overdue.

Tesla's failure to effectively monitor drivers to make sure they are paying attention should be the top priority of the investigation, Rajkumar said. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.

“It’s easy to bypass the steering pressure,” Rajkumar said. “It started in 2014. We have been discussing it for a long time.”

The crashes into emergency vehicles cited by NHTSA began on January 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a firetruck parked in the traffic lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said, there have been crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

NHTSA said in its investigation documents: "The investigation will assess the technologies and methods used to monitor, assist and enforce the driver's engagement with the dynamic driving task during Autopilot operation."

In addition, the probe will cover the system's detection of objects and events, as well as where it is allowed to operate. The National Highway Traffic Safety Administration said it will also examine the "contributing factors" to these crashes and to similar crashes.

The investigation could lead to a recall or other enforcement action by NHTSA.

"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," the agency said in a statement. "Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for the operation of their vehicles."

The agency said it has "robust enforcement tools" to protect the public and investigate potential safety issues, and that it will take action when it finds evidence of "noncompliance or an unreasonable risk to safety."

In June, NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver-assist systems.

Shares of Tesla Inc., headquartered in Palo Alto, California, fell 3.5% at Monday's opening bell.

Tesla uses a camera-based system, a lot of computing power, and sometimes radar to spot obstacles, determine what they are, and then decide what the vehicle should do. But Rajkumar of Carnegie Mellon University said the company's radar was plagued by "false positive" signals and would stop vehicles after determining that overpasses were obstacles.

Tesla has now eliminated radar, relying instead on cameras and a neural network of computers that uses thousands of images to determine whether objects are in the way. He said the system does very well on most objects that would be seen in the real world, but it has had trouble with parked emergency vehicles and trucks crossing perpendicular to its path.

"It can only find patterns that it has been, quote unquote, trained on," Rajkumar said. "Clearly the inputs used to train the neural network do not contain enough images. They're only as good as the inputs and training. Almost by definition, the training will never be good enough."

Tesla is also letting selected owners test what it calls a "Full Self-Driving" system. Rajkumar said that should be investigated as well.
