What's new

Tesla Model S - Self-Driving demo

Two artificially intelligent self-driving cars crashing into each other is all but guaranteed, as testing has shown. A bad idea in the long run.
 
https://www.theguardian.com/us-news/2021/may/15/tesla-fatal-california-crash-autopilot

A Tesla car involved in a fatal crash on a southern California freeway last week may have been operating on autopilot, according to the California highway patrol.

The 5 May crash in Fontana, a city 50 miles east of Los Angeles, is also under investigation by the National Highway Traffic Safety Administration (NHTSA). It is the 29th case involving a Tesla that the federal agency has investigated.

In the Fontana crash, a 35-year-old man was killed when his Tesla Model 3 struck an overturned semi on a freeway at about 2.30am. The driver’s name has not been made public. Another man was seriously injured when the electric vehicle hit him as he was helping the semi’s driver out of the wreck.

The state highway patrol (CHP) announced on Thursday that its preliminary investigation had determined the Tesla’s partially automated driving system “was engaged”.

But on Friday the agency walked back its previous declaration. “To clarify,” a new statement said, “there has not been a final determination made as to what driving mode the Tesla was in or if it was a contributing factor to the crash.”

At least three people have died in previous US crashes involving the system.

The CHP initially said it was commenting on the Fontana crash because of the “high level of interest” about Tesla crashes and because it was “an opportunity to remind the public that driving is a complex task that requires a driver’s full attention”.

The federal investigation comes after the CHP arrested a man who authorities have said was in the back seat of a Tesla driving on Interstate 80 near Oakland with no one behind the wheel.

CHP has not said whether officials have determined that the Tesla in the I-80 incident was on autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it. But it is likely that either autopilot or full self-driving was in operation for the driver to be in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
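
To unpack those terms: lane centering and gap keeping can each be thought of as a simple feedback loop, one steering toward the lane center and one adjusting speed to hold a time gap to the vehicle ahead. The sketch below is purely illustrative; the function names, gains, and proportional-control scheme are invented for this note and bear no relation to Tesla's actual implementation.

# Illustrative sketch only: NOT Tesla's implementation. All names, gains,
# and the simple proportional-control scheme here are hypothetical.

def lane_centering_steer(lateral_offset_m: float, gain: float = 0.5) -> float:
    """Steering command proportional to distance from lane center.
    A positive offset (drifted right) yields a negative (leftward) correction."""
    return -gain * lateral_offset_m

def gap_keeping_speed(own_speed_mps: float,
                      gap_m: float,
                      desired_time_gap_s: float = 2.0,
                      gain: float = 0.3) -> float:
    """Speed command that tries to hold a fixed time gap to the lead vehicle.
    If the actual gap is shorter than desired, the command slows the car."""
    desired_gap_m = own_speed_mps * desired_time_gap_s
    return own_speed_mps + gain * (gap_m - desired_gap_m)

if __name__ == "__main__":
    # Car drifting 0.4 m right of center, at 30 m/s, 45 m behind a truck.
    print(lane_centering_steer(0.4))      # -0.2  -> steer slightly left
    print(gap_keeping_speed(30.0, 45.0))  # 25.5  -> slow down; gap too short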

Tesla, which has disbanded its public relations department, did not respond to an email seeking comment. The company says in owner’s manuals and on its website that both autopilot and full self-driving are not fully autonomous and that drivers must pay attention and be ready to intervene at any time.

Autopilot has had trouble dealing with stationary objects and traffic crossing in front of Teslas. In two Florida crashes, in 2016 and 2019, cars with autopilot in use drove beneath crossing tractor-trailers, killing the men driving the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on autopilot was killed when his Tesla struck a highway barrier.

Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several firetrucks and police vehicles stopped on freeways with their flashing emergency lights on.

In March, the NHTSA sent a team to investigate after a Tesla on autopilot ran into a Michigan state police vehicle on I-96 near Lansing. Neither the trooper nor the 22-year-old Tesla driver was injured, police said.

After the Florida and California fatal crashes, the National Transportation Safety Board (NTSB) recommended Tesla develop a stronger system to ensure drivers are paying attention, and limit use of autopilot to highways where it can work effectively. Neither Tesla nor the safety agency took action.
 
https://www.reuters.com/business/autos-transportation/tesla-crash-victim-lauded-full-self-driving-videos-tiktok-2021-05-16/

A Tesla driver killed in a recent accident in California had praised the automaker's "full self-driving" features and posted videos on his apparent TikTok account in which he appeared to drive with his hands off the wheel.

On May 5, a Tesla Model 3 crashed into an overturned truck on a highway in Fontana, killing the Tesla driver and injuring the truck driver and a motorist who had stopped to help him.

The Associated Press news agency cited police as saying a preliminary investigation had determined the Tesla's driver-assistant system, Autopilot, was engaged prior to the crash.

But in a correction issued late on Friday, police said, "There has not been a final determination made as to what driving mode the Tesla was in."

Two videos of a man driving with his hands off the wheel were posted on the alleged TikTok account of the victim, 35-year-old Steven Hendrickson of Running Springs, California.

"What would do I do without my full self-driving Tesla after a long day at work," said a message on one. "Coming home from LA after work, thank god, self-drive," said a comment on another video, adding, "Best car ever!"

Tesla dubbed its driver-assistant features "Autopilot" and "Full Self-Driving", names that experts say could mislead consumers into believing the car can drive by itself.

On its website, however, Tesla says its Autopilot feature does not make the vehicle autonomous.

In a video posted on his Facebook account, Hendrickson filmed himself driving on autopilot, saying, "Don't worry. I am on autopilot."

Family members were not available for comment, and Tesla, which has disbanded its public relations team, did not immediately respond.

Tesla Club-SoCal, a group of Tesla owners in Southern California, said on social media that Hendrickson was an active member who "loved his Tesla." He is survived by his wife and two children, it added.

The National Highway Traffic Safety Administration has been investigating more than two dozen crashes of Tesla vehicles, including the Fontana crash and a high-profile one in Texas last month that killed two men.

Since 2016, at least three Tesla vehicles operating on Autopilot have been in fatal crashes, two involving a Tesla car driving beneath a semi-truck in Florida.

The U.S. transportation safety board said Tesla's Autopilot system had failed to properly detect a truck crossing the car's path, and that a lack of driver attention and the absence of an adequate driver-monitoring system also contributed to the accidents.
 
https://www.theguardian.com/technology/2021/aug/16/teslas-autopilot-us-investigation-crashes-emergency-vehicles

The US government has opened a formal investigation into Tesla’s driver-assistance system known as Autopilot after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the US since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration (NHTSA) as part of the investigation, 17 people were injured and one was killed.

NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action on Monday in a posting on its website.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the agency said.

The investigation covers Tesla’s entire current model lineup, the Models Y, X, S and 3 from the 2014 through 2021 model years.

The National Transportation Safety Board (NTSB), which also has investigated some of the Tesla crashes, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate.

The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies such as NHTSA.

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat while a car rolled down a California highway.

The investigation is the latest in a series launched by NHTSA, which has opened at least 30 crash investigations involving Tesla cars that it suspected were linked to Autopilot. One investigation into a 2016 crash cleared Tesla’s Autopilot of any blame.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. Teslas using the system have crashed into semis crossing in front of them, stopped emergency vehicles and a roadway barrier.

A message was left early on Monday seeking comment from Tesla, which has disbanded its media relations office. Earlier this month Tesla tweeted that “a Tesla with Autopilot engaged experienced 0.2 accidents per million miles driven, while the US average was 9x higher”.
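
For context, taking the tweet's figures at face value: a US average 9x higher than 0.2 accidents per million miles works out to roughly 0.2 × 9 = 1.8 accidents per million miles.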

The crashes into emergency vehicles cited by NHTSA began on 22 January 2018 in Culver City near Los Angeles when a Tesla using Autopilot struck a firetruck that was parked partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said, there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise county, Arizona; Charlotte, North Carolina; Montgomery county, Texas; Lansing, Michigan; and Miami, Florida.

“The investigation will assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” NHTSA said in investigation documents.

In addition, the investigation will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes, as well as similar crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”

The agency said it has “robust enforcement tools” to protect the public and investigate potential safety issues, and it will act when it finds evidence “of noncompliance or an unreasonable risk to safety”.

In June NHTSA ordered all automakers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.

The measures show the agency has started to take a tougher stance on automated vehicle safety than in the past. It has been reluctant to issue any regulations of the new technology for fear of hampering adoption of the potentially life-saving systems.

Shares of Tesla, based in Palo Alto, California, fell nearly 2% before the opening bell.
 
https://www.reuters.com/world/us/us-opens-investigation-into-580000-tesla-vehicles-over-game-feature-2021-12-22/

U.S. auto safety regulators said on Wednesday they have opened a formal safety investigation into 580,000 Tesla vehicles sold since 2017 over the automaker's decision to allow games to be played on the front center touchscreen.

The National Highway Traffic Safety Administration (NHTSA) said its preliminary evaluation covers various 2017-2022 Tesla Model 3, S, X, and Y vehicles. This functionality, referred to as “Passenger Play,” "may distract the driver and increase the risk of a crash," the agency said.

NHTSA said it has "confirmed that this capability has been available since December 2020 in Tesla 'Passenger Play'-equipped vehicles." Before then, the game feature "was enabled only when the vehicle was in Park."

NHTSA said in a statement on Wednesday it was "committed to ensuring the highest safety standards on the nation’s roadways."

The agency said the decision to open the investigation was based on reports that "Tesla’s gameplay functionality is visible from the driver's seat and can be enabled while driving the vehicle."

The Governors Highway Safety Association said on Wednesday it was pleased with NHTSA's Tesla safety investigation and wanted "to remind all drivers to be alert and focused on the road when you're behind the wheel."

Tesla did not immediately comment.

NHTSA said it would "evaluate aspects of the feature, including the frequency and use scenarios of Tesla 'Passenger Play'."

Earlier this month, the New York Times highlighted the game feature, prompting NHTSA to say it was in discussions with Tesla about its concerns.

The agency noted earlier in December that distracted driving accounts for a significant number of U.S. road deaths - 3,142 in 2019 alone. Safety advocates have said official figures underestimate the problem because not all drivers involved in crashes later admit they were distracted.

The Times said the Tesla update added three games - Solitaire, a jet fighter game and a conquest strategy scenario - and said that vehicles have warnings reading: "Playing while the car is in motion is only for passengers."

The paper said the game feature asks for confirmation that the player is a passenger, though a driver could still play simply by pressing a button.

In 2013, NHTSA issued guidelines to encourage automakers "to factor safety and driver distraction-prevention into their designs and adoption of infotainment devices in vehicles."

The guidelines "recommend that in-vehicle devices be designed so that they cannot be used by the driver to perform inherently distracting secondary tasks while driving," the agency said.

The agency in August opened a safety investigation into 765,000 Tesla vehicles over the carmaker's driver-assistance system Autopilot after a series of crashes involving the system and parked emergency vehicles.

A preliminary evaluation is a first step before NHTSA decides whether to upgrade a probe to an engineering analysis, which must happen before the agency can demand a recall.

NHTSA said it received a complaint in November about the game feature from a Tesla Model 3 driver in Oregon, who said: "Creating a dangerous distraction for the driver is recklessly negligent."

On Nov. 29, Daimler's Mercedes-Benz recalled 227 U.S. vehicles (2021 model year S580; 2022 EQS450, EQS580, and S500) because the vehicle infotainment systems "might allow activation of the television and internet display while driving, causing a distraction for the driver."
 