On Saturday (17), an accident involving a 2019 Model S killed two men in Texas. Local authorities believe the vehicle was being guided by Autopilot, as neither occupant was in the driver’s seat. Tesla’s CEO disagrees: via Twitter, Elon Musk stated that the data recovered so far indicates the system was not activated.
Police told the press that the car was traveling at high speed on a highway when, around 23:00 local time, it left the road on a curve. The vehicle hit a tree and caught fire.
It took firefighters nearly four hours to extinguish the fire, using about 30,000 gallons of water in the operation. Given the difficulty of controlling the blaze, they even contacted Tesla to ask how to proceed. It is not clear, however, whether the company provided any guidance.
The firefighters likely struggled to put out the fire because lithium batteries, once ignited, can reignite repeatedly due to the energy they store.
When the fire was finally extinguished, firefighters found two men inside the vehicle. Their identities have not yet been released, but they are known to have been 59 and 69 years old.
One of them was in the back seat. The other was found in the front, but in the passenger seat. The driver’s seat was empty. This led police to conclude that no one was driving the car at the time of the accident, and as a consequence, the safety of Tesla’s Autopilot has been called into question.
The possibility was raised that the driver had been thrown from his seat by the force of the collision. However, local authorities stated that, based on their experience with accident investigations, the occupants’ positions left no doubt that neither of them was driving the vehicle at the time of the crash.
Elon Musk comes to Tesla’s defense
On Monday (19), Elon Musk spoke on the subject. The entrepreneur took issue with the Wall Street Journal’s coverage of the accident. The newspaper quoted critics who say the company does not do enough to prevent drivers from becoming overly dependent on the automated features of the brand’s vehicles, or from using these technologies in situations for which they were not designed.
Via Twitter, Musk stated that the data recovered so far indicates that Autopilot was not enabled at the time of the accident, and that the owner had not purchased Full Self-Driving (FSD), a package that adds more autonomous features to the vehicle.
Even if these features had been enabled, a person would still need to be at the wheel. Tesla itself warns that, at their current stage, none of these systems offer fully autonomous driving. There are several circumstances in which the driver must take control of the car.
In his tweet, Elon Musk points out that the vehicle cannot drive autonomously on roads without lane markings, for example. The road where the accident occurred has no such markings.
It is therefore possible that the autonomous driving system was being used in conditions it was not designed for. Did Tesla fail to detect this circumstance, or were the vehicle’s sensors somehow circumvented? That is what the investigation must determine.
Regardless of the outcome, Tesla will not escape this uncomfortable situation any time soon. The company has been criticized for using names like Autopilot and Full Self-Driving. In the opinion of experts, these terms may lead users to believe that such systems allow fully autonomous operation when, in fact, they are driver-assistance mechanisms.
Currently, the National Highway Traffic Safety Administration (NHTSA) of the United States is investigating, in addition to this one, another 27 accidents involving Tesla cars.
With information: The Verge, CNBC.