New MIT study confirms Tesla’s Autopilot is indeed unsafe


A month ago, towards the end of August 2021, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into Tesla’s Autopilot system after it was found responsible for 11 accidents, resulting in 17 injuries and one death. Now a new study conducted by the Massachusetts Institute of Technology (MIT) has confirmed just how unsafe Elon Musk’s infamous Autopilot feature really is.

Titled “A model for naturalistic gaze behavior during Tesla Autopilot deactivations,” the study supports the idea that the EV maker’s “Full Self-Driving” (FSD) system is actually — surprise, surprise — not as safe as it claims. After following Tesla Model S and X owners in the Boston area as they went about their daily routines for a year or more, MIT researchers found that drivers tend to become inattentive when using semi-automated driving systems. Notice here that I’ve gone from calling Autopilot a full self-driving system — the term Tesla uses to describe it, implying it’s fully autonomous — to calling it a semi-automated driving system, also known as an Advanced Driver Assistance System (ADAS), which is what it really is.

“Visual behavior patterns change before and after [Autopilot] disengagement,” the study said. “Before disengagement, drivers looked less at the road and focused more on non-driving-related areas than they did after transitioning to manual driving. The higher proportion of off-road glances before switching to manual driving was not compensated by longer glances ahead.” To be entirely fair, it makes sense that drivers would be less inclined to pay attention if they believed Autopilot had full control of their car. The only thing is, it doesn’t.
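For the curious, the comparison the researchers describe boils down to measuring how much of the driver’s glance time lands off the road in a window before the disengagement versus a window after it. Here is a rough sketch of that idea in Python; the glance data, region labels, and 10-second window are made-up illustrations, not the study’s actual dataset or code:

```python
# A loose sketch of the kind of glance comparison the MIT study describes.
# The data format and window length are illustrative assumptions only.

# Each glance is (start_seconds, end_seconds, region), where region is
# "road" or an off-road area such as "center_stack" or "phone".
glances = [
    (0.0, 4.0, "road"),
    (4.0, 7.5, "center_stack"),   # off-road glance before disengagement
    (7.5, 10.0, "road"),
    (10.0, 12.0, "phone"),        # off-road glance before disengagement
    (12.0, 20.0, "road"),         # manual driving after disengagement
    (20.0, 21.0, "mirror"),
    (21.0, 30.0, "road"),
]
disengagement_time = 12.0  # moment the driver takes back manual control

def off_road_share(glances, t_start, t_end):
    """Fraction of time spent looking away from the road in [t_start, t_end)."""
    total = off_road = 0.0
    for start, end, region in glances:
        overlap = max(0.0, min(end, t_end) - max(start, t_start))
        total += overlap
        if region != "road":
            off_road += overlap
    return off_road / total if total else 0.0

window = 10.0  # seconds examined on each side of the disengagement
before = off_road_share(glances, disengagement_time - window, disengagement_time)
after = off_road_share(glances, disengagement_time, disengagement_time + window)
print(f"off-road glance share before: {before:.0%}, after: {after:.0%}")
```

With these toy numbers the driver looks away 55% of the time before the handover and only 10% after it, which is exactly the pattern the study reports.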

Meanwhile, by the end of this week Tesla will roll out the latest version of its Autopilot beta software, in this case version 10.0.1, on public roads – ignoring the ongoing federal investigation into the safety of its system. Billionaire tings, imagine.

Musk also clarified that not everyone who paid for the FSD software will get access to the beta version, which promises more automated driving features. First things first, Tesla will use telemetry data to collect personal driving data over a 7-day period to ensure drivers continue to remain vigilant. “The data could also be used to implement a new safety ratings page that tracks the owner’s vehicle, which would be linked to their insurance,” TechCrunch reported.
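To give a concrete sense of what rolling a week of telemetry into a single rating could look like, here is a toy sketch. The event types, weights, and 0–100 scale below are my own assumptions for illustration, not Tesla’s actual formula:

```python
# A hypothetical sketch of rolling up a 7-day telemetry window into a single
# "safety rating". Event names, weights, and the scoring scale are assumed.

from dataclasses import dataclass

@dataclass
class DayTelemetry:
    miles: float
    hard_braking_events: int
    forward_collision_warnings: int
    hands_off_wheel_alerts: int

WEIGHTS = {  # penalty per event per 100 miles (assumed values)
    "hard_braking_events": 2.0,
    "forward_collision_warnings": 5.0,
    "hands_off_wheel_alerts": 3.0,
}

def weekly_safety_score(days: list[DayTelemetry]) -> float:
    """Start from 100 and subtract weighted event rates over the 7-day window."""
    total_miles = sum(d.miles for d in days) or 1.0
    score = 100.0
    for field, weight in WEIGHTS.items():
        events = sum(getattr(d, field) for d in days)
        score -= weight * events / (total_miles / 100.0)
    return max(0.0, min(100.0, score))

week = [DayTelemetry(miles=40, hard_braking_events=1,
                     forward_collision_warnings=0, hands_off_wheel_alerts=2)
        for _ in range(7)]
print(weekly_safety_score(week))  # 80.0 with the assumed weights
```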

In other words, Musk is aware of the risks the current Autopilot system poses, and he’s working hard to improve it, or at least to make sure he doesn’t bear the blame if more Tesla-related accidents happen. How do you admit your autopilot isn’t really an autopilot without saying it outright and risking damage to your brand? You release a newer version of it that makes it easy to blame drivers for their negligence, duh.

“The researchers found that this type of behavior could be the result of misunderstanding what the [Autopilot] feature can do and what its limitations are, a misunderstanding that is reinforced when it works well. Of course, drivers whose tasks are automated for them may become bored after trying to maintain visual and physical attention, which researchers say only leads to further inattention,” TechCrunch continued.

My opinion on Musk and Tesla aside, the MIT study isn’t about shaming Tesla; rather, it advocates for driver attention management systems that can provide real-time feedback to the driver or adjust automation features to the driver’s level of alertness. Currently, Tesla’s Autopilot system doesn’t monitor driver alertness via eye or head tracking — two things researchers believe are necessary.
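What would such a driver attention management system look like in practice? At its simplest, it’s a loop that tracks how long the driver’s gaze has been off the road and escalates its response. The sketch below is a bare-bones illustration of that idea; the thresholds and responses are assumptions, not any manufacturer’s implementation:

```python
# A minimal sketch of the attention-management idea the study advocates:
# watch how long the driver's gaze has stayed off the road and escalate from
# a visual reminder to restricting automation. Thresholds are assumed.

def attention_response(seconds_off_road: float) -> str:
    if seconds_off_road < 2.0:
        return "ok"                 # normal glance behavior, do nothing
    if seconds_off_road < 4.0:
        return "visual_warning"     # dashboard prompt to look at the road
    if seconds_off_road < 6.0:
        return "audible_warning"    # chime plus prompt
    return "limit_automation"       # slow down and require manual takeover

# Example: a stream of per-second gaze samples from an eye/head tracker
gaze_samples = ["road", "road", "phone", "phone", "phone", "phone", "phone"]
off_road = 0.0
for sample in gaze_samples:
    off_road = 0.0 if sample == "road" else off_road + 1.0
    print(f"{sample:>6}: {attention_response(off_road)}")
```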

The technology in question – a model for gaze behavior – already exists; car manufacturers like Mercedes-Benz and Ford are reportedly already working on implementing it. Will Tesla follow suit, or will Musk’s “only child” energy rub off on the company?
