Electric car maker Tesla hit back on Tuesday after a Twitter video made the rounds showing a driver apparently asleep at the wheel of his Tesla.
A spokesperson for the company responded to the incident by pointing to a rise in “dangerous hoaxes” appearing on social media.
For the record, this was no hoax, at least not on my part. Maybe the people in the car were faking being asleep, but I’m skeptical.
I just saw something weird on the highway and recorded it. Nothing revolutionary.
— Dakota Randall (@DakRandall) September 10, 2019
The story was picked up by NBC News after Dakota Randall, a Massachusetts man, filmed the episode on Interstate 90 in Newton. Randall maintains it wasn’t a hoax, at least not on his part.
The video appears to show both the driver and his passenger fast asleep in the middle of a Sunday afternoon. While no exact time was provided, the oddity of a midday nap lends some weight to Tesla’s claim that the event may have been an elaborate hoax.
Tesla Fans Come to the Aid of Autopilot
There are no grey areas when it comes to Tesla and its charismatic leader, Elon Musk. You either love them or hate them, which means only one thing: you have to pick a side.
In this case, that side was Tesla, as supporters of the embattled carmaker quickly came to its aid. According to one fanatic, Autopilot is designed to prevent drivers from falling asleep at the wheel.
Two videos posted by @tesla_truth and retweeted by Musk himself show cars slowing down and coming to a complete halt when their drivers failed to re-engage with the steering wheel.
Tesla AutoPilot does not allow drivers to sleep behind the wheel
If the driver does not respond to attention prompts, the car will play a sound to try and get their attention /wake them up
— Steve Jobs Ghost 👻 (@tesla_truth) September 10, 2019
What they didn’t show, however, was the alleged maneuver of pulling over to the curb. A Tesla stopped dead in the middle of a highway could itself be a hazard.
That said, in a medical emergency involving loss of consciousness, the benefit of Autopilot is abundantly clear.
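The escalation described in the tweets, a visual prompt, then an audible alert, then a gradual stop, can be sketched as a simple tiered check. To be clear, this is not Tesla’s actual Autopilot logic: the function name, the tiers, and the time thresholds are all invented for illustration of the reported behavior.

```python
# Hypothetical sketch of the driver-attention escalation described in the
# tweets. Tier names and the 15/30/45-second thresholds are assumptions,
# NOT Tesla's real parameters.

def attention_response(seconds_without_input: float) -> str:
    """Return the illustrative action for a span of driver inactivity."""
    if seconds_without_input < 15:
        return "none"            # driver considered attentive
    if seconds_without_input < 30:
        return "visual_prompt"   # on-screen nag to apply torque to the wheel
    if seconds_without_input < 45:
        return "audible_alert"   # chime played to wake/alert the driver
    return "slow_to_stop"        # gradually halt the car with hazards on

# Example escalation over a minute of no steering input:
for t in (10, 20, 40, 60):
    print(t, attention_response(t))
```

Whether the final tier stops the car in its lane or pulls it to the curb is exactly the detail the videos left out.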
In the meantime, the jury is still divided on AI-enabled driving. Literally: a recent report from the National Transportation Safety Board determined that both Autopilot and the driver were at fault in a 2018 California crash.
Twitterati Pounce on the Opportunity to Amuse
To be clear, drumming up attention on the back of a dangerous hoax is not a smart idea. But that didn’t stop the Twitterati from bringing some humor to the controversial situation.
One user speculated that AutoPilot could conveniently be used to transport victims away from the scene of a crime:
Maybe they’re dead and the killer programmed it to drive far away from the crime scene.
— Cheesy Delight (@cookiedrool) September 9, 2019
Another user quipped that it’s probably not a good idea to listen to religious podcasts while at the wheel:
He was probably listening to my pastor’s podcast…
— BitterBlueBetty (@BitterBlueBetty) September 9, 2019
And finally, one last tweeter suggested that the sleeping driver may have been even safer than the cameraman. Randall spent nearly a minute shooting the video while driving before leaving the slumbering occupants in his dust.
Last modified (UTC): September 10, 2019 5:33 PM