The First Fatal Self-Driving Car Crash Happened And I’m Sure There Will Be More To Come


The first of what will likely be hundreds of thousands of fatal self-driving car crashes has happened in Florida, Tesla recently confirmed.

The driver, cruising around in a Tesla Model S, was in semi-autonomous Autopilot mode when the crash occurred two months ago about 100 miles north of Orlando. Apparently, neither the car’s sensors nor the driver was able to identify a tractor trailer cutting across the highway. The Tesla was obliterated.

From The Verge:

The accident occurred on a divided highway in central Florida when a tractor trailer drove across the highway perpendicular to the Model S. Neither the driver — who Tesla notes is ultimately responsible for the vehicle’s actions, even with Autopilot on — nor the car noticed the big rig or the trailer “against a brightly lit sky” and brakes were not applied. In a tweet, Tesla CEO Elon Musk said that the vehicle’s radar didn’t help in this case because it “tunes out what looks like an overhead road sign to avoid false braking events.”

Because of the high ride-height of the trailer, as well as its positioning across the road, the Model S passed under the trailer and the first impact was between the windshield and the trailer. Tesla writes that if the car had impacted the front or rear of the trailer, even at high speed, the car’s safety systems “would likely have prevented serious injury as it has in numerous other similar incidents.”

I might just be a Luddite, but I think all of this automated technology is going to end up destroying mankind. The victim of the crash, Joshua Brown, a former Navy SEAL, was unable to detect an eighteen-wheeler? That tracks with what this self-driving auto expert at Volvo was saying about people’s unwavering trust in these deadly machines.

Some autonomous driving experts have criticized Tesla for introducing the Autopilot feature so early, with a Volvo engineer saying the system “gives you the impression that it’s doing more than it is.” In other words, the car handles most situations so smoothly that drivers are led to believe that the car can handle any situation it might encounter. That is not the case, and the driver must remain responsible for the actions of the vehicle, even with Autopilot active.

So they are autonomous cars that need someone to constantly monitor what is going on to avoid a collision. Oh cool, Tesla. You invented a slightly more complex version of cruise control. Bravo.

Tesla has done its best to shirk responsibility for this incident, claiming that all drivers are aware that self-driving cars are in a beta phase and that autonomous cars are still as safe as, or safer than, regular cars. Yeah, sure thing, buddy. They released a massive blog post on their website, titled “A Tragic Loss.”

We don’t need self-driving cars, you spoiled brats. Go buy a goddamn Nissan Altima and be happy you can get from point A to point B without dying of yellow fever like our ancestors.

[via The Verge]

Image via Hadrian /
