What the Tesla fatal crash tells us about the future of autonomous driving
Joshua Brown was a proud Tesla owner and keen technologist. He died in a Tesla, but perhaps not quite behind the wheel.
You only need to look at the size of the touchscreen in any Tesla product to know the company is all about pushing technological frontiers. Aside from electric-only power, Tesla are also leading the way in autonomous driving technology with their suite of tech called Autopilot. Despite the name, Tesla are careful to say the technology is only a driving aid, not a replacement for the driver. There are several components to Autopilot:
Autosteer
Here’s what Tesla says:
Autosteer keeps the car in the current lane and engages Traffic-Aware Cruise Control to maintain the car’s speed. Using a variety of measures including steering angle, steering rate and speed to determine the appropriate operation AutoSteer assists the driver on the road, making the driving experience easier.
This is otherwise known as adaptive cruise control plus lane keep assist, albeit versions designed more for autonomous driving. Most lane keep assist features merely warn the driver and do very little to keep the car in its lane. Autosteer is in beta, which means it is not quite ready for widespread use but good enough to be tested by members of the public who are interested.
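To make the idea concrete, here is a minimal sketch of the kind of lateral control loop a lane keep system runs. It is illustrative only: the function name, inputs and gains are assumptions for this post, not Tesla's implementation, which fuses camera, radar and ultrasonic data.

```python
# Minimal sketch of a lane-keep-assist control step. Names and gains
# are invented for illustration; a real system is far more involved.

def lane_keep_step(lane_offset_m: float, heading_error_rad: float,
                   speed_mps: float) -> float:
    """Return a corrective steering angle (rad) from lane position.

    lane_offset_m:     lateral distance from lane centre (+ = right)
    heading_error_rad: angle between car heading and lane direction
    speed_mps:         current speed; corrections soften as it rises
    """
    K_OFFSET = 0.1    # proportional gain on lateral offset
    K_HEADING = 0.5   # proportional gain on heading error
    # Damp the correction at freeway speeds so the car doesn't twitch.
    speed_factor = 1.0 / max(speed_mps, 1.0)
    # Positive offset (drifted right) produces a negative (leftward)
    # steering command, and vice versa.
    steer = -(K_OFFSET * lane_offset_m + K_HEADING * heading_error_rad)
    return steer * speed_factor
```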
Auto lane change
This is not common on other cars. All the driver has to do is indicate, and the Tesla will move itself into the required lane when it judges it safe to do so.
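The core of any such feature is a safety check on the gap in the target lane. A toy version might look like the following; the thresholds and names are invented for illustration, not drawn from Tesla.

```python
# Illustrative gap check for an automated lane change. Thresholds and
# function names are assumptions, not Tesla's actual decision rules.

def is_gap_safe(rear_gap_m: float, front_gap_m: float,
                closing_speed_mps: float,
                min_time_gap_s: float = 2.0) -> bool:
    """Decide whether the adjacent lane has room for a lane change.

    rear_gap_m:        distance to the vehicle behind in target lane
    front_gap_m:       distance to the vehicle ahead in target lane
    closing_speed_mps: how fast the rear vehicle is gaining on us
                       (positive = it is catching up)
    """
    # Never change lanes into a gap smaller than a hard floor.
    if rear_gap_m < 10.0 or front_gap_m < 10.0:
        return False
    # If the rear car is closing, it must not reach us for at least
    # min_time_gap_s seconds at the current closing rate.
    if closing_speed_mps > 0 and rear_gap_m / closing_speed_mps < min_time_gap_s:
        return False
    return True
```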
Automatic Emergency Steering & Side Collision Warning
Another technology not often seen elsewhere. This system alerts the driver to potential collisions, and can even take avoiding action.
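Warnings like this typically hinge on time-to-collision: the range to the object divided by the closing speed. A simple sketch, with a threshold and names that are assumptions rather than Tesla's actual logic:

```python
# Sketch of a time-to-collision (TTC) check of the kind a collision
# warning system might run. Purely illustrative.

def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if nothing changes; inf if the gap is opening."""
    if closing_speed_mps <= 0:
        return float("inf")
    return range_m / closing_speed_mps

def should_warn(range_m: float, closing_speed_mps: float,
                warn_threshold_s: float = 2.5) -> bool:
    """Raise a driver alert when the TTC drops below the threshold."""
    return time_to_collision(range_m, closing_speed_mps) < warn_threshold_s
```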
Autopilot in practice
So that’s Autopilot – technology designed to take the human out of the driving task when cruising on the freeway. Tesla owners have posted many Autopilot videos, and here’s one:
You get the idea. Here’s how good the collision avoidance is:
Joshua Brown’s death
Joshua Brown died in a Tesla while using Autopilot. Brown was a very keen Tesla owner, an ex-US Navy SEAL and the owner of a technology consulting firm. He posted numerous videos to YouTube showing how his Tesla worked; the title image for this post is taken from one, and the video above is also his.
On May 7th 2016 he was travelling on a freeway in Florida when a truck turned left across his path (the USA drives on the right) and his car passed under the truck’s trailer, which struck the windshield. Apparently Brown was watching a movie at the time.
The diagram from the Florida Highway Patrol shows what happened:
Tesla released an official statement which included these words:
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
That’s what happened. The big question is this:
What does this crash mean for the future of autonomous driving?
There will be lots of people questioning the technology, but that will quickly fade away and will not appreciably slow the inevitable replacement of human drivers by computers. Once invented, technology cannot be un-invented – and why should it be, when it offers benefits such as greater safety and efficiency?
This crash is notable only because it was a Tesla, and one on autopilot. Had it been a normal car with a human at the controls it would barely have made the local news. Instead, it’s a global story.
An argument can be made that an alert human would have spotted the truck waiting to turn left, and that is entirely possible. However, it’s also entirely possible a human wouldn’t have noticed the truck – and in fact Brown either did not notice, or did and couldn’t react in time. Or maybe the truck was at fault, leaving insufficient time for the car to slow down after the truck began its turn, regardless of who, or what, was in control.
Tesla make a fair point too. The circumstances of the accident were very unusual; it’s extremely rare for a car to impact an object with its windshield first, which means the front of the car cannot cushion the blow.
It is also interesting to speculate about the outcome of the crash had the truck been fitted with advanced tech that could pick a gap in the traffic, or warn oncoming vehicles. Or if the vehicles ahead of the Tesla could have sensed the situation and broadcast the data for following vehicles to act on. There are more arguments for increasing automation in response to this crash than there are for reducing it.
That said, we are now at a dangerous phase in the development of autonomous driving technology. The tech is good enough to work most of the time, as evidenced by the videos above. But it’s not yet good enough that humans can be entirely taken out of the driving role. This is dangerous because the tech will lull people into a false sense of security, relying on the computers 99 times until that 100th time the tech doesn’t do its job.
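Whether that 100th failure makes the combination a net danger is ultimately arithmetic. A back-of-envelope illustration, with entirely made-up numbers:

```python
# Back-of-envelope illustration of the "99 times out of 100" point.
# All figures here are invented purely for the sake of the arithmetic.

hazards_per_1000_km = 1.0        # interventions needed per 1,000 km
p_auto_handles = 0.99            # automation resolves 99% of them
p_alert_human_catches = 0.95     # attentive human catches 95%
p_lulled_human_catches = 0.20    # lulled human catches only 20%

# Hazards missed per 1,000 km in each configuration:
human_alone = hazards_per_1000_km * (1 - p_alert_human_catches)
auto_plus_lulled = (hazards_per_1000_km
                    * (1 - p_auto_handles)
                    * (1 - p_lulled_human_catches))

print(f"Alert human alone:         {human_alone:.3f} missed per 1,000 km")
print(f"Automation + lulled human: {auto_plus_lulled:.3f} missed per 1,000 km")
```

With these invented numbers the automation-plus-lulled-human combination still misses far fewer hazards than an alert human alone – which is exactly the “better overall” standard discussed next.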
Autonomous cars do not need to be perfect, they just need to be better overall than humans. In some ways, they already are. That means in one or two situations they might be worse – and those are the situations everyone will hear about, not the many more times the tech worked better than a human.
Autonomous tech is the way of the future, because humans aren’t going to evolve improvements any time soon, whereas technology improves on almost a monthly basis.
Related posts
- Should you worry about car hacking?
- Why modern cars are hard to use
- Did a usability problem kill Anton Yelchin?
- Why the TAC’s message on AEB is misleading
Comments
I hope there will be fairly comprehensive data collected on these systems. Otherwise the only thing the average person will focus on is fatalities reported in the media. Even with additional data, there will be many who will only focus on fatalities. Most people seem oblivious to the fact that they have internalized acceptable risk in many activities, a big one being driving.
Excellent point, Vatcha. Automated systems very much open the door to collecting data on successful collision avoidance, whereas at the moment the data is really only about fatal accidents.
And the logical extension of that is cheaper insurance for cars with the system. That might help convince the public of its safety merits.
If the brakes weren’t applied, it suggests inattention by the licensed driver, or at best hesitation while waiting for the vehicle systems to pick up the obstacle.
It follows that autonomous systems will lead to less attention from “drivers”.
Exactly right. The problem is that the systems aren’t quite ready to do everything without humans, at least for the moment.
The point I was making was that humans aren’t capable of cognitively and reasonably managing risk in anything up to the point of full autonomy.
Until autonomy fails in only the smallest subset of unfortunate, unavoidable instances, it needs less marketing and less trust, and more eyes out front, hands on the wheel and feet at the ready. Once people become comfortable, human nature has it that they switch off.
As with all the valid points about texting and phones whilst driving, a moment’s distraction is a killer. Semi-autonomy is thus practically unworkable.
It seems strange that there wasn’t some form of underrun protection on the side of the truck to allow the crumple zone of the car to do its job upon impact.
Rules requiring rear drop-down bars were brought in after Jayne Mansfield died rear-ending a truck. There aren’t similar rules for side impacts because such crashes are so rare. Maybe there should be.
Why didn’t the truck have side bars on his unit? In Aussie it’s mandatory on all trucks. Like they say, it’s very rare for a car to hit the side of a truck – that’s why I am glad it’s law down under!