Tesla’s Self-Driving Technology Fails to Detect Children on the Road, Tests Find
A new set of tests on Tesla’s Autopilot software shows that it still lacks key capabilities needed to be considered self-driving. The software failed to detect a child in the road and responded poorly in several other scenarios.
Testing by AAA found that Autopilot still has trouble seeing pedestrians, particularly at dusk or when they approach from a side angle. In tests that required detecting a car ahead, the system struggled on hilly roads with surrounding trees. According to one test, it also took control away from drivers at an alarming rate: every 700 meters on average. When its sensors detect an object, roughly a second passes between the system registering it as something needing attention and any action it might take, which is longer than many human reaction times. In light of this evidence, Elon Musk announced that Tesla would make significant improvements to its Autosteer feature over the next few weeks. The update is intended to address these issues, although we won’t know how effective it is until it is released.
Autopilot Is Not a Self-Driving System
Tesla’s first-generation Autopilot has been marketed as a self-driving system. While it has some autonomous features, it is not a self-driving system and never has been. Tesla owners need to understand that the system is not designed to keep them safe on its own and that drivers must always remain attentive. With the first generation of hardware, Autopilot was a driver-assistance system that could take control of the car under certain conditions. However, the system is limited in many ways: it cannot fully drive the car on its own, it cannot handle all types of roads or all weather conditions, and it is not a replacement for attentive driver oversight.

As the company began to roll out its second-generation hardware, designed to support Level 5 autonomy, it retained the “Autopilot” name, which has led many people to believe the feature offers the same autonomy as other companies’ self-driving systems. The second-generation hardware may have the potential to support a self-driving system once the software is ready, but its current capabilities are not self-driving.
Tesla’s Self-Driving Software Fails to Detect People with Disabilities
One of the most troubling results from the new round of tests is that Tesla’s software failed to detect a person exercising on a stationary bicycle in the road ahead of the car. In previous tests, Tesla’s software could not see a person in a wheelchair on the sidewalk, though it did detect a person on a bicycle. People in wheelchairs are less able to move out of a car’s path than people on bikes, and a person on a stationary bicycle is even less able to get out of the way. The failure to detect a person in a wheelchair is concerning, as it could result in injuries or even fatalities. The failure to detect a stationary person on a bicycle is also notable, though that person is less likely to be seriously injured if hit by a car.
Tesla’s Software Has Trouble with Hills and Trees
Tesla’s software had trouble with hills and surrounding trees in a few scenarios. In one scenario, the car was on a hilly road with a tree about 70 feet ahead. As the vehicle approached the tree, it failed to slow down and instead drove straight into it, ending up about 15 feet in the air. In another scenario, the car was on a hilly road with a tree about 80 feet ahead and to the right of the lane. The car was traveling at about 25 miles per hour and was supposed to slow down to avoid the tree. Instead, it failed to slow down, swerved to the left, and still hit the tree. While these scenarios caused significant damage to the car, the outcomes could have been far worse. If the car had been traveling at a higher speed, it could have struck the tree and kept going, hitting other trees and possibly a house or a pedestrian. With Autopilot in control, the car would be unlikely to stop before hitting something else.
Autopilot Takes Control at Unsafe Times
According to one test, Autopilot took control away from drivers at an alarming rate: every 700 meters on average. This is particularly concerning because the drivers completed the driving task successfully about 90% of the time, yet Autopilot still took over roughly 10% of the time. The data suggests that Autopilot is regularly taking control away from drivers who could reasonably be expected to drive safely. Some of these events may simply be cases where Autopilot incorrectly judged that the driver could not control the vehicle, but the sheer number of them suggests that many involve Autopilot taking control at an unsafe moment.
Conclusions
These tests are a good reminder that self-driving technology is still in its early days, despite many companies’ claims that they have self-driving cars on the road today. Tesla’s Autopilot is not ready to drive without human oversight, and the company should make its current capabilities clear to owners. Autopilot has significant issues, and these new tests show that it is far from being a safe and mature self-driving system. Although the software is likely to improve over time, it is unlikely to be ready to drive without human oversight anytime soon. Self-driving technology promises to make roads safer, but we are a long way from that future.