The US federal agency responsible for road safety has announced an investigation into Tesla and its self-driving claims, bringing the challenges of autonomous vehicle adoption into the spotlight.
The National Highway Traffic Safety Administration (NHTSA) has identified 11 crashes of particular concern, in which the vehicle was operating in Autopilot mode prior to the collision and the driver was, in some cases, completely distracted from the task of driving.
Tesla’s vehicles boast ‘Full Self-Driving’ (FSD), but current regulations do not allow fully autonomous vehicles on the road.
The term Autonomous Vehicle (AV) conjures images of cars driving themselves, devoid of human input. But this is not yet the reality in the automotive industry. The Society of Automotive Engineers (SAE) has defined six levels of driving automation, and the barrier to entry is surprisingly low.
Level 1 vehicles simply need to automate one part of driving, such as maintaining a set speed, which can be achieved with cruise control.
Level 2 vehicles must be able to assist with two aspects of driving; this is typically satisfied with adaptive cruise control and lane assist systems.
It is not until we get to level 4 that the sci-fi fantasy of a self-driving car is achieved.
So, where does Tesla sit on this scale?
Tesla Autopilot is a level 2 system. In other words, it is not a self-driving car, and a human always needs to pay attention to the road.
This is not to say that Tesla Autopilot is a bad system. In fact, according to Consumer Reports, it is the second-best Advanced Driver Assistance System (ADAS) on the market, with GM’s Super Cruise taking the top spot.
This is reflected in videos of Tesla’s system in action: it handles traffic superbly and can follow the sat nav. But it is not driving the vehicle; a human is. In videos where the driver is asleep or has left the driver’s seat, they are abusing the system and breaking the law.
ADAS is a much better description of the automated vehicles currently on the market. No vehicle currently on sale has anything more than driver assistance features; the driver is always in control.
The problem for Tesla is that its system has been catastrophically over-marketed and oversold, from the ‘Full Self Driving’ label itself to the exaggerated language CEO Elon Musk uses when describing it.
Musk has been talking about the ‘Full Self Driving’ upgrade on Tesla vehicles since 2018, and after multiple revisions, upgrades, and hype cycles, some confused customers may believe they have bought a self-driving autonomous vehicle when they have not.
Camera-only system: Is Tesla wrong?
Another part of the problem is in Tesla’s camera-only sensor suite. While other manufacturers rely heavily on cameras, radars, and LiDARs, Tesla is pursuing a camera-only approach, and has even gone as far as removing the radar from Model 3 and Model Y vehicles (as of May 2021).
The justification is that humans drive using their eyes, and a camera suite captures the same information; therefore, a car should be able to drive with cameras alone. A camera suite also has greater potential than a human driver: it provides a wider perspective around the vehicle, and thermal cameras can do a better job at night.
The problem is that cameras do not have the same capabilities as radar. While cameras can infer range and velocity through AI, radar measures both intrinsically and is almost completely unaffected by poor weather, darkness, and direct sunlight, all of which challenge a camera.
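The distinction can be made concrete with a little physics. A radar converts the Doppler shift of the returned signal directly into radial velocity, with no learned model in the loop. The sketch below is purely illustrative (the numbers are hypothetical, not from any particular vendor's sensor), using the standard round-trip Doppler relation:

```python
# Illustrative sketch: radial velocity from a radar's measured Doppler shift.
# The 77 GHz carrier is typical of automotive radar; the 5 kHz shift is a
# hypothetical example value, not data from any real sensor.

C = 3.0e8  # speed of light, m/s


def radar_radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """v = f_d * c / (2 * f_c); the factor of 2 reflects the round trip
    of the transmitted signal to the target and back."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)


# A 77 GHz automotive radar observing a 5 kHz Doppler shift:
v = radar_radial_velocity(5_000, 77e9)
print(f"{v:.2f} m/s")  # about 9.74 m/s, roughly 35 km/h
```

A camera, by contrast, has to estimate the same quantity indirectly, by tracking how a detected object's apparent position changes across frames, which is where the AI inference (and its failure modes in darkness and glare) comes in.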
Tesla can certainly exceed human driving performance with a camera-only suite, but IDTechEx believes it has limited its potential by disregarding radar and has put itself at a competitive disadvantage. Other manufacturers will also bring LiDAR into their ADAS sensor suites, which has some benefits over radar but also some drawbacks.
Radar and LiDAR become especially important at night, when camera systems perform far below their peak potential. This is crucial for ADAS because most pedestrian fatalities happen at night. The NHTSA also noted that most of the incidents it is investigating involving Tesla’s Autopilot happened in dark conditions.
It might be time for Tesla to reconsider its removal of radar, especially as newer, more sophisticated radars are emerging, such as those from Continental, Arbe, and Metawave. These next-generation radars might address the problems Tesla has had with radar in the past, giving it superior night-time performance while supporting its camera system during the day. It would also help Tesla be ready for when regulations change and Level 3 operation is allowed, as it already is in Japan.
Tesla Autopilot and FSD are not bad systems. They are among the best ADAS systems on the market. It is unlikely that the US government and the NHTSA will find anything untoward or dangerous within these systems. The problem lies entirely with a dangerous misinterpretation of the system’s capabilities.
Until there is a clear message that these are not full self-driving vehicles, and that the driver is very much responsible for driving the vehicle, the incidents under investigation will continue to damage the perception of automated vehicles and, in the worst case, cost lives.