Autonomous Driving Systems
Camera-only strategy questioned
NHTSA probes Tesla Full Self-Driving over safety concerns
US regulators are intensifying their investigation into Tesla’s driver assistance system, as crash data raises concerns about the reliability of camera-based autonomy in poor visibility.
The US National Highway Traffic Safety Administration
(NHTSA) is expanding its investigation into Tesla’s Full Self-Driving (FSD)
system following a series of accidents. Initial findings suggest that the
system may struggle to operate reliably under low-visibility conditions.
According to the agency, crash data indicates that the
system did not adequately recognise situations in which its cameras were
impaired — for example by glare or airborne particles — and failed to warn
drivers in time.
time. This raises concerns about whether camera-only sensing can ensure
sufficient safety in real-world environments.
Tesla’s approach, strongly advocated by CEO Elon Musk,
relies exclusively on cameras to enable autonomous
driving. The underlying assumption is that a vision-based system can
replicate — or even outperform — human perception without requiring additional
sensor types.
How Tesla’s strategy differs from competitors
Tesla’s strategy stands in contrast
to most other developers of automated driving systems. Companies such as
Waymo combine cameras with radar and LiDAR technologies
to create redundant sensing layers, improving reliability in complex or
degraded conditions.
If Tesla’s approach proves viable, it could offer a
significant cost advantage by avoiding expensive sensor hardware. However, the
latest regulatory scrutiny highlights the trade-off between cost efficiency and
system robustness.
Tesla FSD Investigation: Key Facts
- Authority: NHTSA (US National Highway Traffic Safety Administration)
- System: Tesla Full Self-Driving (FSD)
- Status: Driver assistance system (not fully autonomous)
- Core issue: Performance under poor visibility conditions
- Sensor strategy: Camera-only approach
- Key concern: Failure to detect sensor limitations and warn drivers
- Affected vehicles: Tesla models since 2016
- Competitors: Use multi-sensor setups (camera, radar, LiDAR)
- Strategic trade-off: Lower cost vs. reduced redundancy
- Next step: Ongoing regulatory investigation
Despite its name, Tesla’s FSD system remains a driver assistance
feature: drivers are required to stay attentive and retain control of the
vehicle at all times. A more advanced, unsupervised version is still limited
to beta testing in the United States.
The NHTSA investigation covers Tesla vehicles produced since
2016 and is expected to further assess whether the system can reliably detect
its own limitations — a key requirement for higher levels of vehicle autonomy.