Autonomous Driving Systems

Robotaxi services from 2027

How Nvidia is turning AI into road-ready autonomy

In the new CLA, Mercedes-Benz deploys substantial Nvidia computing power to enable advanced ADAS functionalities.

Nvidia is moving beyond chips and simulations to position itself as a central force in autonomous mobility. A robotaxi service is planned for 2027, while passenger vehicles equipped with the platform are expected to follow from 2028.

Nvidia is pushing deeper into autonomous mobility, aiming to translate its dominance in artificial intelligence into large-scale, real-world deployment on public roads. CEO Jensen Huang outlined a roadmap that goes well beyond supplying computing hardware: Nvidia intends to help launch a robotaxi service as early as 2027, followed by the introduction of AI-driven systems in privately owned vehicles between 2028 and 2030.

According to the company, the core ambition is to enable vehicles that do more than detect objects. Instead, AI systems are designed to interpret traffic situations in context, anticipate potential developments and make decisions in a way that resembles human reasoning. Nvidia sees this capability as a prerequisite for scaling autonomous driving safely and economically.

From demonstration vehicles to urban deployment

Ahead of the 2026 Consumer Electronics Show (CES) in Las Vegas, Nvidia showcased the current maturity of its technology together with Mercedes-Benz. A production version of the new Mercedes CLA navigated public roads in San Francisco, following traffic rules, responding to traffic lights and signs, and interacting with pedestrians. During a test drive of roughly 45 minutes, a safety driver intervened only in a handful of situations.

The vehicle relied on a sensor configuration combining ten cameras and five radar units. For future robotaxi applications, Nvidia plans to extend this setup with lidar sensors to improve environmental perception in dense urban scenarios. This multi-sensor strategy stands in contrast to approaches that rely exclusively on cameras, underlining Nvidia’s focus on redundancy and robustness rather than minimal hardware.
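To picture the redundancy argument, here is a minimal Python sketch of such a multi-modal sensor suite. The class and field names are assumptions made for illustration only, not Nvidia's actual software; the sensor counts simply mirror the test vehicle described above.

```python
from dataclasses import dataclass

# Hypothetical model of a redundant multi-sensor suite (illustration only).
@dataclass
class Sensor:
    modality: str   # "camera", "radar" or "lidar"
    mount: str      # mounting position label on the vehicle

def coverage_by_modality(suite: list[Sensor]) -> dict[str, int]:
    """Count sensors per modality to show cross-modal redundancy."""
    counts: dict[str, int] = {}
    for sensor in suite:
        counts[sensor.modality] = counts.get(sensor.modality, 0) + 1
    return counts

# Test-vehicle setup as reported: ten cameras and five radar units.
suite = [Sensor("camera", f"cam_{i}") for i in range(10)] + \
        [Sensor("radar", f"radar_{i}") for i in range(5)]

print(coverage_by_modality(suite))  # {'camera': 10, 'radar': 5}
```

A robotaxi variant would simply extend the same suite with lidar entries, which is the contrast Nvidia draws against camera-only approaches: losing one modality still leaves overlapping coverage from the others.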

Intensifying competition in the robotaxi market

The race to deploy driverless services is accelerating. Google subsidiary Waymo currently operates around 2,500 fully autonomous vehicles across several US cities, making it the most advanced player in commercial robotaxi operations. Its position has been further underlined since 2025, when Waymo began expanding its technology partnerships, including a preliminary agreement with Toyota Motor Corporation to explore closer collaboration on accelerating autonomous driving development. The move signalled Waymo’s intention to scale its technology beyond proprietary fleets and into broader automotive ecosystems.

At CES 2026, additional competitors signalled their ambitions: Uber presented electric vehicles from Lucid that are scheduled to enter robotaxi service around San Francisco later this year, running autonomous driving software from Nuro. In Las Vegas itself, vehicles from Amazon-owned Zoox — designed without steering wheels or pedals — are already operating on public roads.

Against this backdrop, Nvidia’s strategy is not to become a vehicle operator, but to position its AI stack as a foundational layer that can be adopted by multiple manufacturers and service providers. The company expects this platform approach to accelerate adoption across different markets and vehicle segments.

Mercedes-Benz builds on Nvidia for US rollout

Mercedes-Benz is among the manufacturers placing early bets on Nvidia's autonomous driving ecosystem. At CES, the German carmaker highlighted its MB.Drive system, which integrates Nvidia's Drive AV full-stack software and the Drive AGX accelerated computing platform.

The resulting system, branded MB.Drive Assist Pro, is already available in China and is scheduled to launch in the United States later this year. Operating at Level 2, it supports assisted driving from parking space to destination while allowing drivers to intervene at any time through a cooperative steering concept. The system processes data from around 30 sensors, including cameras, radar and ultrasonic units, and is positioned as a stepping stone toward higher levels of automation.
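The cooperative steering concept can be pictured with a small sketch: the assistance system proposes a steering command, but meaningful driver input takes precedence immediately. The function below is a hypothetical illustration under that assumption, not MB.Drive's actual control logic; the override threshold value is invented for the example.

```python
def arbitrate_steering(system_torque_nm: float,
                       driver_torque_nm: float,
                       override_threshold_nm: float = 2.0) -> float:
    """Blend assisted steering with driver input (illustrative only).

    If the driver applies torque above the threshold, the driver's input
    wins immediately; otherwise the system's command passes through.
    """
    if abs(driver_torque_nm) >= override_threshold_nm:
        return driver_torque_nm   # driver can take over at any time
    return system_torque_nm       # assisted driving continues

# Example: the driver nudges the wheel, so the system command is set aside.
print(arbitrate_steering(system_torque_nm=1.5, driver_torque_nm=3.0))  # 3.0
```

The point of such an arbitration step is that the human remains the fallback at Level 2, which is why the article describes the system as a stepping stone rather than full automation.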

With robotaxi services targeted for 2027 and consumer vehicles to follow shortly after, Nvidia is signalling a clear shift: from enabling autonomy in laboratories and pilot projects to embedding AI-driven decision-making into everyday mobility. Whether this timeline holds will depend not only on technology, but also on regulation, infrastructure and public acceptance — yet the momentum behind AI-powered driving is clearly building.