Software Defined Vehicles

3 Questions for… Salil Raje, SVP of AMD

“Edge computing is not a nice-to-have - it’s the only path forward”

3 min
AI at the Edge Can Pave the Way to an Autonomous Future: Raje's topic at this year's Automobil Elektronik Kongress in Ludwigsburg.

Compute power, platform strategy, edge intelligence - the future of autonomous mobility begins inside the vehicle. That’s the conviction of Salil Raje, SVP of AMD. Ahead of his talk at the Automobil Elektronik Kongress 2025, we put three key questions to him on these topics.

As Senior Vice President and General Manager of the Adaptive and Embedded Computing Group at AMD, Salil Raje oversees all aspects of strategy, business management and operations, business development, and engineering for FPGAs, adaptive SoCs, embedded processors, and custom platforms.

Raje joined AMD in 2022 from Xilinx, where he grew a nascent data centre business into one of the company’s fastest-growing segments. Over his 17-year tenure at Xilinx, he introduced the Vivado Design Suite of ASIC-class algorithms and user interfaces to FPGA designers. He also developed and launched Vitis, enabling seamless application deployment on AMD’s platform.

Raje holds a Bachelor of Technology in Electrical Engineering from the Indian Institute of Technology, Madras, and a Master of Science and doctorate in Computer Science from Northwestern University. He also serves on Carnegie Mellon University’s Electrical and Computer Engineering (ECE) Advisory Council, where he contributes strategic insights to shape educational, research, and outreach activities.

ADT: How does AI at the edge contribute to the future of autonomous vehicles? 

Raje: Edge computing isn’t just an advantage for autonomous vehicles - it’s a requirement. As cars evolve from passive transport machines into intelligent agents, edge AI becomes essential for three core reasons.

The first is reaction time. In a life-critical moment, like a child darting into the street, there’s no time to send data to the cloud and wait. Only edge AI can deliver the ultra-low latency needed for real-time decisions at motorway speeds.

The second is reliability and connectivity. AVs must operate safely in tunnels, rural areas, and dead zones. With AI processed on-vehicle, autonomy stays online - even when the network goes offline.

The third is sensor data. Modern vehicles generate more than 25 TB of sensor data per day. Edge computing transforms this firehose into real-time perception, classification, and prediction - without flooding the cloud or the data links.

The bottom line? We are entering an era where vehicles perceive, reason, and act independently. That transformation only happens when intelligence lives at the edge, inside the vehicle itself.
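The reaction-time argument can be made concrete with a back-of-envelope calculation: how far does a vehicle travel while waiting for a decision? This is a minimal sketch; the 100 ms cloud round trip and 10 ms on-vehicle inference latency are assumed round numbers for illustration, not measured AMD figures.

```python
# Illustrative latency budget: metres covered while waiting for a decision.
# Latency values below are assumptions for illustration, not benchmarks.

def distance_travelled_m(speed_kmh: float, latency_ms: float) -> float:
    """Metres covered at a given speed during a given decision latency."""
    speed_m_per_s = speed_kmh / 3.6          # km/h -> m/s
    return speed_m_per_s * (latency_ms / 1000.0)

speed_kmh = 130.0              # typical motorway speed
cloud_round_trip_ms = 100.0    # assumed cloud round trip incl. network
edge_inference_ms = 10.0       # assumed on-vehicle inference latency

print(f"Cloud: {distance_travelled_m(speed_kmh, cloud_round_trip_ms):.1f} m blind")
print(f"Edge:  {distance_travelled_m(speed_kmh, edge_inference_ms):.1f} m blind")
```

At 130 km/h the vehicle covers roughly 3.6 m during a 100 ms cloud round trip, versus well under half a metre with on-vehicle inference - the difference between braking before and after the hazard.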

What are the advantages of edge computing in automotive applications? 

Edge computing is not a nice-to-have - it’s the only path forward for safe, reliable, and efficient autonomous vehicles. Critical decisions must be made in milliseconds - not milliseconds plus round-trip latency to the cloud. Edge computing brings that intelligence into the vehicle, where it belongs.

From tunnels to rural roads, autonomous systems must function without a signal. AMD-powered edge platforms keep the vehicle fully capable regardless of network availability.

Edge AI also needs to be powerful and efficient. AMD’s adaptive computing delivers up to 3 times better performance per watt, enabling high-performance AI inference within the vehicle’s constrained thermal envelope.

Finally, vehicles generate up to 25 TB of data per day. Processing it locally reduces bandwidth and keeps sensitive user and sensor data secure and private. This not only protects the driver, it helps our customers meet growing regulatory demands.

With AMD, the vehicle becomes the data centre - compact, efficient, and fully autonomous. This is how we’re enabling the future of edge-native mobility.
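The bandwidth argument is easy to verify. Taking the 25 TB/day figure from the interview, a short calculation shows the sustained uplink a vehicle would need to ship everything to the cloud; the use of decimal units (1 TB = 10^12 bytes) is an assumption here.

```python
# Back-of-envelope: sustained uplink needed to stream all sensor data off-vehicle.
# 25 TB/day is the figure quoted in the interview; decimal TB is assumed.

def sustained_rate_gbit_s(tb_per_day: float) -> float:
    """Average uplink rate in Gbit/s for a given daily data volume."""
    bits_per_day = tb_per_day * 1e12 * 8     # TB -> bytes -> bits
    return bits_per_day / 86_400 / 1e9       # per second, in Gbit/s

print(f"{sustained_rate_gbit_s(25):.2f} Gbit/s")  # roughly 2.3 Gbit/s sustained
```

A sustained 2.3 Gbit/s per vehicle, around the clock, is far beyond what cellular uplinks reliably provide - which is why the raw sensor stream has to be reduced to perception results on the vehicle itself.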

How is AMD addressing the computational demands of modern vehicles?

The computational demands of modern vehicles are staggering - and growing exponentially. AI model complexity has increased 100-fold for vision systems and 275-fold for generative AI in just the past few years. We're addressing this challenge through a fundamentally different approach.

At the heart of our strategy is heterogeneous computing - the right processor for the right workload at the right time. Unlike traditional single-architecture approaches, we've developed domain-specific solutions that work together seamlessly. AMD’s Embedded+ architecture combines Ryzen x86 processors with Versal adaptive SoCs, creating a flexible platform that can handle everything from in-cabin experiences to advanced driver-assistance systems.

What makes this approach powerful is that we're delivering up to 3 times better performance per watt compared to traditional GPU-only solutions. This means vehicles can run sophisticated AI workloads without compromising range or thermal management.

But perhaps our most distinctive advantage is the seamless development experience we provide. Engineers can develop on Ryzen AI-powered workstations, simulate in virtual environments, and deploy to vehicles - all using the same tools, workflows, and XDNA AI Engine architecture. What works in the lab works in the car, dramatically accelerating time to market.

This isn't just about faster chips - it's about reimagining automotive computing from the ground up to enable the intelligent, software-defined vehicles of tomorrow.