Enrico Salvatori, senior VP and president of Qualcomm Europe, tells AAVI about the company’s new advanced driving (AD) system, recently co-developed with BMW Group for the Neue Klasse platform
Snapdragon Ride in BMW’s Neue Klasse
The advanced driver assistance system (ADAS) in the all-new BMW iX3 is powered by Snapdragon Ride Pilot, which integrates high-performance, automotive-grade systems-on-chip (SoCs) for centralized sensor data processing, advanced computer vision modules for perception and the Snapdragon Ride Automated Driving software stack co-developed with BMW – including drive policy and safety guardrails.
The automated driving system of the BMW iX3 enables advanced capabilities, including contextual lane change and overtaking, where the system initiates maneuvers based on subtle driver cues such as mirror glances or steering nudges; active lane change and highway assistant, enabling hands-free driving on approved road networks; and ecosystem-provided features such as AI-powered slot-detection parking assistance and camera-based in-cabin monitoring.
BMW’s ‘superbrain of automated driving’ – a central intelligent computer powered by Snapdragon Ride SoCs – combines automated driving functions and offers 20 times the computing power of the previous generation. The system uses a unified architecture that includes an array of high-definition 8-megapixel and 3-megapixel cameras and radar sensors, enabling 360° coverage. Combined with high-definition mapping and precise GNSS localization, this provides a robust foundation for safe and reliable automated driving.
The BMW iX3 is also equipped with Qualcomm Technologies’ V2X 200 chipset to support vehicle-to-everything (V2X) communication for enhanced safety. V2X communication enables vehicles to ‘see’ and ‘hear’ beyond the line of sight of onboard ADAS sensors, helping to reduce collisions by uncovering unseen risks through direct communication between vehicles and their surroundings, such as road infrastructure, pedestrians and other road users.
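As a rough illustration of the beyond-line-of-sight idea, the minimal Python sketch below flags a V2X sender that is nearby but absent from the onboard sensor track list. The message fields, IDs and thresholds are illustrative assumptions, not the V2X 200 chipset’s actual interface.

```python
# Minimal sketch of how V2X can surface a road user that onboard sensors
# cannot yet see. Field names and thresholds are illustrative assumptions,
# not the actual V2X 200 chipset API.
import math
from dataclasses import dataclass

@dataclass
class V2XMessage:
    sender_id: str
    x_m: float        # position east of ego, meters
    y_m: float        # position north of ego, meters
    speed_mps: float
    heading_deg: float

def is_unseen_risk(msg: V2XMessage, tracked_ids: set[str],
                   alert_radius_m: float = 150.0) -> bool:
    """Flag a V2X sender that is close but absent from the sensor track list."""
    distance = math.hypot(msg.x_m, msg.y_m)
    return msg.sender_id not in tracked_ids and distance < alert_radius_m

# Example: a cyclist around a blind corner, not yet in the camera/radar tracks
cyclist = V2XMessage("vru-042", x_m=40.0, y_m=25.0, speed_mps=6.0, heading_deg=270.0)
print(is_unseen_risk(cyclist, tracked_ids={"veh-007", "ped-013"}))  # True
```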
Software stack, AI perception and system architecture
Following a three-year collaborative effort, in September 2025 Qualcomm and BMW introduced Snapdragon Ride Pilot, their new automated driving system. Built on Qualcomm’s Snapdragon Ride SoCs using the Snapdragon Ride AD software stack, and co-developed by both companies, the system supports AD levels ranging from entry-level New Car Assessment Program (NCAP) features to Level 2+ highway and urban navigation on autopilot (NOA).

Snapdragon Ride Pilot will make its global debut in the all-new BMW iX3, the first production vehicle in BMW’s Neue Klasse. It has been validated for use in more than 60 countries, with expected expansion to more than 100 countries in 2026.
The 36-month development of the Snapdragon Ride AD software stack in Snapdragon Ride Pilot required the input of more than 1,400 specialists from various locations, including Germany, the USA, Sweden, Romania and the BMW AD test center in the Czech Republic.
The software stack features a perception stack developed by Qualcomm Technologies and a drive policy engine co-developed with BMW. It uses a camera-based vision stack for object detection, surround view, lane recognition, traffic sign interpretation, parking assistance, driver monitoring and mapping.
Perception performance is enhanced through low-level perception using a bird’s-eye-view (BEV) architecture and new methods for extracting information from fish-eye cameras. Low-level fusion of camera and radar data is designed to reduce tracking latency, optimize system performance in active safety scenarios and improve detection at complex urban intersections. To improve computational efficiency, hardware/software co-design and network architecture search are applied to manage compute resources and memory bandwidth.
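To make the shared-representation idea concrete, here is a minimal Python sketch that accumulates camera and radar detections into one BEV grid. Production stacks fuse learned feature maps rather than point detections, and the grid size, weights and coordinates below are illustrative assumptions.

```python
# A minimal sketch of low-level camera/radar fusion into a shared
# bird's-eye-view (BEV) grid. Real stacks fuse learned feature maps;
# here we accumulate point detections to show the shared-grid idea.
import numpy as np

GRID_SIZE, CELL_M = 200, 0.5   # 100m x 100m grid, 0.5m cells, ego at center

def to_cell(x_m: float, y_m: float):
    """Map a metric ego-frame position to a BEV grid cell, or None if outside."""
    col = int(x_m / CELL_M) + GRID_SIZE // 2
    row = int(y_m / CELL_M) + GRID_SIZE // 2
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        return row, col
    return None

def fuse_to_bev(camera_pts, radar_pts) -> np.ndarray:
    """Accumulate camera and radar evidence into one BEV grid.

    Both modalities write into the same grid, so downstream tracking sees a
    single representation instead of separate per-sensor object lists."""
    bev = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.float32)
    for pts, weight in ((camera_pts, 0.6), (radar_pts, 0.4)):
        for x, y in pts:
            cell = to_cell(x, y)
            if cell is not None:
                bev[cell] += weight
    return bev

bev = fuse_to_bev(camera_pts=[(12.0, 3.5), (30.2, -1.0)],
                  radar_pts=[(12.3, 3.6), (55.0, 0.2)])
print(np.argwhere(bev > 0.9))  # cells where camera and radar agree
```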
The software stack uses a balance of rule-based and AI-based models for behavior prediction and behavior planning to help enable safe handling of complex driving scenarios.
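One common way to balance the two approaches, sketched below under assumed names and thresholds (this is not the co-developed drive policy itself), is to let a learned model score candidate maneuvers while deterministic rules veto any that violate hard safety constraints.

```python
# A minimal sketch of blending rule-based guardrails with an AI planner:
# a learned model proposes scored candidate maneuvers, and deterministic
# rules veto any that violate hard safety constraints. Names and
# thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    min_gap_m: float      # smallest predicted gap to other traffic
    ttc_s: float          # time-to-collision along the planned path
    score: float          # AI model's preference (higher is better)

def passes_guardrails(m: Maneuver) -> bool:
    """Hard rules that an AI-preferred maneuver must never violate."""
    return m.min_gap_m >= 2.0 and m.ttc_s >= 3.0

def select_maneuver(candidates: list[Maneuver]) -> Maneuver:
    """Pick the highest-scoring candidate that the rules allow."""
    safe = [m for m in candidates if passes_guardrails(m)]
    if not safe:
        # Fall back to a conservative default when nothing passes
        return Maneuver("keep_lane_and_slow", min_gap_m=99.0, ttc_s=99.0, score=0.0)
    return max(safe, key=lambda m: m.score)

plan = select_maneuver([
    Maneuver("overtake_left", min_gap_m=1.4, ttc_s=2.1, score=0.9),  # vetoed
    Maneuver("keep_lane",     min_gap_m=8.0, ttc_s=9.5, score=0.6),
])
print(plan.name)  # keep_lane
```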
Snapdragon Ride Pilot supports over-the-air (OTA) updates and is fully customizable via the Snapdragon Ride SDK, giving auto makers the flexibility to tailor solutions across vehicle segments. The software stack leverages fleet data to evolve and enable enhanced safety and comfort over the life of the vehicle.
Stack development and testing is supported by a data and simulation factory – a key component of Snapdragon Ride. This toolchain integrates real-world data with synthetic data generation and AI-based simulations to create a robust and diverse set of driving scenarios, enhancing the training and testing of automotive models, according to Enrico Salvatori, senior VP and president of Qualcomm Europe – who AAVI caught up with recently to find out more.
Interview with Enrico Salvatori
When did the project with BMW start and what was the brief/goal?
BMW and Qualcomm Technologies began co‑engineering the automated driving software stack in 2022, about three years before its global debut in the all-new BMW iX3 – the first production vehicle in BMW’s Neue Klasse – at IAA Mobility 2025.
Our shared vision was twofold: BMW was seeking a leading technology company to build a safety‑first, globally scalable ADAS/AD system that could be the foundation for its Neue Klasse platform and scale across regions and vehicle tiers; and we wanted to expand the accessibility of a user-centric, symbiotic ADAS/AD system to other auto makers and Tier 1 suppliers. The result is Snapdragon Ride Pilot, running on Snapdragon Ride SoCs and leveraging the complete Snapdragon Ride toolchain and ecosystem.
The system is validated for 60+ countries today and has a target of 100+ in 2026. A multi‑continent team of 1,400+ people between our companies worked to align BMW’s driving‑dynamics DNA with Qualcomm Technologies’ AI‑centric stack and silicon. Snapdragon Ride Pilot is now available to all global auto makers and Tier 1 suppliers through Qualcomm Technologies.
What are some of the core components and how does the system work?
The system combines Snapdragon Ride SoCs; a fifth-generation, API-rich vision perception application; and a co‑developed automated driving software stack to deliver a driver experience that is smart, safe and symbiotic. This architecture gives auto makers and Tier 1 suppliers flexibility to include a multitude of apps and build scalable solutions – either using their own drive policy or with a turnkey platform – while optimizing cost and time-to-market.
Perception and planning: The AI perception stack delivers 360° scene understanding, fusing camera vision with low-level radar perception to create a bird’s-eye view. This layered approach reduces latency and enables detection of lanes, traffic signals, pedestrians and vehicles. It also supports advanced functions such as parking assistance, driver monitoring and online mapping for complex urban scenarios.
Compute platform: Snapdragon Ride SoCs provide heterogeneous compute – CPU, GPU and NPU blocks optimized for power efficiency and minimal data movement – so sensor data can be processed in real time.
Intelligent driving: The stack combines rule-based and AI-based planning, evolving toward AI-driven approaches that handle massive sensor data streams in complex environments such as dense urban intersections and variable conditions. An AI planner interprets critical inputs dynamically, enabling context-aware driving without relying on pre-loaded HD maps, and processes relevant data rapidly to navigate complex environments.
Continuous improvement: A data and simulation factory evaluates and refines machine learning models using fleet data, allowing the driver assistance system to detect more objects and extend where it can operate. Generative AI augments drive data for rare scenarios, and new capabilities can be developed in the cloud and deployed via OTA updates, allowing vehicles to be dynamically configured throughout their lifecycle.
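The triage step of such a flywheel can be pictured as below: a minimal sketch, with hypothetical scenario tags and thresholds, of deciding which fleet events are rare or poorly handled enough to upload for cloud retraining.

```python
# A minimal sketch of the "data flywheel" idea: triage fleet events so rare
# or poorly handled scenarios are uploaded for retraining, while routine
# drives stay on the vehicle. Tags and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class DriveEvent:
    scenario_tag: str        # e.g. "unprotected_left_turn"
    model_confidence: float  # perception/planning confidence, 0..1
    disengaged: bool         # did the driver take over?

def should_upload(event: DriveEvent, seen_counts: dict[str, int],
                  rare_below: int = 50) -> bool:
    """Upload events that are rare in the training corpus, low-confidence,
    or ended in a driver takeover."""
    rare = seen_counts.get(event.scenario_tag, 0) < rare_below
    return rare or event.model_confidence < 0.5 or event.disengaged

corpus = {"highway_cruise": 120_000, "unprotected_left_turn": 12}
event = DriveEvent("unprotected_left_turn", model_confidence=0.8, disengaged=False)
print(should_upload(event, corpus))  # True: rare scenario, keep for retraining
```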
What sensors are used and where does the sensor fusion take place?
The system uses a multimodal sensing approach, with camera and radar as core sensors and lidar as an option per OEM or trim level. Through a unique approach to fusion and perception, it scales from a single camera plus multiple radars for entry-level safety to 11+ cameras and 5+ radars for high‑tier configurations, giving OEMs flexibility to define the exact configuration.
Sensor fusion occurs at low level on the Snapdragon Ride SoC, combining raw camera and radar data into a unified BEV model, which feeds prediction and planning for accurate scene understanding.
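A minimal sketch of how such scalable configurations might be expressed follows; the tier names and exact sensor counts are illustrative examples consistent with the ranges above, not BMW’s actual part lists.

```python
# A minimal sketch of how one stack can scale across sensor configurations,
# from entry-level (single camera plus radars) to high tier (11+ cameras,
# 5+ radars, optional lidar). Tier names and counts are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorSet:
    cameras: int
    radars: int
    lidar: bool  # optional, chosen per OEM or trim level

TIERS = {
    "entry_ncap":  SensorSet(cameras=1,  radars=3, lidar=False),
    "mid_highway": SensorSet(cameras=7,  radars=5, lidar=False),
    "high_urban":  SensorSet(cameras=11, radars=5, lidar=True),
}

def supports_surround_bev(cfg: SensorSet) -> bool:
    """Full 360-degree BEV fusion needs surround cameras and corner radars."""
    return cfg.cameras >= 7 and cfg.radars >= 5

for tier, cfg in TIERS.items():
    print(tier, "surround BEV:", supports_surround_bev(cfg))
```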
How have you reduced latency?
The system is designed for a major improvement over previous generations, with increased processing capability that BMW describes as up to 20x higher. One example is extracting features directly from fish-eye cameras into the BEV model, reducing tracking latency and improving performance in dense or complex environments.
What role does AI play in sensor data processing?
AI powers the entire stack:
Perception: The fifth‑gen AI perception stack – trained on more than 1,000,000 miles [1,600,000km] containing more than eight million scenarios across the globe – builds a robust 360° scene, detecting lanes, signs, pedestrians and vehicles, and fusing camera vision with radar for added resilience and accuracy.
Prediction and planning: AI models forecast the behavior of surrounding road users and generate smooth, human‑like trajectories that respect local road norms.
Data flywheel: A cloud‑based data pipeline with smart-ingest reprocesses fleet data, runs real and Gen AI-enhanced simulations, and delivers OTA updates to enhance system performance and extend where the system can operate.
What do you mean by context-aware driving?
Context‑aware means the system adapts driving behavior based on real-time scene understanding and situational context, rather than following pre-defined patterns. Examples include:
Driver interface: Leveraging context from the driver’s gaze and head position to inform the stack in lane-change situations – keeping drivers involved in the driving task without unnecessary alerts or overload (see the sketch after this list).
Urban intersections: Using the BEV model with radar reinforcement to smoothly and safely navigate complex intersections without lane markings.
Highway navigation: Layering map routing with traffic flow and performing driver-approved – and, where allowed, system-initiated – lane changes or merges, with comfort tuned to user preferences.
Changing conditions: Factoring in local driving conventions and changing road conditions, such as construction zones or variable speed limits, for a natural, brand-aligned experience.
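Returning to the driver interface example above, here is a minimal sketch of gaze-as-context: a mirror glance combined with a light steering nudge toward the same side is treated as a lane-change cue. Signal names and thresholds are assumptions, not the production driver-interface logic.

```python
# A minimal sketch of using driver gaze as context for lane-change intent:
# a mirror glance plus a steering nudge toward the same side is treated as
# a cue to propose the maneuver. Signal names and thresholds are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_target: str        # e.g. "left_mirror", "road_ahead"
    steer_torque_nm: float  # signed; negative = toward the left

def lane_change_intent(state: DriverState, side: str = "left") -> bool:
    """Combine gaze and a light steering nudge into a lane-change cue."""
    glance = state.gaze_target == f"{side}_mirror"
    if side == "left":
        nudge = state.steer_torque_nm < -0.5
    else:
        nudge = state.steer_torque_nm > 0.5
    return glance and nudge

print(lane_change_intent(DriverState("left_mirror", steer_torque_nm=-0.8)))  # True
print(lane_change_intent(DriverState("road_ahead",  steer_torque_nm=-0.8)))  # False
```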
How do you ensure safety and compliance with standards?
Safety is engineered bottom up and end to end. Snapdragon Ride Pilot prioritizes compliance with Automotive Safety Integrity Levels (ASIL) and functional safety (FuSa) standards, supports ISO 26262, and addresses Safety of the Intended Functionality (SOTIF). It meets global safety regulations, including NCAP and DCAS protocols, and incorporates robust cybersecurity measures with multilayered encryption and threat detection. Validated in 60+ countries, with plans for 100+ by the end of 2026, the platform scales from entry-level safety features to advanced highway and urban driving.
Snapdragon Ride Assist: Supports mainstream safety features to help auto makers achieve NCAP 5-star safety ratings with capabilities like automatic emergency braking, lane centering and adaptive cruise control.
Snapdragon Ride Pilot: Extends functionality to hands-free, eyes-on navigation on approved highways at speeds up to 85mph [137km/h], adapting speed and trajectory for lane changes and merges, and handling complex urban conditions such as intersections, stop-and-go traffic and yielding right of way.
How has BMW helped shape development – and what were its key concerns?
BMW shaped the stack to reflect its hallmarks of user-centricity, comfort and trust – prioritizing safety, smooth control and global readiness. Co‑engineering spanned Germany, the USA, Sweden, Romania and BMW’s AD test center in the Czech Republic, aligning algorithms, calibration and HMI with BMW’s brand philosophy of being ‘smart, symbiotic and safe’. BMW valued driver feedback to ensure the new platform delivers distinct features and cutting-edge technology for driving enthusiasts.
What new functions or features does the system enable in the iX3?
The BMW iX3 marks the global debut of the co-developed stack, now available to other auto makers and Tier 1s. Depending on regional regulations and OEM configurations, capabilities include:
Navigation on pilot (NOP): NOP enables the vehicle to automatically follow routes to destinations, handling driving tasks such as hands-free lane keeping, lane changes and adapting speed based on traffic conditions while keeping the driver engaged and informed.
Contextual lane change and overtaking: Initiating maneuvers based on subtle driver cues like mirror glances or steering nudges, for a more natural driving experience.
Third-party applications: An open platform that supports AI-powered slot detection for parking assistance and camera-based in-cabin monitoring.
How do you plan to develop the system further?
The availability of Snapdragon Ride Pilot to all global auto makers and Tier 1s, not just BMW, enables the stack to scale globally. Data from fleets will continuously feed the evolution of the stack toward larger operational design domains (ODDs). As the partnership moves forward, discussions on next-generation user experiences are in full focus, and decomposing them into technology needs is something we will do in collaboration.
This article was first published in the January 2026 edition of ADAS & Autonomous Vehicle International

