AAVI spoke with Matt Daley, technical director at rFpro, about Sim4CAMSens2, a project advancing automotive sensor simulation by extending high-fidelity modeling from exterior-facing to interior-facing systems.
What is Sim4CAMSens2, and how does it build on the first project?

Sim4CAMSens2 is a collaborative research project focused on advancing the development and validation of automotive sensor systems through simulation. The preceding project concentrated on exterior-facing sensors, and a key commercial outcome was the integration of these developments into rFpro’s AV Elevate platform. Ultimately, this enables the industry to model sensors more accurately and accelerate development timelines. Sim4CAMSens2 continues and extends this work to interior-facing sensors, which are becoming critical for both safety and autonomous vehicle readiness. rFpro’s main role is to enhance its simulation platform to model and test in-cabin perception systems with greater fidelity, including developing infrared camera models and highly detailed digital twins of vehicle interiors.
Alongside this, rFpro will work closely with other project partners to gain a deeper understanding of which sensor output noise factors affect perception system performance, and to define the most efficient real-world sensor characterization tests needed to parameterize their simulated equivalents.
How should synthetic data be used in the development process?

Synthetic data plays a vital role in three key areas of development: tuning, training and testing. It allows developers to fine-tune individual sensor models, supplement real-world datasets to improve AI training, and explore wide-ranging safety scenarios that would be impractical or unsafe to conduct physically.
It is important to say that synthetic data doesn’t replace real-world testing; it complements it. The advantage is that simulations can scale quickly, explore rare edge cases and ultimately accelerate development while reducing costs, all before anything physical has been manufactured. The key is correlation, and you need real-world data for this. We need to ensure that the synthetic data truly represents the real world so that engineers can trust it.
What are the typical time and cost requirements to model sensors, and how is this changing?

Traditionally, developing and validating new sensor models has been a lengthy and costly process. This is in part due to the need to collect real-world data. In the original Sim4CAMSens project, the collaborators undertook extensive real-world correlation exercises, including two winter testing campaigns in the Scottish Highlands. A suite of sensors was set up on a test range with static targets. They experienced a wide range of weather, including rain, snow, hail, fog and even the occasional sunny day. More than 120TB of data was collected overall.
Now that we have gathered that data to correlate and improve our simulations, customers don’t need to. In AV Elevate we have created validated generic sensor models as well as digital twins of commercially available sensors. Using these as a basis, our customers can create usable digital sensor models in a matter of months rather than years.
Which conditions have the greatest impact on sensor performance?
That is exactly what we are exploring in Sim4CAMSens2. Once we know the answer, developers can focus their efforts on what has the most impact. For example, historically, camera performance has been judged against human visual quality metrics – what an image ‘looks like’ to us. But what really matters is how a perception system interprets that data. The project is therefore examining conditions that degrade perception system performance, such as scratches or dirt on a camera lens, changes in exposure settings, motion blur and atmospheric effects like fog. By correlating real and simulated degraded conditions, the team will be able to define which factors genuinely affect system performance and which do not, helping developers focus their efforts where it matters most.
What are some of the key challenges involved in simulating in-cabin sensing/perception systems?
In-cabin sensing presents a different set of challenges compared with exterior perception. For example, many systems use infrared cameras for low-light monitoring, so we are developing new sensor models to simulate this. Short-range radar is also commonly used, and this presents some complexity. Inside a car, target objects, such as passengers, are in close proximity to other reflective surfaces, like seats and subframes. This creates multiple reflections and situations where sensors ‘see through’ objects.
Another challenge is the level of detail required. Exterior systems focus on detecting other vehicles, pedestrians or road signs – relatively large elements that are easy to distinguish from each other. In-cabin monitoring, however, often needs to recognize smaller human features such as eye movement or posture. This demands higher-fidelity modeling of people and vehicle interiors.
What new applications might interior sensing enable to improve road safety?

In-cabin sensing is central to the next generation of safety features. It underpins the new 2026 Euro NCAP protocols on driver drowsiness detection and occupant monitoring. These systems can identify whether a driver is attentive, detect if a child has been left unattended in a vehicle or optimize which airbags and restraints should be deployed in a crash, based on occupant size and posture. Looking further ahead, interior sensing also supports autonomous mobility services. For example, verifying passenger readiness before a shuttle departs is essential for safe operation.
Beyond safety, it also opens the door to new in-cabin experiences, such as personalizing infotainment or even optimizing premium audio systems to a passenger’s seating position.
Beyond this project, how is rFpro supporting customers with their simulation needs?
rFpro’s simulation platform is used across the automotive industry, from vehicle dynamics and ADAS development to autonomous systems and human-machine interface studies. At the more extreme end of the scale, we also supply our solution to the majority of professional motorsport teams competing in the top-tier series.
As the industry continues to move toward virtual testing and validation, our goal is to provide a single simulation environment that spans the full vehicle development process. As vehicles become increasingly software-defined and interconnected, that unified platform is more valuable than ever. rFpro allows manufacturers and suppliers to develop, test and validate a wide range of systems using a consistent virtual world.