As on the first day, the second day of the Autonomous Vehicle Test & Development Symposium had two sessions running simultaneously, ensuring a packed program. The first presentation in Room B on Wednesday was given by Vince Socci, who shared National Instruments’ take on validating vision sensors for ADAS.
The presentation kicked off the ‘Vision – Sensors and Lidar Test and Development’ session with a solid foundation in the testing challenges and how to solve them. After all, safe autonomous vehicles require clear vision of the world around them. The vehicle’s ‘eyes’ (radars, cameras, lidars, ultrasound, V2X) and ‘brain’ (ECU, sensor fusion) need a vision test to ensure they pick up any hazards and are able to navigate traffic safely and comfortably. Socci said, “No doubt about it, the sensors and systems of ADAS are the stepping stones to make true AVs happen. If we can’t see the world, we can’t behave in the world.”
He noted that “each of the sensor types has its areas of high performance and maybe areas of poor performance, but you need a vehicle to see everything, which is why there is a movement toward a combination of sensors.”
He stressed that developing and testing sensor fusion was therefore a critical component for AVs: “The big thing about sensor fusion is actually synchronization. The reason you look at cameras and radars together is that a camera is pretty good at telling you what something is, but pretty lousy at telling you where it is. A radar is pretty good at telling you where something is, but pretty lousy at telling you what it is. If your data is not synchronized, your ground truth is in jeopardy.”
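Socci’s synchronization point can be illustrated with a toy timestamp-matching step: take the ‘what’ from the camera and the ‘where’ from the radar, but only when the two detections were captured close enough in time. This is a minimal sketch, not National Instruments’ implementation; all class names, field names, and the tolerance value are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    t: float         # capture timestamp in seconds (illustrative)
    label: str       # what the object is, e.g. "pedestrian"

@dataclass
class RadarDetection:
    t: float         # capture timestamp in seconds (illustrative)
    range_m: float   # where the object is: distance in meters
    azimuth_deg: float

def fuse(camera, radar, tol=0.05):
    """Pair each camera detection with the nearest-in-time radar detection;
    discard pairs whose timestamps differ by more than tol seconds, since
    unsynchronized data would corrupt the fused 'ground truth'."""
    fused = []
    for c in camera:
        nearest = min(radar, key=lambda r: abs(r.t - c.t), default=None)
        if nearest is not None and abs(nearest.t - c.t) <= tol:
            fused.append((c.label, nearest.range_m, nearest.azimuth_deg))
    return fused

cam = [CameraDetection(0.00, "pedestrian"), CameraDetection(0.10, "car")]
rad = [RadarDetection(0.01, 12.5, -3.0), RadarDetection(0.30, 40.0, 5.0)]
print(fuse(cam, rad))  # → [('pedestrian', 12.5, -3.0)]
```

The second camera detection is dropped because its nearest radar return is 0.09 s away, outside the 50 ms tolerance: exactly the failure mode Socci warns about when sensor clocks drift apart.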