There are limits to how much meaningful data can be efficiently gathered from on-road driving, so we’ve developed a proprietary and highly accurate Virtual Testing Suite that helps develop, test, and validate our self-driving technology at a scale that would be impossible in the physical world.
During the fourth quarter of 2021, our team leveraged our increasingly powerful supercomputer to run approximately five million simulations per day – modeling how energy and light move through the world and enabling our team to accurately simulate camera, conventional lidar, and FMCW lidar data.
In the video above, our Virtual Testing Suite places debris in the road ahead of our autonomous vehicle, which then reacts by moving over within the lane to avoid it – providing learnings that can be applied to real-world driving.
Our Virtual Testing Suite’s high-fidelity virtual worlds simulate physically accurate sensor performance and actor behavior using data-driven procedural generation. This presents the Aurora Driver with millions of variations of on-road events we’ve seen, imagined, or created – helping train against edge cases, catch errors early, and develop new capabilities well before our software is loaded onto our autonomous vehicles.
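The idea of data-driven procedural generation can be illustrated with a minimal sketch: starting from one observed on-road event (debris in the lane, as in the video above), sample scenario parameters to produce many distinct variations for testing. All class names, parameters, and value ranges below are hypothetical illustrations, not Aurora's actual implementation.

```python
import random
from dataclasses import dataclass

# Hypothetical sketch of procedural scenario variation.
# One base event (debris in the lane) is varied by sampling
# its parameters; ranges here are illustrative only.

@dataclass
class DebrisScenario:
    distance_m: float         # distance from vehicle to debris
    lateral_offset_m: float   # debris position within the lane
    debris_width_m: float     # size of the debris
    vehicle_speed_mps: float  # ego vehicle speed at scenario start

def generate_variations(n: int, seed: int = 0) -> list[DebrisScenario]:
    """Sample n variations of the base debris-in-lane scenario."""
    rng = random.Random(seed)  # seeded for reproducible test runs
    return [
        DebrisScenario(
            distance_m=rng.uniform(30.0, 150.0),
            lateral_offset_m=rng.uniform(-1.5, 1.5),
            debris_width_m=rng.uniform(0.2, 1.2),
            vehicle_speed_mps=rng.uniform(10.0, 30.0),
        )
        for _ in range(n)
    ]

scenarios = generate_variations(5)
for s in scenarios:
    print(s)
```

Seeding the generator keeps each batch reproducible, so a failure found in simulation can be replayed exactly; in practice the sampling would be driven by distributions fit to real on-road data rather than the uniform ranges shown here.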
This approach also accelerates the development of our hardware; for example, we used our FMCW simulator to evaluate how new hardware designs performed in approximately 20,000 simulated scenarios before manufacturing real-world prototypes.
Safety Case Framework Progress: Completing our Vehicle Operator Safety Case