By Chris Urmson
October 28, 2021

From Cars (the movie) to Self-Driving: Pixar veterans join Aurora to advance simulation efforts

Aurora’s Virtual Testing Suite has been an important part of our company, and today we’re accelerating our simulation efforts by welcoming the team from Colrspace, a creative technology start-up made up of Pixar veterans behind the computer-generated imagery (CGI) magic of iconic movie series like “Toy Story” and “Cars.”

Colrspace’s team bridges state-of-the-art computer graphics and machine learning with a data-driven approach to reconstructing 3D objects and materials, deployable in real-world environments. As we continue to leverage and expand our Virtual Testing Suite, Colrspace’s technology will bring scalability and increased accuracy to the high-fidelity virtual worlds that underpin our unique sensor simulation capabilities. Ultimately, Colrspace’s team and technology will enable us to move even faster in developing simulation and machine learning tools, accelerating our progress toward delivering the Aurora Driver.

With our industry-leading Virtual Testing Suite, Aurora runs millions of simulations every day. This allows us to train and evaluate the Aurora Driver’s software stack across a vast range of scenarios and driving conditions, finding edge cases and catching errors early, well before the software is loaded onto vehicles. Ultimately, simulation testing drives the development of the Aurora Driver. It is the quickest and safest way to train and test our self-driving technology, estimated to be equivalent to more than 50,000 trucks driving continuously. This, combined with thoughtful on-road testing, will allow us to deliver the Aurora Driver safely and quickly at scale.

As for Colrspace, its core innovation is Protocolr, which infers the texture and other material properties of an object from an input image. Key to the process is a neural network that models the camera’s image-processing pipeline, coupled with a differentiable image renderer to enable “inverse rendering”: computing the 3D scene and materials that would produce an image identical to the input photo. Aurora believes this technology will provide a unique advantage in building virtual worlds that are almost indistinguishable from the real one. This is critical because the more realistic the virtual world, the more effective the testing can be.
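
For the technically curious, here is a minimal sketch of what “inverse rendering” through a differentiable renderer can look like: render an image from a guessed material, compare it against the target photo, and let gradients flowing back through the renderer refine the guess until the two match. The toy Lambertian renderer, the parameter names, and the PyTorch setup below are illustrative assumptions, not Colrspace’s actual Protocolr pipeline, which additionally couples a neural model of the camera’s processing pipeline with its renderer.

```python
import torch

def render(albedo, light_dir, normals):
    # Toy differentiable renderer: Lambertian shading of a known normal map.
    shade = (normals @ light_dir).clamp(min=0.0)   # (H, W) cosine term
    return shade.unsqueeze(-1) * albedo            # (H, W, 3) rendered image

# Hypothetical "scene": known geometry and lighting, unknown material color.
H, W = 64, 64
normals = torch.nn.functional.normalize(torch.randn(H, W, 3), dim=-1)
light_dir = torch.nn.functional.normalize(torch.tensor([0.3, 0.4, 0.85]), dim=0)
target = render(torch.tensor([0.7, 0.2, 0.1]), light_dir, normals)  # the "photo"

# Inverse rendering: optimize the material so the rendering matches the photo,
# using gradients that flow back through the renderer itself.
albedo = torch.nn.Parameter(torch.tensor([0.5, 0.5, 0.5]))
optimizer = torch.optim.Adam([albedo], lr=0.05)
for _ in range(200):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(render(albedo, light_dir, normals), target)
    loss.backward()
    optimizer.step()

print("recovered albedo:", albedo.detach())  # converges toward [0.7, 0.2, 0.1]
```

In a real system the scene representation is far richer than a single color, but the optimize-through-the-renderer loop is the core idea behind recovering 3D scenes and materials from photos.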
