Meet Magnus Wrenninge, an Academy Award-winning Software Engineer from our Perception Simulation team.
At the heart of Aurora’s technology and mission are the people behind it. In our series, Aurora Voices, we share their unique stories, focusing on Aurorans from all backgrounds and showcasing their personal and professional experiences.
With awards season in full swing and the Oscars right around the corner, we’re thrilled to introduce Magnus Wrenninge, a Software Engineer on our Perception Simulation team who also happens to be a distinguished visual effects technical director. Magnus was honored with a Technical Achievement Award at the 2015 Scientific and Technical Awards presented by the Academy of Motion Picture Arts and Sciences.
Read on to learn about Magnus’s journey from working behind the scenes in Hollywood to sitting behind the virtual wheel of self-driving cars.
You’ve worked at Pixar, Sony, and even started your own company. How did you go from movies to self-driving cars?
Magnus: Before joining Aurora, I was a rendering architect at Pixar Animation Studios and Sony Imageworks, where I did research and production work on films such as Alice in Wonderland, Inside Out, and Toy Story 4. One of my friends had just entered the self-driving car space and he suggested I look into the field. My initial reaction was: That sounds fascinating — but I don’t know much about autonomous vehicles.
After some research, I discovered that simulation is a major component of the development process for autonomous vehicles. Simulations are virtual models of the world in which we can change parameters to test how the vehicle would react to variations of the same situation. For example, we might vary the speed of virtual oncoming traffic as the AV makes an unprotected left turn. Creating simulations is very similar to the work I was doing in film: producing high-quality imagery for animation. In both roles, I could use computer graphics to make realistic images.
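To make that parameter-sweep idea concrete, here is a minimal Python sketch of re-running one scenario while varying a single parameter. Everything in it (the `UnprotectedLeftTurn` scenario, the toy `simulate` policy, and the numbers) is a hypothetical illustration, not Aurora’s simulator.

```python
from dataclasses import dataclass

# Hypothetical illustration: sweep one scenario parameter (oncoming speed)
# across repeated simulated runs of the same unprotected left turn.

@dataclass
class UnprotectedLeftTurn:
    oncoming_speed_mps: float      # speed of the virtual oncoming traffic
    oncoming_gap_m: float = 40.0   # initial gap to the oncoming vehicle

def simulate(scenario: UnprotectedLeftTurn) -> bool:
    """Stand-in for a full simulator: the toy AV commits to the turn only
    if the oncoming gap closes more slowly than the turn takes to complete."""
    turn_duration_s = 4.0  # assumed time to clear the intersection
    time_to_conflict_s = scenario.oncoming_gap_m / scenario.oncoming_speed_mps
    return time_to_conflict_s > turn_duration_s

# Re-run the identical situation, changing only the oncoming traffic's speed.
for speed in (5.0, 10.0, 15.0, 20.0):
    committed = simulate(UnprotectedLeftTurn(oncoming_speed_mps=speed))
    print(f"oncoming speed {speed:5.1f} m/s -> turn committed: {committed}")
```

In a real simulator the sweep would cover many parameters at once (speeds, gaps, weather, sensor noise), but the principle is the same: one scenario, many controlled variations.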
Once I made this connection, I wanted to dig deeper and investigate how computer simulations could benefit self-driving cars. So I helped launch a simulation startup called 7D Labs. Our startup made photorealistic synthetic data sets of street scenes and researched how image realism affects machine learning systems trained on computer-generated imagery.
What attracted you to Aurora and our leadership?
Magnus: Chris [Urmson] was committed to making simulation for the Aurora Driver meaningful and accurate, and I was impressed by his understanding of our work at 7D Labs. Specifically, he recognized the significance and the challenges of our research, asking thoughtful questions that demonstrated his knowledge of the area.
After several meetings, Chris and I discussed bringing the work of 7D Labs into Aurora. In my experience, many companies seek to solve their simulation problems with a click of a button — an easy fix. Fortunately, Chris realized that this is a difficult research problem and that it is imperative to approach it the right way, which is often not the easiest way. Chris shared our vision of what using computer graphics for machine learning should look like. We were in perfect alignment.
What do you do at Aurora?
Magnus: I help lead Aurora’s perception simulation team. Our perception system is responsible for analyzing sensor data (captured from lidar, radar, and cameras), recognizing important objects in that data (pedestrians, vehicles, etc.), and tracking those objects’ movements. Our motion planning system uses this information to make decisions about what the Aurora Driver should do on the road.
To test how well our perception system can identify and categorize objects, my team creates 1) a virtual 3D environment for the Aurora Driver to drive in, and 2) virtual sensor data that is fed into the perception system.
My job is to ensure that the virtual world and virtual sensor data are as realistic as possible. The goal is to create simulated data that is indistinguishable from the real world.
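One reason simulated data is so useful for testing perception is that the ground truth comes for free: because the world is virtual, every object’s exact class and position is already known, so the system’s detections can be scored directly against it. The sketch below illustrates that idea; the types and the `perceive` callback are hypothetical stand-ins, not Aurora’s actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: score a perception system against the exact ground
# truth that a virtual world provides. All names here are illustrative.

@dataclass
class Object3D:
    label: str   # e.g. "pedestrian", "vehicle"
    x: float     # position in the vehicle frame, meters
    y: float

@dataclass
class SimFrame:
    sensor_data: bytes             # stand-in for rendered lidar/radar/camera data
    ground_truth: list[Object3D]   # exact labels, free because the scene is virtual

def evaluate(perceive: Callable[[bytes], list[Object3D]],
             frames: list[SimFrame],
             match_radius_m: float = 1.0) -> float:
    """Return the fraction of ground-truth objects the perception system recovers."""
    found = total = 0
    for frame in frames:
        detections = perceive(frame.sensor_data)
        for gt in frame.ground_truth:
            total += 1
            # A detection counts if it has the right label and lands nearby.
            if any(d.label == gt.label
                   and abs(d.x - gt.x) <= match_radius_m
                   and abs(d.y - gt.y) <= match_radius_m
                   for d in detections):
                found += 1
    return found / total if total else 1.0
```

A metric this crude is only a sketch; real evaluation would use 3D bounding-box overlap, track continuity over time, and per-sensor analysis. But the underlying leverage is the same: a virtual world hands you perfect labels to measure against.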