
Aurora Voices with Magnus Wrenninge

Meet our team | February 05, 2020 | 4 min. read

By the Aurora Team

 

Meet Magnus Wrenninge, an Academy Award-winning Software Engineer from our Perception Simulation team.

At the heart of Aurora’s technology and mission are the individuals behind it. In our series, Aurora Voices, we share the unique voices and stories of the people of Aurora. We’ll focus on Aurorans from all backgrounds, showcasing their personal and professional experiences.

With the award season in full swing and the Oscars right around the corner, we’re thrilled to introduce Magnus Wrenninge, a Software Engineer from our Perception Simulation team, who also happens to be a distinguished visual effects technical director. Magnus was honored with a Technical Achievement Award at the 2015 Scientific and Technical Awards presented by the Academy of Motion Picture Arts and Sciences.

Read on to learn about Magnus’s experience from working behind the scenes of Hollywood to working behind the virtual wheel of self-driving cars.

You’ve worked at Pixar, Sony, and even started your own company. How did you go from movies to self-driving cars?

Magnus: Before joining Aurora, I was a rendering architect at Pixar Animation Studios and Sony Imageworks, where I did research and production work on films such as Alice in Wonderland, Inside Out, and Toy Story 4. One of my friends had just entered the self-driving car space and he suggested I look into the field. My initial reaction was: That sounds fascinating, but I don’t know much about autonomous vehicles.

After some research, I discovered that simulation is a major component of the development process for autonomous vehicles. Simulations are virtual models of the world where we can change the parameters to test how the vehicle would react in variations of the same situation. For example, we might vary the speed of virtual oncoming traffic as the AV makes an unprotected left-hand turn. Creating simulations is very similar to the work I was doing for film: creating high-quality imagery for animations. In both roles, I could use computer graphics to make realistic images.
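(To make the idea of parameter variation concrete, here is a minimal, purely illustrative Python sketch of that kind of sweep. The scenario and simulator names are hypothetical stand-ins, not Aurora’s actual tooling.)

```python
# Illustrative only: these scenario and simulator names are hypothetical,
# not Aurora's tooling.
from dataclasses import dataclass

@dataclass
class LeftTurnScenario:
    """One variation of an unprotected left turn."""
    oncoming_speed_mps: float  # speed of the virtual oncoming car
    oncoming_gap_s: float      # time gap before that car reaches the junction

def run_simulation(scenario: LeftTurnScenario) -> bool:
    """Stand-in for a simulator run: in this toy model, the turn 'passes'
    if the oncoming car is still far enough away."""
    return scenario.oncoming_gap_s * scenario.oncoming_speed_mps > 30.0  # toy rule

# The same situation, swept across parameter variations.
for speed in (5.0, 10.0, 15.0, 20.0):   # m/s
    for gap in (2.0, 4.0, 6.0):         # seconds
        ok = run_simulation(LeftTurnScenario(speed, gap))
        print(f"speed={speed:5.1f} m/s  gap={gap:.1f} s  ->  {'pass' if ok else 'review'}")
```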

Once I made this connection, I wanted to dig deeper and investigate how computer simulations could benefit self-driving cars. So I helped launch a simulation startup called 7D Labs. Our startup made photorealistic synthetic data sets of street scenes and researched the effect that image realism has when computer graphics are used to train machine learning systems.

What attracted you to Aurora and our leadership?

Magnus: Chris [Urmson] was committed to making simulation for the Aurora Driver meaningful and accurate, and I was impressed with Chris’s understanding of our work at 7D Labs. Specifically, he recognized the significance and challenges of our research, asking thoughtful questions that demonstrated his knowledge of the area.

After several meetings, Chris and I discussed bringing the work of 7D Labs into Aurora. In my experience, many companies seek to solve their simulation problems with a click of a button — an easy fix. Fortunately, Chris realized that this is a difficult research problem and that it is imperative to approach it the right way, which is often not the easiest way. Chris shared our vision of what using computer graphics for machine learning should look like. We were in perfect alignment.

What do you do at Aurora?

Magnus: I help lead Aurora’s perception simulation team. Our perception system is responsible for analyzing sensor data (captured from lidar, radar, and cameras), recognizing important objects in that data (pedestrians, vehicles, etc.), and tracking those objects’ movements. Our motion planning system uses this information to make decisions about what the Aurora Driver should do on the road.

To test how well our perception system can identify and categorize objects, my team creates 1) a virtual 3D environment for the Aurora Driver to drive in, and 2) virtual sensor data that is fed into the perception system.

My job is to ensure that the virtual world and virtual sensor data are as realistic as possible. The goal is to create simulated data that is indistinguishable from the real world.
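(As a rough illustration of that test loop, and not Aurora’s actual code, the Python sketch below renders synthetic sensor frames from a virtual scene with known ground truth, runs them through a perception system, and scores how many of the placed objects were recognized. Every name in it is a hypothetical stand-in.)

```python
# Conceptual sketch of the perception test loop described above.
# The scene object, render_sensor_frame, detect, and matches are all
# hypothetical stand-ins, not Aurora's interfaces.

def evaluate_perception(scene, perception_system, num_frames=100):
    """Render virtual sensor data from a scene with known ground truth,
    run it through the perception system, and report recall."""
    hits, total = 0, 0
    for t in range(num_frames):
        frame = scene.render_sensor_frame(t)           # synthetic lidar/radar/camera data
        detections = perception_system.detect(frame)   # objects the system recognized
        truth = scene.ground_truth_objects(t)          # objects we actually placed
        hits += sum(1 for obj in truth
                    if any(d.matches(obj) for d in detections))
        total += len(truth)
    return hits / max(total, 1)   # fraction of placed objects correctly found
```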

Here is an example of what the perception simulation team is creating at Aurora. This is a simulated environment where we used scanned models of pedestrians. The entire scene is lit by a high-dynamic-range sky, which gives the virtual environment a natural balance between lit and shaded areas. The simulation even accounts for a sensor’s limited dynamic range and for the motion blur that is present in all camera data.
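(For readers curious how those last two camera effects might be modeled, here is a simplified numpy sketch, assuming a hypothetical render_hdr_frame() renderer; it is not Aurora’s implementation.)

```python
# Simplified model of two camera effects: motion blur (integrating the HDR
# signal over the shutter interval) and limited dynamic range (clipping).
# render_hdr_frame is a hypothetical renderer returning an HDR image array.
import numpy as np

def simulate_camera(render_hdr_frame, t, shutter_s=0.01, substeps=8, full_well=1.0):
    # Motion blur: average several renders spread across the shutter interval.
    times = t + np.linspace(0.0, shutter_s, substeps)
    blurred = np.mean([render_hdr_frame(ti) for ti in times], axis=0)

    # Limited dynamic range: bright regions saturate at the sensor's full-well level.
    clipped = np.clip(blurred, 0.0, full_well)

    # Simple gamma encoding to 8 bits, standing in for the rest of the camera pipeline.
    return (255.0 * (clipped / full_well) ** (1.0 / 2.2)).astype(np.uint8)
```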

What motivates you to do the work you’re doing?

Magnus: At Aurora, I’m doing the most interesting type of work in this field. I have the privilege of working on cutting-edge technology and exploring new ground every day. How is that not motivating? It’s as good as it gets!

You received a Technical Achievement Award at the 2015 Scientific and Technical Awards for an innovation called Field3D. What does it mean for someone to receive this award?

(The Technical Achievement Award is one of three types of Scientific and Technical Awards given by the Academy of Motion Picture Arts and Sciences.)

Magnus: It’s an honor to receive the Technical Achievement Award. To qualify, the technology must contribute to the progress of the motion picture industry.

I have also served on the selection committee in the past, which is composed of the top leaders in various technical fields in the motion picture industry. I know firsthand how rigorous the selection process for the Sci-Tech Awards is, so to have received this recognition from the Academy is a great honor.

What is Field3D?

Magnus: Field3D is an open-source file format for storing volumetric data for computer graphics: things such as fire, smoke, and explosions. Field3D had an industry-wide impact because it was shared with the entire film industry rather than kept within certain studios. This is one of the reasons it was recognized by the Academy.
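(Field3D itself is a C++ library; the short numpy sketch below is only meant to illustrate the kind of dense voxel data such a format stores, not the Field3D API.)

```python
# Field3D is a C++ library, so this numpy sketch only illustrates the kind of
# data a volumetric format stores: a dense voxel grid of smoke density.
import numpy as np

res = (64, 64, 64)                       # voxel resolution of the volume
density = np.zeros(res, dtype=np.float32)

# Fill a soft spherical puff of "smoke" in the middle of the grid.
z, y, x = np.indices(res)
center = np.array(res) / 2.0
r = np.sqrt((x - center[2])**2 + (y - center[1])**2 + (z - center[0])**2)
density[r < 20] = np.exp(-r[r < 20] / 10.0)

# A volumetric file format stores grids like this, plus metadata such as the
# grid-to-world transform, so a renderer can ray-march fire, smoke, or explosions.
np.save("smoke_density.npy", density)    # stand-in for writing an actual .f3d file
```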

 

 

Other than being an Oscar winner, tell us something about yourself that would surprise us.

Magnus: I studied microbiology and biochemistry in high school, and I came within an inch of pursuing that as my profession. At the same time, I always had a passion for computer graphics. When I was applying for university, I changed my first-choice major to computer graphics on the day of the application deadline. I didn’t think much of it at the time, but looking back, it was a sliding-doors moment.

What is your favorite movie?

Magnus: The City of Lost Children. I’ve always liked Jean-Pierre Jeunet’s films, in particular for his creative image-making process.

If Hollywood made a movie about your life, who would you like to see cast as you?

Magnus: Daniel Radcliffe. I would want to make things extra magical.

What is the best thing about working at Aurora?

Magnus: The work I do at Aurora not only suits my expertise, but it also has an extremely important societal value in terms of revolutionizing transportation safety. I have the opportunity to solve one of the biggest technical challenges of our time. And the people are great!

We’re hiring perception and simulation engineers to help us solve one of the greatest technical challenges of our generation. Visit our Careers page to learn more!

The Aurora Team

Delivering the benefits of self-driving technology safely, quickly, and broadly.