Just a few years ago, Berkeley engineers showed how they could easily turn images into a navigable 3D scene using a technology called Neural Radiance Fields, or NeRF. Now, another team of Berkeley researchers has created a development framework to help speed up NeRF projects and make the technology more accessible to others.
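
At its core, a NeRF is a learned function from a 3D position (and viewing direction) to a color and a density, and each pixel is rendered by compositing samples along the camera ray passing through it. The toy sketch below illustrates that compositing step in plain NumPy, with a hard-coded "field" standing in for the trained neural network; it is purely illustrative and is not the Berkeley team's code:

```python
import numpy as np

def toy_radiance_field(points):
    """Stand-in for a trained network: maps 3D points to (density, RGB).
    Here, a soft shell of radius 1 around the origin with a fixed color."""
    dist = np.linalg.norm(points, axis=-1)
    density = 5.0 * np.exp(-4.0 * (dist - 1.0) ** 2)   # peaks on the shell
    rgb = np.ones_like(points) * np.array([0.8, 0.3, 0.2])
    return density, rgb

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=128):
    """NeRF-style volume rendering: sample points along the ray, then
    alpha-composite their colors weighted by accumulated transmittance."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction
    density, rgb = toy_radiance_field(points)
    delta = np.diff(t, append=t[-1] + (t[1] - t[0]))   # sample spacing
    alpha = 1.0 - np.exp(-density * delta)             # per-segment opacity
    transmittance = np.cumprod(1.0 - alpha + 1e-10)
    transmittance = np.roll(transmittance, 1)
    transmittance[0] = 1.0                             # nothing blocks the first sample
    weights = transmittance * alpha
    return (weights[:, None] * rgb).sum(axis=0)        # final pixel color

pixel = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
print(pixel)  # composited RGB for one camera ray
```

Training a NeRF amounts to fitting the network so that rays rendered this way reproduce the input photographs; novel viewpoints then come for free.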

Led by Angjoo Kanazawa, assistant professor of electrical engineering and computer sciences, the researchers have developed Nerfstudio, a Python framework that provides plug-and-play components for implementing NeRF-based methods, making it easier to collaborate and incorporate NeRF into projects. Kanazawa and her team will present their paper on Nerfstudio at SIGGRAPH 2023.
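
To give a sense of what "plug-and-play" means in practice, here is a rough sketch of composing a training configuration from interchangeable parts. It is loosely modeled on the configuration pattern in Nerfstudio's public documentation; the exact module paths and parameter names here are assumptions and may differ across versions of the library:

```python
# Sketch only: follows nerfstudio's documented config pattern, but module
# paths and parameters may vary between releases -- treat as illustrative.
from nerfstudio.engine.trainer import TrainerConfig
from nerfstudio.pipelines.base_pipeline import VanillaPipelineConfig
from nerfstudio.data.datamanagers.base_datamanager import VanillaDataManagerConfig
from nerfstudio.models.nerfacto import NerfactoModelConfig

# Each component (data manager, model, pipeline, trainer) is a swappable
# module; substituting a custom model config is the "plug-and-play" step.
my_method = TrainerConfig(
    method_name="my-nerf-variant",       # name the method would be invoked by
    max_num_iterations=30_000,
    pipeline=VanillaPipelineConfig(
        datamanager=VanillaDataManagerConfig(),
        model=NerfactoModelConfig(),     # swap in your own model config here
    ),
)
```

Registered this way, a custom method can be trained and visualized with the same tooling as the framework's built-in methods, which is what lets new research plug into the shared ecosystem.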

“Advancements in NeRF have contributed to its growing popularity and use in applications such as computer vision, robotics, visual effects and gaming. But support for development has been lagging,” said Kanazawa. “The Nerfstudio framework is intended to simplify the development of custom NeRF methods, the processing of real-world data and interaction with reconstructions.”

This new framework is already helping a wide cross-section of engineers who employ interactive computer graphics in their work, specifically those seeking to create 3D reconstructions of real-world settings. This includes roboticists who use NeRF for manipulation, motion planning, simulation and mapping, as well as gaming studios and news outlets that use interactive graphics to tell stories.

“Researchers as well as industry groups are now using Nerfstudio because it provides an open-source framework, along with the latest NeRF research. It makes it easier for people to begin using NeRFs without having to start from scratch,” said Matt Tancik, the paper’s lead author and a Ph.D. student in Kanazawa’s lab. “So even if you’re doing cutting-edge research, just having this as a baseline, or a starting point, can speed things up a lot.”

Since the introduction of NeRF, researchers worldwide have been working to improve the core technology, from speeding up image rendering and training to developing new editing features. They have also been trying to make NeRF work in new situations, such as when lighting changes between photos or when objects move within a scene. But this work is often performed by research groups in proprietary repositories, making it difficult to share these contributions with the larger NeRF community.

Nerfstudio addresses these challenges by providing a modular framework that “consolidates these research innovations.” In addition, it fosters “community-driven development” by making the associated code and data publicly available through open-source licensing.

“We set out to create a platform in which people can create new modules and techniques that others can then use,” said Tancik. “Ultimately, the goal is for Nerfstudio to be an open-source community project that researchers will feel interested in working with and also helping to push further.”

Presently, 20 Berkeley engineers are actively contributing to Nerfstudio and helping to maintain it. And as many as 100 people outside the university have already contributed to the core code since its launch in October 2022.

Nerfstudio also enables users to easily run NeRFs on real-world data they collect, a common challenge for developers. At the same time, it makes this technology more accessible to users without NeRF expertise, such as special effects studios and social media users.
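
Concretely, for user-captured photos the documented workflow is a two-step pipeline: estimate camera poses, then train. The sketch below drives the project's command-line tools from Python; the paths are placeholders, and flags may change between releases:

```python
# Minimal sketch of the capture-to-reconstruction workflow.
# Paths are placeholders; flag names may differ across nerfstudio versions.
import subprocess

# 1. Estimate camera poses for a folder of photos (wraps COLMAP).
subprocess.run(
    ["ns-process-data", "images",
     "--data", "my_photos/", "--output-dir", "processed/"],
    check=True,
)

# 2. Train the default 'nerfacto' model on the processed data; training
#    launches a web viewer so the reconstruction can be inspected live.
subprocess.run(
    ["ns-train", "nerfacto", "--data", "processed/"],
    check=True,
)
```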

“It’s kind of exciting that everything is out in the open,” said Tancik. “It’s incorporating the cutting-edge research you have, with both researchers wanting to push it forward and people who just want to use the tech.”

The study’s other co-authors are Ethan Weber, Evonne Ng, Ruilong Li, Brent Yi, Justin Kerr, Terrance Wang, Alexander Kristoffersen, Jake Austin, Kamyar Salahi, Abhik Ahuja and David McAllister.

Learn more:

Plenoxels convert 2D images into navigable, photorealistic 3D worlds in minutes

NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis

This article first appeared on UC Berkeley's College of Engineering website.