
Universal Scene Description explained


The concept of the metaverse was first introduced by science fiction author Neal Stephenson in the 1990s, and it is now ushering in a new era of design and simulation across a wide range of industries. Complex environments for feature film production on LED stages and digital twins are two examples of the large, high-fidelity scenes being used to accurately simulate the real world.


The data models and pipelines needed to represent these virtual 3D worlds in full fidelity are huge and complex. USD (Universal Scene Description) is an open, highly scalable, interoperable and extensible description, composition, simulation and collaboration framework and ecosystem that is destined to become the common standard.

With USD, layers that describe properties can be combined with sparse overrides to modify source data non-destructively. Geometry, shading, lighting, and non-graphical properties are composed in a USD layer stack, and layer stacks can be stitched together into a USD stage: a composed world that brings all of the contributing layers together. The stage can then be fed into any number of subsystems, including rendering and physics, to simulate real-world behaviours.
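
As a rough illustration, the sketch below builds that kind of layer stack with USD's Python API (the pxr module): a base layer defines a sphere, and a second layer sublayers it and authors a sparse radius override without touching the source. The layer tags, prim path and values are illustrative assumptions, not something from the original post.

    from pxr import Sdf, Usd, UsdGeom

    # Base layer: the source asset, e.g. as delivered by a modelling team.
    base = Sdf.Layer.CreateAnonymous("base.usda")
    base_stage = Usd.Stage.Open(base)
    ball = UsdGeom.Sphere.Define(base_stage, "/World/Ball")
    ball.GetRadiusAttr().Set(1.0)

    # Override layer: sublayers the base and will carry only sparse edits.
    override = Sdf.Layer.CreateAnonymous("override.usda")
    override.subLayerPaths.append(base.identifier)

    # Open a stage rooted at the override layer. By default, new opinions are
    # authored into that root layer, so the base layer is never modified.
    stage = Usd.Stage.Open(override)
    UsdGeom.Sphere(stage.GetPrimAtPath("/World/Ball")).GetRadiusAttr().Set(2.0)

    print(stage.GetPrimAtPath("/World/Ball").GetAttribute("radius").Get())       # 2.0 (composed result)
    print(base_stage.GetPrimAtPath("/World/Ball").GetAttribute("radius").Get())  # 1.0 (source intact)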

USD’s four unique capabilities

The USD framework, ecosystem, and interchange paradigm “supports a wide variety of properties to define and render objects, while making scene structuring and editing more efficient and collaborative with sparse, non-destructive authoring”. Four unique capabilities enable USD to be highly extensible and to meet the demands of virtual worlds:

  1. Composition Engine: The composition engine enables the assembly of data from numerous sources as individual layers and the authoring of sparse, non-destructive overrides on top of those layers. This preserves the integrity of source data ingested in the USD ecosystem.
  2. Custom Schemas: The data model of USD is fully extensible with custom schemas. USD itself comes bundled with core schemas such as geometry and shading, and NVIDIA has worked with Pixar and Apple to create physics schemas for rigid bodies (a short example follows this list). Other custom schemas are being explored to further expand the ecosystem for digital twins and virtual worlds.
  3. Asset Resolver & Data Storage: USD is entirely filesystem agnostic: scene data is not tied to the filesystem or to any other particular persistent store, and it can even be procedurally generated. This is made possible by USD’s plugin system for asset resolvers and file formats.
  4. Hydra: Through Hydra, a generalised pipeline for custom renderers, USD can feed different renderers out of the box, such as Pixar’s Storm and RenderMan (see the sketch after this list). NVIDIA has also integrated real-time ray tracing and physically accurate rendering support with Hydra.
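
For the custom-schema point above, here is a minimal sketch of applying the UsdPhysics schemas to an ordinary geometry prim, assuming a USD build recent enough to ship them (21.05 or later); the prim path and mass value are illustrative.

    from pxr import Usd, UsdGeom, UsdPhysics

    stage = Usd.Stage.CreateInMemory()

    # A plain piece of geometry, authored with the core UsdGeom schema.
    crate = UsdGeom.Cube.Define(stage, "/World/Crate")

    # Applied physics schemas layered onto the same prim: it remains a
    # UsdGeom.Cube, it simply gains rigid-body, collision and mass properties.
    UsdPhysics.RigidBodyAPI.Apply(crate.GetPrim())
    UsdPhysics.CollisionAPI.Apply(crate.GetPrim())
    UsdPhysics.MassAPI.Apply(crate.GetPrim()).CreateMassAttr(10.0)

    print(stage.GetRootLayer().ExportToString())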

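And for Hydra, the short sketch below asks a USD build which render delegates it has registered, assuming the build includes the imaging modules (UsdImagingGL). Pixar's Storm ships with USD itself; other delegates, such as RenderMan or NVIDIA's RTX renderer, only appear if their plugins are installed.

    from pxr import UsdImagingGL

    # Each token names a Hydra render delegate registered with this USD build.
    for plugin_id in UsdImagingGL.Engine.GetRendererPlugins():
        print(plugin_id)
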
If you would like advice, guidance and support on how to get the most out of USD and incorporate it into how you create complex 3D virtual environments, we’re here to help – contact@mondatum.com.

Source: NVIDIA Omniverse blog post on Medium
