Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems
Source: NVIDIA AI Blog

Editor’s note: This post is part of the Into the Omniverse series, which explores how developers, 3‑D practitioners, and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse.
Physical AI is moving from research labs into the real world, powering intelligent robots and autonomous vehicles (AVs)—such as robotaxis—that must reliably sense, reason, and act amid unpredictable conditions.
To safely scale these systems, developers need workflows that connect real‑world data, high‑fidelity simulation, and robust AI models atop the common foundation provided by the OpenUSD framework.
The recently published OpenUSD Core Specification 1.0 (Universal Scene Description) now defines standard data types, file formats, and composition behaviors, giving developers predictable, interoperable USD pipelines as they scale autonomous systems.
Powered by OpenUSD, NVIDIA Omniverse libraries combine NVIDIA RTX rendering, physics simulation, and efficient runtimes to create digital twins and SimReady assets that accurately reflect real‑world environments for synthetic data generation and testing.
NVIDIA Cosmos world‑foundation models can run on top of these simulations to amplify data variation, generating new weather, lighting, and terrain conditions from the same scenes so teams can safely cover rare and challenging edge cases.
Watch the OpenUSD livestream (today at 11 a.m. PT) or view the replay—part of the NVIDIA Omniverse OpenUSD Insiders series.
In addition, advancements in synthetic data generation, multimodal datasets, and SimReady workflows are now converging with the NVIDIA Halos framework for AV safety, creating a standards‑based path to safer, faster, and more cost‑effective deployment of next‑generation autonomous machines.
Building the Foundation for Safe Physical AI
Open Standards and SimReady Assets
The OpenUSD Core Specification 1.0 establishes the standard data models and behaviors that underpin SimReady assets, enabling developers to build interoperable simulation pipelines for AI factories and robotics on OpenUSD.
Built on this foundation, SimReady 3‑D assets can be reused across tools and teams and loaded directly into NVIDIA Isaac Sim, where USDPhysics colliders, rigid‑body dynamics, and composition‑arc‑based variants let teams test robots in virtual facilities that closely mirror real operations.
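As an illustration of how those physics and variant behaviors are authored, here is a minimal sketch using the OpenUSD Python API (pxr). The prim paths, mass value, and variant names are placeholders for this example, not part of any actual SimReady asset.

```python
# Minimal sketch: authoring physics metadata and a variant set on a
# SimReady-style asset with the OpenUSD Python API.
# Prim paths, dimensions, and values here are illustrative only.
from pxr import Usd, UsdGeom, UsdPhysics, Sdf

stage = Usd.Stage.CreateNew("warehouse_shelf.usda")

# Root prim of the asset: a rigid body the simulator can move dynamically.
shelf = UsdGeom.Xform.Define(stage, "/Shelf")
UsdPhysics.RigidBodyAPI.Apply(shelf.GetPrim())
mass = UsdPhysics.MassAPI.Apply(shelf.GetPrim())
mass.CreateMassAttr(25.0)  # kilograms (illustrative value)

# Collision geometry so robots in Isaac Sim get realistic contacts.
body = UsdGeom.Cube.Define(stage, "/Shelf/CollisionBody")
body.CreateSizeAttr(1.0)
UsdPhysics.CollisionAPI.Apply(body.GetPrim())

# A composition-arc-based variant set, e.g. to swap wear state
# without duplicating the asset.
vset = shelf.GetPrim().GetVariantSets().AddVariantSet("condition")
for name in ("clean", "worn"):
    vset.AddVariant(name)

vset.SetVariantSelection("worn")
with vset.GetVariantEditContext():
    # Opinions authored here apply only when the "worn" variant is selected.
    shelf.GetPrim().CreateAttribute(
        "surface:roughness", Sdf.ValueTypeNames.Float
    ).Set(0.8)

stage.GetRootLayer().Save()
```

Because the physics schemas and variants are standard OpenUSD, any USD-aware tool can resolve them the same way the simulator does.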
Open‑Source Learning
The Learn OpenUSD curriculum is now open source and available on GitHub, allowing contributors to localize and adapt templates, exercises, and content for different audiences, languages, and use cases. This gives educators a ready‑made foundation to onboard new teams into OpenUSD‑centric simulation workflows.
Generative Worlds as a Safety Multiplier
- Gaussian splatting – a technique that uses editable 3‑D elements to render environments quickly and with high fidelity – and world models are accelerating simulation pipelines for safe robotics testing and validation.
- At SIGGRAPH Asia, the NVIDIA Research team introduced Play4D, a streaming pipeline that enables 4‑D Gaussian splatting to accurately render dynamic scenes and improve realism.
- Spatial‑intelligence company World Labs is using its Marble generative world model with NVIDIA Isaac Sim and Omniverse NuRec so researchers can turn text prompts and sample images into photorealistic, Gaussian‑based, physics‑ready 3‑D environments in hours instead of weeks.

These worlds can then be used for physical‑AI training, testing, and sim‑to‑real transfer. The high‑fidelity simulation workflow expands the range of scenarios robots can practice in while keeping experimentation safely in simulation.
Lightwheel Helps Teams Scale Robot Training with SimReady Assets
Powered by OpenUSD, Lightwheel’s SimReady asset library includes a common scene‑description layer, making it easy to assemble high‑fidelity digital twins for robots. The assets embed precise geometry, materials, and validated physical properties, and can be loaded directly into NVIDIA Isaac Sim and Isaac Lab for robot training. This enables robots to experience realistic contacts, dynamics, and sensor feedback as they learn.
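To show what a common scene‑description layer buys in practice, here is a minimal sketch that assembles a small digital twin purely by referencing assets with the OpenUSD Python API. The asset file names and prim paths are placeholders, not actual Lightwheel library paths.

```python
# Minimal sketch: assembling a digital-twin scene by referencing
# SimReady-style assets through plain OpenUSD composition.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("factory_twin.usda")
UsdGeom.Xform.Define(stage, "/World")

# Each asset is pulled in as a reference, so the geometry, materials,
# and physics properties authored in the asset file come along unchanged.
shelf = UsdGeom.Xform.Define(stage, "/World/Shelf_01")
shelf.GetPrim().GetReferences().AddReference("assets/warehouse_shelf.usd")

robot = UsdGeom.Xform.Define(stage, "/World/Robot")
robot.GetPrim().GetReferences().AddReference("assets/mobile_robot.usd")

# Place the referenced instances; the local transform is layered on top
# of whatever the asset itself defines.
UsdGeom.XformCommonAPI(shelf.GetPrim()).SetTranslate(Gf.Vec3d(2.0, 0.0, 0.0))
UsdGeom.XformCommonAPI(robot.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.0, 0.0))

stage.SetDefaultPrim(stage.GetPrimAtPath("/World"))
stage.GetRootLayer().Save()
```

Because the physics and material opinions live in the referenced asset files, a stage composed this way can be opened in Isaac Sim or Isaac Lab without re‑authoring them.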
End‑to‑End Autonomous Vehicle Safety
Advancements in end‑to‑end AV safety are accelerating thanks to new research, open frameworks, and inspection services that make validation more rigorous and scalable.
Key Innovations
- Sim2Val framework – NVIDIA researchers (with Harvard and Stanford) introduced a statistical method for combining real‑world and simulated test results, cutting the need for costly physical mileage while ensuring safe behavior in rare, safety‑critical scenarios. Read the paper: Sim2Val (arXiv). A generic sketch of the combine‑real‑and‑sim idea follows this list.
- NVIDIA Omniverse NuRec Fixer – an open‑source, Cosmos‑based model trained on AV data that removes artifacts in neural reconstructions, delivering higher‑quality SimReady assets.
- NVIDIA Halos AI Systems Inspection Lab – an ANAB‑accredited lab that provides impartial inspection and certification of Halos elements across robotaxi fleets, AV stacks, sensors, and manufacturer platforms. Learn more: Halos Certification Program. Watch: NVIDIA’s “Safety in the Loop” livestream (link in original source).
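For intuition only, the sketch below uses a textbook control‑variates estimator to show how a large pool of cheap simulated tests can tighten an estimate grounded in a small number of real‑world tests. It is not the Sim2Val method itself, and all data is synthetic.

```python
# Illustrative sketch only: a standard control-variates estimator for
# combining a small set of real-world test outcomes with a large set of
# cheap simulated outcomes. NOT the Sim2Val method; data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Large batch of simulated scenario outcomes (1 = failure, 0 = pass).
sim_all = rng.binomial(1, 0.05, size=20_000)

# Small paired subset where the same scenarios were also run in the real world.
n_paired = 200
sim_paired = sim_all[:n_paired]
real_paired = np.clip(sim_paired + rng.binomial(1, 0.01, size=n_paired), 0, 1)

# Use the cheap simulator as a proxy and correct it with the real/sim gap
# measured on the paired subset.
beta = 1.0  # could be fit from the paired data; fixed here for simplicity
combined = real_paired.mean() + beta * (sim_all.mean() - sim_paired.mean())

naive = real_paired.mean()  # estimate from the real tests alone
print(f"real-only estimate: {naive:.4f}, combined estimate: {combined:.4f}")
```

The combined estimate keeps the real‑world measurements as ground truth while borrowing the statistical strength of the much larger simulated sample, which is the general motivation for mixing the two sources.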
Early Adopters
| Company | Role |
|---|---|
| Bosch | Participant in Halos Inspection Lab |
| Nuro | Participant in Halos Inspection Lab |
| Wayve | Participant in Halos Inspection Lab |
| onsemi | First company to pass Halos inspection (sensor systems for AVs, industrial automation, medical) |
Ecosystem Integrations
- CARLA simulator – Integrates NVIDIA NuRec and Cosmos Transfer to generate reconstructed drives and diverse scenario variations.
- Voxel51 FiftyOne engine – Linked to Cosmos Dataset Search, NuRec, and Cosmos Transfer for curating, annotating, and evaluating multimodal datasets across the AV pipeline.
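As a small example of the curation step, the sketch below uses core FiftyOne APIs to tag AV frames with scenario metadata and pull out rare conditions. The Cosmos Dataset Search, NuRec, and Cosmos Transfer integrations themselves are not reproduced here, and the file paths and labels are placeholders.

```python
# Minimal sketch: curating and filtering an AV dataset with Voxel51's FiftyOne.
# Paths and labels are placeholders; integration-specific hooks are omitted.
import fiftyone as fo
from fiftyone import ViewField as F

dataset = fo.Dataset("av_frames_demo")

# Register a camera frame with scenario metadata and a sample annotation.
sample = fo.Sample(filepath="/data/drive_001/cam_front/000123.jpg")
sample["weather"] = "rain"
sample["time_of_day"] = "night"
sample["detections"] = fo.Detections(
    detections=[
        fo.Detection(label="pedestrian", bounding_box=[0.42, 0.55, 0.06, 0.18])
    ]
)
dataset.add_sample(sample)

# Slice out the rare, safety-critical conditions for targeted evaluation
# or for seeding synthetic variations.
hard_cases = dataset.match(
    (F("weather") == "rain") & (F("time_of_day") == "night")
)
print(f"{len(hard_cases)} rainy night frames selected for review")
```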
Academic Collaboration
- M‑City (University of Michigan) – Enhancing the digital twin of its 32‑acre AV test facility with Omniverse libraries.
  - Uses NVIDIA Blueprint for AV simulation and Omniverse Sensor RTX APIs to create physics‑based models of cameras, LiDAR, radar, and ultrasonic sensors.
  - Aligns real sensor recordings with high‑fidelity simulated data, enabling safe, repeatable testing of rare and hazardous scenarios before public‑road deployment.
Read more: M‑City digital twin upgrade
By combining rigorous statistical validation, open‑source reconstruction tools, and accredited inspection, the AV ecosystem is moving toward safer, large‑scale deployment of robotaxi fleets.
Get Plugged Into the World of OpenUSD and Physical AI Safety
Learn more about OpenUSD, NVIDIA Halos, and physical AI safety by exploring these resources:
- Watch the on‑demand NVIDIA GTC session, “Reconstructing Reality: Simulating Indoor and Outdoor Environments for Physical AI.”
- Visit the NVIDIA Halos AI Systems Inspection Lab webpage.
- Follow the NVIDIA DRIVE LinkedIn newsletter: “NVIDIA Safety in the Loop.”
- Read the corporate blog explainer: How AI Is Unlocking Level 4 Autonomy.
- Get started with the Learn OpenUSD curriculum (now open source).
Stay up to date by subscribing to NVIDIA news, joining the NVIDIA Omniverse community, and following NVIDIA Omniverse on social media.
Tags:
- Cosmos
- Into the Omniverse
- NVIDIA Blueprints
- NVIDIA Isaac Sim
- NVIDIA Omniverse
- Physical AI
- Synthetic Data Generation