Into the Omniverse: Manufacturing’s Simulation-First Era Has Arrived
Source: NVIDIA AI Blog
Editor’s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners, and enterprises can transform their workflows using the latest advances in OpenUSD and NVIDIA Omniverse.
Manufacturing’s traditional design‑build‑test cycle rested on a single assumption: real‑world testing was the only reliable test environment.
That assumption is now shifting.
Today, high‑fidelity simulation produces synthetic training data accurate enough for production‑grade AI, enabling perception systems, reasoning models, and agentic workflows to perform reliably in live factory environments.
OpenUSD has emerged as the connective standard that makes this practical, and the manufacturers building on it are already reporting measurable results.
SimReady: The Content Standard for Physical AI
As physical AI becomes integral to industrial operations, manufacturers face a foundational challenge: assets don’t travel reliably between 3D pipelines. Every time an asset moves from a computer‑aided design (CAD) tool to a simulation platform, physics properties, geometry, and metadata are lost—forcing teams to rebuild from scratch.
SimReady is the content standard, built on OpenUSD, that defines what physically accurate 3D assets must contain to work reliably across:
- Rendering pipelines
- Simulation pipelines
- AI‑training pipelines
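In practice, a SimReady‑style asset is a USD layer that carries physics metadata alongside its geometry, so downstream tools don't have to rebuild those properties. A minimal sketch of the idea, authoring the `.usda` text with only the Python standard library (the attribute names follow the UsdPhysics schema; the exact required set is defined by the SimReady specification, not by this example):

```python
from pathlib import Path

# A tiny USD layer describing a rigid-body "Pallet" asset. The physics
# API schemas and attributes travel with the geometry, which is what lets
# the same file work in rendering, simulation, and AI-training pipelines.
ASSET = """#usda 1.0
(
    defaultPrim = "Pallet"
)

def Xform "Pallet" (
    prepend apiSchemas = ["PhysicsRigidBodyAPI", "PhysicsMassAPI"]
)
{
    bool physics:rigidBodyEnabled = true
    float physics:mass = 18.5

    def Cube "Geom"
    {
        double size = 1.0
    }
}
"""

Path("pallet_simready.usda").write_text(ASSET)
```

A real pipeline would author this through the USD API rather than raw text, but the point stands: the physics properties are part of the asset itself, not of any one tool.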
In addition, the NVIDIA Omniverse libraries provide a physics‑accurate, photorealistic simulation layer where AI models are trained and validated before deployment.
Explore NVIDIA Omniverse libraries
Three Ways Manufacturers Are Putting the NVIDIA Physical AI Stack to Work
1. ABB Robotics Closes the Sim‑to‑Real Gap at 99% Accuracy
- Integration: ABB Robotics embedded NVIDIA Omniverse libraries directly into RobotStudio HyperReality, its simulation platform used by more than 60,000 engineers worldwide.
- Digital Twin: Robot stations are represented as USD files that run the same firmware as the physical machines, enabling training, part‑tolerance testing, and AI‑model validation before a production line exists.
- Synthetic Variations: Lighting, geometry, and other conditions can be generated at scale, covering scenarios that are impractical to replicate manually.
- Result: “We’ve managed to vertically integrate the complete technology stack and optimize it to a point where we’re now achieving 99% accuracy on the simulated version,” said Craig McDonnell, Managing Director of Business Line Industries at ABB Robotics.
- Downstream Benefits:
- Up to 50% reduction in product‑introduction cycles
- Up to 80% reduction in commissioning time
- 30-40% reduction in total equipment lifecycle cost
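The synthetic‑variation step in ABB's workflow is essentially domain randomization: sampling lighting, geometry, and pose perturbations so the trained model sees far more conditions than a physical cell could stage. A minimal, tool‑agnostic sketch (the parameter names and ranges are illustrative, not the RobotStudio or Omniverse API):

```python
import random

def sample_scene_variation(seed=None):
    """Randomize lighting and part-pose parameters for one synthetic scene.

    Names and ranges are illustrative; a real pipeline would drive the
    renderer/simulator with these values to produce labeled images.
    """
    rng = random.Random(seed)
    return {
        "light_intensity_lux": rng.uniform(200.0, 2000.0),
        "light_temperature_k": rng.uniform(3000.0, 6500.0),
        "part_offset_mm": [rng.gauss(0.0, 0.5) for _ in range(3)],
        "part_rotation_deg": rng.uniform(-5.0, 5.0),
        "camera_jitter_px": rng.uniform(0.0, 2.0),
    }

# Generate a batch of variations covering conditions that would be
# impractical to replicate manually on a physical line. Seeding each
# sample keeps the dataset reproducible.
variations = [sample_scene_variation(seed=i) for i in range(1000)]
```

Seeding each draw is the design choice worth noting: it makes a synthetic dataset reproducible, so a model regression can be traced back to the exact scenes it was trained on.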
2. JLR Compresses Four Hours of Aerodynamic Simulation to One Minute
- Approach: JLR applied a simulation‑first workflow to vehicle aerodynamics.
- Data: Engineers trained neural surrogate models on more than 20,000 CFD simulations correlated with wind‑tunnel data; 95% of aero‑thermal workloads now run on NVIDIA GPUs.
- Tool: The Neural Concept Design Lab—built on Omniverse and deployed at JLR—visualizes aerodynamic changes in real time as designers tweak vehicle geometry, turning a sequential “design‑then‑simulate” process into a continuous loop.
- Outcome: Tasks that once took four hours now finish in one minute.
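The core idea behind a surrogate model is replacing an expensive solver call with a fast approximation fitted to precomputed results. JLR's surrogates are neural networks; as a toy stand‑in for the same lookup‑instead‑of‑solve pattern, here is a nearest‑neighbor sketch over a few made‑up (design, drag) pairs:

```python
import math

# Precomputed CFD results stand in for hours of solver time. Each entry
# maps a design point (ride_height_mm, spoiler_angle_deg) to a drag
# coefficient. All numbers here are invented for illustration.
cfd_database = [
    ((120.0, 5.0), 0.31),
    ((130.0, 8.0), 0.29),
    ((140.0, 12.0), 0.33),
]

def predict_drag(design):
    """Return the drag coefficient of the closest precomputed design.

    A real surrogate interpolates smoothly (e.g. a neural network);
    nearest-neighbor just makes the speed/accuracy trade visible.
    """
    nearest = min(cfd_database, key=lambda item: math.dist(design, item[0]))
    return nearest[1]

print(predict_drag((125.0, 6.0)))  # prints 0.31
```

The prediction costs microseconds instead of a full CFD run, which is what turns a sequential "design‑then‑simulate" process into a continuous loop; accuracy then depends entirely on how densely the training simulations cover the design space.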
3. Tulip Brings Real‑Time Factory Intelligence to Terex for Operational Gains
- Challenge: Once a factory is in production, intelligence needs go beyond simulation.
- Solution: Tulip Interfaces’ Factory Playback platform layers intelligence onto existing infrastructure, converting operation records into actionable insights.
- Architecture: Built on the NVIDIA Metropolis VSS Blueprint, it extracts structured intelligence from camera feeds, machine‑sensor data, and operational context into a unified timeline.
- AI Layer: Uses NVIDIA Cosmos Reason vision‑language models to interpret camera streams and operator behavior in real time, running on‑premises on NVIDIA GPUs.
- Deployment: Implemented at Terex, a global industrial‑equipment maker with more than 40 plants.
- Projected Gains:
- 3% increase in yield
- 10% reduction in rework
- Quote: “I am excited to see what manufacturers will do with the power of AI to augment their daily capabilities,” said Rony Kubat, Co‑founder & CIO of Tulip Interfaces.
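The "unified timeline" in Tulip's architecture boils down to merging independently time‑stamped streams (camera detections, machine sensors, operational context) into one chronological feed that downstream models can reason over. A minimal sketch with the standard library (the stream contents and field names are illustrative, not the Metropolis VSS schema):

```python
import heapq

# Each stream is already sorted by timestamp (seconds). heapq.merge
# performs an efficient k-way merge without re-sorting everything.
camera_events = [(10.2, "camera", "operator at station 3"),
                 (14.8, "camera", "part picked")]
sensor_events = [(9.5, "sensor", "spindle load 72%"),
                 (12.1, "sensor", "torque spike")]
context_events = [(11.0, "mes", "work order 4411 started")]

# One chronological feed across all sources.
timeline = list(heapq.merge(camera_events, sensor_events, context_events))

for ts, source, event in timeline:
    print(f"{ts:6.1f}s [{source}] {event}")
```

Once events from every source share one clock‑ordered sequence, questions like "what was the machine doing when the camera saw the defect?" become simple window queries rather than cross‑system joins.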
Getting Started
SimReady assets, Omniverse libraries, and the NVIDIA physical AI stack provide a foundation that developers can adopt, extend, and combine across industrial applications. Quick links to get started:
- See physical AI in action – Learn how NVIDIA and partners are using Physical AI on the factory floor at the Hannover Messe.
- Free, self‑paced courses – Start building autonomous robots, digital twins, and AI‑powered systems with these courses.
- Isaac Sim & Omniverse libraries – Explore them on the NVIDIA Developer Portal.
- Metropolis VSS Blueprint – Deploy the Video Search & Summarization Blueprint on existing camera infrastructure to gain new shop‑floor insights.
- SimReady Foundation – Review the specification framework on GitHub.
- Cosmos Cookbook – Browse domain‑specific Physical AI recipes in the NVIDIA Cosmos Cookbook.
- Omniverse developer hub – Access the full suite of resources at the Omniverse Hub.
- Community – Join the Discord community to connect with fellow developers and innovators.