Into the Omniverse: Physical AI Open Models and Frameworks Advance Robots and Autonomous Systems
Source: NVIDIA AI Blog
Editor’s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners, and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse.
Open Source Accelerates Robotics and Autonomy
Open source has become essential for driving innovation in robotics and autonomy. By providing access to critical infrastructure—from simulation frameworks to AI models—NVIDIA is enabling collaborative development that accelerates the path to safer, more capable autonomous systems.
At CES 2026, NVIDIA introduced a new suite of open physical AI models and frameworks to accelerate the development of:
- Humanoids
- Autonomous vehicles
- Other physical‑AI embodiments
These tools span the entire robotics development lifecycle—from high‑fidelity world simulation and synthetic data generation to cloud‑native orchestration and edge deployment—giving developers a modular toolkit to build autonomous systems that can reason, learn, and act in the real world.
OpenUSD provides the common framework that standardizes how 3D data is shared across these physical‑AI tools, enabling developers to build accurate digital twins and reuse them seamlessly from simulation to deployment. NVIDIA Omniverse libraries, built on OpenUSD, serve as the source of ground‑truth simulation that feeds the entire stack.
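In practice, that shared representation is layered USD scene description that any USD-aware tool can read and compose. A minimal hand-authored `.usda` layer (the prim names here are illustrative, not taken from any NVIDIA asset) shows the idea:

```usda
#usda 1.0
(
    defaultPrim = "Robot"
    metersPerUnit = 1
    upAxis = "Z"
)

def Xform "Robot" (
    kind = "assembly"
)
{
    double3 xformOp:translate = (0, 0, 0)
    uniform token[] xformOpOrder = ["xformOp:translate"]

    def Mesh "Base"
    {
    }
}
```

Because the same layer can be referenced from Isaac Sim, Omniverse, or any other USD-aware application, a digital twin authored for simulation is the same asset reused through deployment.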
From Labs to the Show Floor
At CES 2026, developers brought the NVIDIA physical‑AI stack out of the lab and onto the show floor, debuting machines ranging from heavy equipment and factory assistants to social and service robots.
The stack taps into:
- NVIDIA Cosmos – a platform of world foundation models.
- NVIDIA Isaac technologies, including the new Isaac Lab‑Arena open‑source framework for policy evaluation.
- NVIDIA Alpamayo – open portfolio of AI models, simulation frameworks, and physical‑AI datasets for autonomous vehicles.
- NVIDIA OSMO – framework to orchestrate training across compute environments.
Caterpillar – Cat AI Assistant
- Solution: Cat AI Assistant, powered by NVIDIA Nemotron open models for agentic AI and running on the NVIDIA Jetson Thor edge AI module.
- Capabilities: Natural‑language interaction directly in the cab of heavy vehicles. Operators can ask “Hey Cat”‑style questions for step‑by‑step guidance and adjust safety parameters by voice.
- Behind the scenes: Caterpillar uses Omniverse libraries to build factory and job‑site digital twins that simulate layouts, traffic patterns, and multi‑machine workflows. Insights are fed back into equipment and fleets before deployment, making AI‑assisted operations safer and more efficient.
LEM Surgical – Dynamis Robotic Surgical System
- Status: FDA‑cleared and in routine clinical use for spinal procedures.
- Hardware: NVIDIA Jetson AGX Thor for compute, NVIDIA Holoscan for real‑time sensor processing, and NVIDIA Isaac for Healthcare to train autonomous arms.
- Data pipeline: Uses NVIDIA Cosmos Transfer (a fully customizable world model) to generate synthetic training data, and NVIDIA Isaac Sim for digital‑twin simulation.
- Benefits: Dual‑arm humanoid robot mimics surgeon dexterity, enabling complex spinal procedures with enhanced precision while reducing physical strain on surgeons and assistants.

NEURA Robotics
- Robots: 4NE1 humanoid and MiPA service robots.
- Training stack: Isaac Sim and Isaac Lab to train in OpenUSD‑based digital twins before deployment in homes and workplaces.
- Modeling: Post‑trained the Isaac GR00T‑Mimic model on top of the Isaac GR00T foundation.
- Enterprise integration: Collaborating with SAP and NVIDIA to embed SAP’s Joule agents using the Mega NVIDIA Omniverse Blueprint for large‑scale fleet simulation and refinement before rollout into the Neuraverse ecosystem.
AgiBot – Genie Envisioner (GE‑Sim)
- World‑model backbone: NVIDIA Cosmos Predict 2.
- Pipeline: Generates action‑conditioned videos grounded in strong visual and physical priors; combines this data with Isaac Sim and Isaac Lab, then post‑trains on AgiBot’s proprietary data.
- Outcome: Policies developed in Genie Envisioner transfer more reliably to Genie2 humanoids and compact Jetson Thor‑powered tabletop robots.
Intbot
- Model: NVIDIA Cosmos Reason 2 open model, providing a “sixth sense” for social robots.
- Use case: Identifies simple social cues and safety context beyond scripted tasks, enabling robots to decide when to speak and interact more naturally with humans.
- Reference: See the Cosmos Cookbook recipe for a demo of reasoning vision‑language models in action.
These showcases illustrate how the NVIDIA physical‑AI stack—spanning world models, simulation, and edge compute—is enabling a new generation of intelligent, autonomous machines across heavy industry, healthcare, and social robotics.
How Robotics Developers Are Using New Toolkits and Frameworks
1. NVIDIA Agile – A Sim‑to‑Real Engine for Humanoid Loco‑Manipulation
- What it is: An Isaac Lab‑based engine that bundles a complete, verified workflow for training robust reinforcement‑learning (RL) policies.
- Supported platforms: Unitree G1, LimX Dynamics TRON, and other humanoid robots.
- Key features
- Built‑in task configurations and Markov Decision Process (MDP) models.
- Training utilities and deterministic evaluation tools.
- Stress‑testing in Isaac Lab, followed by direct transfer to real‑world robots.
- Result: Faster, more reliable deployment of locomotion and whole‑body behaviors.
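The combination of a task configuration, an MDP, a training loop, and deterministic evaluation can be illustrated with a toy example. The sketch below is tabular Q-learning on a one-dimensional "reach the goal" MDP, a self-contained stand-in for the workflow, not Agile's or Isaac Lab's actual API:

```python
import random

# Toy 1-D MDP: the agent starts at state 0 and must reach the goal state
# by stepping left or right. An illustrative stand-in for a bundled task
# configuration; all names here are made up for this sketch.
N_STATES = 6
GOAL = N_STATES - 1
ACTIONS = (-1, +1)  # step left, step right

def env_step(state, action):
    """Deterministic transition: clamp to [0, GOAL]; reward 1 at the goal."""
    nxt = max(0, min(GOAL, state + action))
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            if rng.random() < eps:
                action = rng.choice(ACTIONS)
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt, reward, done = env_step(state, action)
            target = reward + gamma * max(q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (target - q[(state, action)])
            state = nxt
    return q

def evaluate(q, max_steps=20):
    """Deterministic evaluation: greedy rollout from the start state."""
    state, steps = 0, 0
    while state != GOAL and steps < max_steps:
        action = max(ACTIONS, key=lambda a: q[(state, a)])
        state, _, _ = env_step(state, action)
        steps += 1
    return state == GOAL, steps
```

With these hyperparameters the greedy policy reaches the goal in five steps; an engine like Agile scales the same train-then-evaluate loop to high-dimensional humanoid control with massively parallel simulation.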
2. Hugging Face × NVIDIA – Integrating Isaac GR00T N Models into LeRobot
- Integration points
- Access to Isaac GR00T N1.6 models.
- Direct use of Isaac Lab‑Arena from the LeRobot ecosystem.
- Benefits for developers
- One‑stop environment for policy training, evaluation, and benchmarking.
- Simplified workflow from simulation to deployment.
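The core of such a benchmarking workflow is running one policy across many tasks and aggregating success rates. As a rough sketch of that idea (the task names, thresholds, and `policy` callable are hypothetical, not the LeRobot or Isaac Lab‑Arena API):

```python
import statistics

# Illustrative evaluation harness: run one policy across several tasks
# and report per-task success rates.
def evaluate_policy(policy, tasks, episodes_per_task=10):
    """Return the success rate of `policy` on each task over repeated episodes."""
    results = {}
    for name, run_episode in tasks.items():
        successes = sum(1 for ep in range(episodes_per_task)
                        if run_episode(policy, ep))
        results[name] = successes / episodes_per_task
    return results

# Toy "episodes": each returns True when the policy's score clears the
# task's success threshold.
TASKS = {
    "pick_and_place": lambda policy, ep: policy("pick_and_place", ep) > 0.5,
    "drawer_open":    lambda policy, ep: policy("drawer_open", ep) > 0.3,
}

def scripted_policy(task, episode):
    """Stand-in for a learned policy's per-episode score."""
    return 0.6 if task == "pick_and_place" else 0.2

scores = evaluate_policy(scripted_policy, TASKS)
overall = statistics.mean(scores.values())
```

A real benchmark adds randomized initial conditions and many more tasks, but the shape of the loop, and the per-task success-rate report it produces, is the same.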
3. Reachy 2 Humanoid + NVIDIA Jetson Thor
- Open‑source robot: Reachy 2 (Hugging Face).
- Hardware partner: NVIDIA Jetson Thor.
- Capability: Deploy advanced Vision‑Language‑Action (VLA) models on‑board for robust real‑world performance.
4. ROBOTIS – Open‑Source Sim‑to‑Real Pipeline with NVIDIA Isaac
- Company profile: Leader in smart servos, industrial actuators, manipulators, open‑source humanoid platforms, and educational kits.
- Pipeline steps
- High‑fidelity data generation in Isaac Sim.
- Dataset scaling using GR00T‑Mimic for augmentation.
- Fine‑tuning a VLA‑based Isaac GR00T N model.
- Direct deployment to robot hardware.
- Outcome: Accelerated transition from simulation to robust real‑world tasks.
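The dataset-scaling step in pipelines like this multiplies a handful of demonstrations into many training trajectories. A minimal sketch of that idea, jittering demonstrated waypoints in the spirit of mimic-style augmentation (names and noise levels are illustrative, not the GR00T‑Mimic API):

```python
import random

# Sketch of demonstration scaling: produce many perturbed copies of one
# recorded trajectory to grow a small dataset.
def augment_demo(demo, n_variants, noise=0.01, seed=0):
    """Return n_variants jittered copies of one demonstrated trajectory."""
    rng = random.Random(seed)
    return [[[coord + rng.gauss(0.0, noise) for coord in waypoint]
             for waypoint in demo]
            for _ in range(n_variants)]

# One demonstration: a short trajectory of 3-DoF end-effector waypoints.
demo = [[0.0, 0.0, 0.10], [0.10, 0.0, 0.10], [0.20, 0.10, 0.0]]
dataset = [demo] + augment_demo(demo, n_variants=9)
```

Production systems generate variation in simulation (new scenes, lighting, and object poses) rather than simple noise, but the principle of expanding a few demonstrations into a large fine-tuning set is the same.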
Quick Links
- Reinforcement Learning (NVIDIA Glossary)
- LeRobot – Hugging Face Robotics Ecosystem
- ROBOTIS LinkedIn Post
These tools and collaborations are reshaping how robotics developers prototype, train, and ship intelligent agents—from simulation labs straight to the field.
Get Plugged In
Explore these resources to learn more about OpenUSD and robotics development:
- Read Building Generalist Humanoid Capabilities with NVIDIA Isaac & GR00T N1.6 – a technical blog on developing generalist humanoid capabilities.
- Read Simplify Generalist Robot Policy Evaluation in Simulation with NVIDIA Isaac Lab – Arena – a technical blog on evaluating robot policies in simulation.
- Learn how to post‑train Isaac GR00T with this two‑part video tutorial: YouTube link.
- Watch NVIDIA founder and CEO Jensen Huang’s CES special presentation: YouTube link.
- Improve your robotics development skills with the self‑paced Robotics Learning Path: NVIDIA Learning Path.
- Participate in the Cosmos Cookoff, a hands‑on physical AI challenge where developers use Cosmos Reason to power robotics, autonomous systems, and vision‑AI workflows: Cosmos Cookoff.