ABB Robotics and NVIDIA bring Omniverse into RobotStudio for industrial physical AI
Original: ABB Robotics Taps NVIDIA Omniverse to Deliver Industrial-Grade Physical AI at Scale
ABB Robotics and NVIDIA said on March 9, 2026 that they are integrating NVIDIA Omniverse libraries into ABB's RobotStudio suite, bringing physically accurate simulation and synthetic data generation into mainstream industrial robotics workflows. The resulting product, RobotStudio HyperReality, is scheduled for release in the second half of 2026.
The companies frame the announcement as an attempt to close the long-standing sim-to-real gap. ABB says the combined system can reach 99% correlation between simulation and real-world robot behavior because the virtual controller runs the same firmware as the physical robot. The product exports robot stations as USD into Omniverse, where engineers can test robots, sensors, lighting, kinematics, and parts before any physical deployment.
- Engineering time is projected to fall substantially.
- Deployment costs are projected to drop by up to 40%.
- Time to market is projected to improve by as much as 50%.
- Setup and commissioning time is projected to fall by up to 80% when production lines are validated virtually.
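The USD export workflow described above can be illustrated with a minimal stage file. This is a hand-written sketch, not ABB's actual export schema: the prim names (`RobotStation`, `InspectionCamera`, `CellLight`) and the referenced robot asset path are hypothetical, but the layering of robot, sensor, and lighting prims into one composable scene is the pattern Omniverse consumes.

```usda
#usda 1.0
(
    defaultPrim = "RobotStation"
    metersPerUnit = 1.0
    upAxis = "Z"
)

def Xform "RobotStation"
{
    # Robot geometry pulled in by reference, so the station file stays small
    def Xform "Robot" (
        references = @./robot_arm.usd@
    )
    {
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    # A vision sensor placed in the cell for virtual inspection tests
    def Camera "InspectionCamera"
    {
        float focalLength = 35
        double3 xformOp:translate = (1.2, 0, 1.5)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    # Lighting matters for synthetic vision data, so it is part of the scene
    def SphereLight "CellLight"
    {
        float inputs:intensity = 30000
        double3 xformOp:translate = (0, 0, 3)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because USD composes by reference, the same station file can be opened in Omniverse for rendering and in the virtualized controller for motion validation.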
ABB says more than 60,000 robotics engineers use RobotStudio, which makes this less of a lab demo and more of a platform-level move. Early pilots include Foxconn in consumer electronics assembly and Workr in small and medium-size manufacturing. NVIDIA also says ABB is exploring Jetson integration into its OmniCore controller for real-time inference across the robot portfolio.
The deal matters because industrial AI deployments usually fail on data quality, calibration, and validation. If the 99% sim-to-real claim holds in production, manufacturers get a faster way to train vision models on synthetic data, validate cells before installation, and ship robotic automation with far less physical trial and error.
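The synthetic-data angle above rests on randomizing scene conditions so a vision model sees more variation than a physical cell could produce. A minimal sketch of that idea, with entirely hypothetical parameter names and ranges (a real pipeline would feed these into a renderer such as Omniverse):

```python
import random

def sample_scene_params(seed=None):
    """Sample one randomized scene configuration for a synthetic image.

    Ranges are illustrative only; in practice they would be calibrated
    against the real cell's lighting, camera mount, and part fixturing.
    """
    rng = random.Random(seed)
    return {
        "light_intensity_lux": rng.uniform(200, 2000),   # vary cell lighting
        "camera_height_m": rng.uniform(0.8, 2.0),        # vary sensor pose
        "part_yaw_deg": rng.uniform(0, 360),             # vary part orientation
        "part_jitter_mm": (rng.gauss(0, 2), rng.gauss(0, 2)),  # placement noise
        "texture_id": rng.randrange(16),                 # swap surface textures
    }

# Seeded sampling makes each synthetic frame reproducible for debugging
dataset = [sample_scene_params(seed=i) for i in range(1000)]
```

The design choice worth noting is the per-frame seed: reproducible randomization lets engineers regenerate the exact scene behind any mislabeled or failing training image.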
Related Articles
NVIDIA on March 16, 2026 introduced its Physical AI Data Factory Blueprint, an open reference architecture for generating, augmenting, and evaluating training data for robotics, vision AI agents, and autonomous vehicles. The company says the stack combines Cosmos models, coding agents, and cloud infrastructure from partners such as Microsoft Azure and Nebius to lower the cost and time of physical AI training at scale.
NVIDIA said on March 20, 2026 that its Cosmos world foundation models have advanced again with Transfer 2.5, Predict 2.5, and Reason 2. The linked NVIDIA Technical Blog frames the update around higher-quality synthetic data, stronger long-tail scenario generation, and richer reasoning for robots and autonomous vehicles.