Google Absorbs Intrinsic Robotics to Build the Android for Robots
Google announced on February 28, 2026 that it is absorbing Intrinsic, Alphabet's robotics software company, from the "Other Bets" division into the main Google organization, with the explicit goal of building the "Android of robots."
Founded in 2021 within X Labs and spun out in 2024, Intrinsic develops a cross-platform software layer that lets manufacturers use robots from different vendors through a unified interface. The platform abstracts away the complexity of integrating robotic arms, cameras, sensors, and AI models from dozens of manufacturers.
Under the new structure, Google plans to combine Intrinsic's robotics software with:
- Gemini AI models for natural-language instruction and reasoning
- Google DeepMind research for physical manipulation and planning
- Google Cloud for scalable deployment across manufacturing and logistics
The integration parallels Google's earlier consolidation of Google Brain into DeepMind in 2023. Intrinsic had previously collaborated with NVIDIA on robotics simulation via the Isaac platform, and those partnerships are expected to continue.
Google's move arrives as physical AI surges: Figure AI, Agility Robotics, Unitree, and Boston Dynamics are all expanding commercial robot deployments in 2026. By positioning Intrinsic as an open platform spanning manufacturers, Google aims to capture the software layer of the robotics market the same way Android dominates mobile operating systems.
Source: CNBC
Related Articles
On 2026-02-25, Google announced that Intrinsic is joining Google, and Intrinsic said it will operate as a distinct group inside Google. The companies positioned the move as a way to combine Intrinsic's robotics software with Gemini models, Google Cloud, and close collaboration with Google DeepMind for faster industrial deployment.
NVIDIA announced its Open Physical AI Data Factory Blueprint on March 16, 2026 to speed development for robotics, vision AI agents, and autonomous vehicles. The blueprint is designed to turn limited real-world data into larger, more diverse training pipelines with synthetic generation and automated evaluation.
Generalist says GEN-1 crosses a commercial threshold for simple physical tasks by combining higher success rates, faster execution, and lower task-specific robot data requirements.