While many robotics companies are focused on improving physical movement, a Silicon Valley startup is betting that the future of humanoid robots lies in software.
Highlights
- Software-First Strategy: OpenMind is building OM1, a hardware-agnostic, open-source OS for robots — similar in vision to Android for smartphones.
- Context-Aware Intelligence: OM1 aims to move robots from rigid task execution to adaptive, human-like interaction in public and personal environments.
- FABRIC Protocol: Enables decentralized communication, identity sharing, and peer-to-peer learning among robots, like a messaging system for machines.
- Developer-Friendly Tools: OM1 supports multiple form factors — humanoids, arms, quadrupeds — and includes WebSim, a visual debugger for rapid prototyping.
- Real-World Testing: Ten quadrupeds will be deployed in live environments by September; one OM1-powered humanoid already rang the NASDAQ bell.
- Educational Push: A K–12 partnership with Robostore brings Unitree G1 robots into classrooms as part of an AI-native curriculum rollout.
- LLM Integration: OM1 connects with models like GPT-4o and Gemini, allowing natural language to control robotic actions and evolve behaviors dynamically.
- Community Skepticism: Some devs critique OM1 as “just a wrapper” over existing systems, but OpenMind positions it as a foundational framework, not an end product.
- $20M Seed Round: Backed by Pantera, Ribbit, Coinbase Ventures and more, OpenMind is focused on scaling OM1 and accelerating real-world validation.
OpenMind, founded in 2024 by Stanford professor Jan Liphardt, is building OM1 — an open-source, hardware-agnostic operating system designed to standardize robotic intelligence across platforms.
OpenMind’s ambition is clear: to become for robots what Android is for smartphones — a universal foundation that powers a wide range of hardware through a shared software ecosystem.
The Goal of OM1
As robots expand beyond industrial environments into homes, schools, and public spaces, OpenMind sees the need for a shift in software architecture — from rigid task execution to adaptive, context-aware interaction.
OM1 is being built to serve as the cognitive layer for this next generation of robots. It is designed to be:
- Hardware-independent, supporting a variety of platforms (humanoids, quadrupeds, gripper arms, etc.)
- Open-source, encouraging community collaboration and transparency
- Privacy-conscious, with built-in architecture for secure data handling
According to Liphardt, robots must be able to collaborate and communicate with humans in increasingly nuanced ways.
“Machines are now able to interact with humans in ways I’ve never seen before,” he told TechCrunch. “We see ourselves as building a bridge between human and machine collaboration.”
FABRIC
One of OpenMind’s key innovations is FABRIC, a protocol designed to let robots verify identity, share context, and learn from one another in real time.
Unlike traditional robotics systems that operate in isolation, FABRIC supports peer-to-peer communication and adaptive behavior.
Liphardt draws a parallel to human communication systems like texting and calling. “Humans expect to interact with others anywhere in the world,” he explained. “Machines will be no different.”
OpenMind describes FABRIC as:
- A decentralized middleware layer, not just a communication protocol
- An on-chain identity system that builds trust between machines
- A mechanism for fast, distributed learning, where one robot’s experience can be instantly shared with others
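OpenMind has not published FABRIC's wire format, but the three properties above can be illustrated with a minimal sketch: a signed message envelope that lets a peer verify who sent a piece of shared context before adopting it. All names here (`FabricMessage`, the shared-key signing scheme) are hypothetical stand-ins, not the actual protocol, which per OpenMind uses on-chain identity rather than a shared secret.

```python
import hashlib
import hmac
import json
from dataclasses import dataclass, field


@dataclass
class FabricMessage:
    """Hypothetical envelope for robot-to-robot context sharing."""
    sender_id: str            # machine identity, e.g. a key fingerprint
    payload: dict             # shared context: hazards, poses, learned params
    signature: str = field(default="")

    def _body(self) -> bytes:
        # Canonical serialization so signer and verifier hash the same bytes.
        return json.dumps(
            {"sender_id": self.sender_id, "payload": self.payload},
            sort_keys=True,
        ).encode()

    def sign(self, secret: bytes) -> None:
        self.signature = hmac.new(secret, self._body(), hashlib.sha256).hexdigest()

    def verify(self, secret: bytes) -> bool:
        expected = hmac.new(secret, self._body(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, self.signature)


# One robot publishes a learned fact; a peer checks provenance before trusting it.
shared_key = b"demo-network-key"
msg = FabricMessage(sender_id="quadruped-07",
                    payload={"hazard": "wet_floor", "zone": "lobby"})
msg.sign(shared_key)
assert msg.verify(shared_key)
```

The point of the sketch is the trust step: a receiving robot rejects context whose signature does not check out, which is the precondition for the "one robot's experience instantly shared with others" model to be safe.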
Developer-Centric Design and Multi-Form Factor Support
OM1 is explicitly designed to be modular and developer-friendly. Its runtime, as detailed on GitHub, supports:
- Humanoids
- Quadrupeds
- Educational bots
- Mobile devices
- Robotic arms
Built in Python, OM1 includes plugin support for technologies like ROS 2, Zenoh, and CycloneDDS, and comes with a browser-based visual debugger called WebSim for prototyping and monitoring.
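Hardware-agnostic support across those form factors typically comes down to a driver abstraction: high-level commands stay the same while each platform supplies its own backend. The following is a sketch of that pattern, not OM1's actual API; `Actuator`, `dispatch`, and the driver classes are illustrative names.

```python
from abc import ABC, abstractmethod


class Actuator(ABC):
    """Hardware-agnostic action interface; each platform ships its own driver."""

    @abstractmethod
    def execute(self, action: str, **params) -> str:
        ...


class QuadrupedDriver(Actuator):
    def execute(self, action: str, **params) -> str:
        return f"quadruped: {action} {params}"


class ArmDriver(Actuator):
    def execute(self, action: str, **params) -> str:
        return f"arm: {action} {params}"


# Registry of installed platforms; a real runtime would discover these as plugins.
DRIVERS: dict[str, Actuator] = {
    "quadruped": QuadrupedDriver(),
    "arm": ArmDriver(),
}


def dispatch(platform: str, action: str, **params) -> str:
    # The same high-level command runs on any registered form factor.
    return DRIVERS[platform].execute(action, **params)
```

Under this design, adding a new form factor means registering one new driver class; nothing upstream of `dispatch` changes, which is what makes an "Android for robots" pitch plausible at the software level.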
From Schools to Wall Street
OpenMind is preparing for practical validation of its software. In September, it plans to deploy a fleet of ten robotic quadrupeds running OM1 into live environments to gather user feedback. Instead of waiting for perfection, the team is emphasizing rapid iteration based on real-world use.
In June 2025, a humanoid powered by OM1 rang the opening bell for an ETF listing on NASDAQ, signaling the company’s intent to place robots in highly visible, real-world scenarios.
OpenMind is collaborating with Robostore (Unitree’s U.S. distributor) to launch an AI-native curriculum using Unitree G1 robots across U.S. K–12 schools — an initiative aimed at familiarizing the next generation with embodied AI.
Connecting OM1 to LLMs
OpenMind has designed OM1 to integrate with popular large language models, including:
- GPT-4o
- Google’s Gemini
- DeepSeek (open-source)
These models can control robotic behavior through natural language outputs. OM1 routes LLM-generated text to ROS 2 events like “move,” “speak,” or “navigate,” enabling vision-language-action loops.
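A minimal sketch of that routing step might look like the following. The JSON action schema and the function name `route_llm_output` are assumptions for illustration; OM1's actual event format is not documented here, and a real system would publish the filtered events to ROS 2 topics rather than return them.

```python
import json

# Action verbs the runtime knows how to execute (mirroring the article's examples).
ACTIONS = {"move", "speak", "navigate"}


def route_llm_output(llm_text: str) -> list[dict]:
    """Parse a hypothetical JSON action list emitted by the model and keep
    only events with a recognized action verb."""
    try:
        events = json.loads(llm_text)
    except json.JSONDecodeError:
        # Fall back: treat unstructured model output as speech.
        return [{"action": "speak", "text": llm_text}]
    if not isinstance(events, list):
        return [{"action": "speak", "text": llm_text}]
    return [e for e in events if isinstance(e, dict) and e.get("action") in ACTIONS]


reply = '[{"action": "navigate", "target": "kitchen"}, {"action": "speak", "text": "On my way"}]'
for event in route_llm_output(reply):
    print(event)
```

The filtering step matters: since an LLM can emit arbitrary text, gating its output against a whitelist of executable actions is the usual safeguard before any command reaches hardware.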
Skepticism Around Depth of Innovation
On platforms like Reddit’s r/robotics, some developers have raised questions about the novelty of OM1.
A few describe it as “a wrapper” that relays sensor input to LLMs and sends basic commands back via ROS 2, questioning whether the platform adds meaningful intelligence or simply orchestrates existing tools.
One commenter put it bluntly: “It’s a wrapper calling a model behind the scenes.”
While this critique reflects healthy skepticism in the developer community, OpenMind maintains that OM1 is intended as a foundation — not a fully intelligent agent out of the box.
Funding and Roadmap
OpenMind recently secured a $20 million seed round led by Pantera Capital, with participation from Ribbit, Coinbase Ventures, Pebblebed, and other strategic investors. The company plans to use the funding to:
- Expand its development team
- Scale OM1 across additional platforms
- Accelerate real-world testing and feedback loops
Liphardt emphasized the importance of deploying early and often. “Our goal is to run as many tests as possible so we can identify where robotic capabilities align with human needs.”