
Introducing Adam-U: A Purpose-Built Humanoid Data-Collection Platform for Embodied AI

Operator using a motion-capture suit to teleoperate the Adam-U humanoid robot at the World Artificial Intelligence Conference 2025, demonstrating real-time embodied AI data collection with synchronized movement and tactile sensing.

During WAIC 2025 (the World Artificial Intelligence Conference), Noitom Robotics (NR), together with PNDbotics and Inspire Robotics, unveiled Adam-U, a humanoid data-collection platform designed for embodied intelligence. After a show-stopping debut at WAIC, Adam-U is making its second public appearance at the 2025 Shanghai Embodied Intelligent Robot Industry Conference & Expo (EAI Show), where we’re sharing full details for the first time.

See Adam-U at EAI Show

Booth: Hall N2 · 2A130
Dates: August 13–15
Venue: Shanghai New International Expo Centre (Pudong)

Born for data collection

Adam-U is built from the ground up for embodied-AI data capture. It integrates a 31-DoF bionic-architecture humanoid, tactile dexterous hand, and a low-latency, high-precision motion-capture teleoperation system. Out of the box, it captures synchronized multimodal data—motion, force-tactile, and visual—and includes a VR HMD for real-time, binocular vision feedback so operators can gather high-quality demonstrations fast.

Adam-U dramatically improves the usability of data collection and the yield from human demonstrations—fueling both reinforcement learning and imitation learning while closing the gap between human behavior and robot intelligence.

In embodied intelligence and humanoid robotics, the need for high-quality training data grows by the day. Data quality has become the bottleneck for training large embodied models. We’re honored to launch this product with outstanding partners, and we believe Adam-U will provide a trustworthy data foundation for research in this field.

— Dr. Ruoli (Tristan) Dai, Founder & CEO, Noitom Robotics

Five core advantages

31-DoF bionic design: Full-joint mobility in the head, waist, and both arms. The structure maps closely to human musculoskeletal mechanics for more faithful motion reproduction.
Real-time mocap data streaming: The natively integrated Noitom Robotics teleoperation stack outputs magnetically resilient, highly stable, high-precision motion data.
6-DoF tactile dexterous hand: Provided by Inspire Robotics for fine manipulation with tactile sensing, enabling complex task execution.
Developer-ready SDK: Compatible with ROS 2 and NVIDIA Isaac, with synchronized streams for vision, joint states, and tactile signals; ideal for research and rapid secondary development.
Lab-friendly configuration: Fixed base and standard power requirements. Introductory price: CNY 399,000 (approx. USD 45,000), a practical fit for universities and R&D teams.
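To give a sense of what consuming synchronized multimodal streams looks like in practice, here is a minimal, self-contained sketch of nearest-timestamp alignment between two sample streams (for example, joint states and tactile readings). This is an illustration only: the function name, sample payloads, rates, and tolerance are our own assumptions, not part of the Adam-U SDK.

```python
from bisect import bisect_left

def align_streams(reference, other, tolerance=0.005):
    """Pair each (timestamp, sample) in `reference` with the nearest
    sample in `other`, dropping pairs farther apart than `tolerance` (s)."""
    times = [t for t, _ in other]  # assumed sorted by timestamp
    pairs = []
    for t_ref, ref_sample in reference:
        i = bisect_left(times, t_ref)
        # Candidates: the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        if not candidates:
            continue
        j = min(candidates, key=lambda k: abs(times[k] - t_ref))
        if abs(times[j] - t_ref) <= tolerance:
            pairs.append((ref_sample, other[j][1]))
    return pairs

# Hypothetical 100 Hz joint-state stream and ~91 Hz tactile stream.
joints = [(k * 0.010, {"q": [0.0] * 31}) for k in range(5)]
tactile = [(k * 0.011, {"pressure": k}) for k in range(5)]
aligned = align_streams(joints, tactile, tolerance=0.002)
print(len(aligned))  # pairs within 2 ms of each other
```

In a ROS 2 deployment, the same idea is typically delegated to `message_filters.ApproximateTimeSynchronizer` rather than hand-rolled; the sketch above just makes the alignment logic explicit.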

From WAIC to EAI Show

Following its WAIC premiere, Adam-U returns to the spotlight August 13–15 at the Shanghai New International Expo Centre (Booth N2·2A130). Get a close look at Adam-U’s elegant mechanics—and try its mocap-powered teleoperation and multimodal data-capture workflow live.

Why Adam-U matters now

Across the industry, embodied AI is accelerating, and high-quality, multimodal training data is the limiting factor. Motion-capture-driven teleoperation and synchronized sensing are emerging as the clearest path to robust generalization. Adam-U operationalizes this approach, bringing reliable hardware, a real-time teleop stack, and a developer-first SDK together in a single platform.
Adam-U also reflects NR’s long-term vision for a mocap-based “data factory” for embodied AI—a full pipeline spanning high-precision capture, body-mapping, processing, and output that plugs cleanly into modern robotics stacks (C++/Python/ROS/Isaac).
At Noitom Robotics, we obsess over practical, end-to-end systems—speedy teleop pipelines, precise human-robot data collection, and tools that integrate with ROS, Isaac, and MuJoCo—so teams can ship results, not just papers. Adam-U is our next step toward that future.

Get in touch

Curious about pilots, research collaborations, or custom integrations?
Email us at: contact@noitomrobotics.com
