Release
Jan 1, 2021
Price
Price TBA
Connectivity
2
Status
Active
Speed
Stationary (upper body only)
Ameca
Engineered Arts' humanoid robot platform, designed for human-robot interaction research and public engagement. First revealed in December 2021 and debuted at CES 2022, Ameca went viral for its remarkably lifelike facial expressions. Now in its third generation (showcased at ICRA 2025), it is deployed at museums and institutions worldwide, including the Museum of the Future in Dubai and the National Robotarium in Edinburgh. It features grey rubber skin with a deliberately genderless design.
Listed price
Price TBA
Available for purchase or lease (contact sales)
Release window
Jan 1, 2021
Current status
Active
Engineered Arts
Last verified
Feb 23, 2026
Technical overview
Core specifications and system stack
A fast read on the mechanical profile, sensing package, and platform integrations behind Ameca.
Technical Specifications
Height
Not disclosed
Weight
Not disclosed
Battery Life
Not disclosed
Charging Time
Not disclosed
Max Speed
Stationary (upper body only)
Tech Components
Operational profile
How this robot is configured
Capabilities
7
Connectivity
2
Key capabilities
Ecosystem fit
Explore further
Benchmark set
Compare with similar robots
Shortcuts to the closest alternatives in the current ui44 set.
Research
ASIMO
Honda
Price TBA
Research
Asimov DIY Kit (Here Be Dragons Edition)
Menlo Research
$15,000
Research
QTrobot
LuxAI
$10,900
Research
DRC-HUBO+
KAIST
Price TBA
About the Ameca
The Ameca is a research robot built by Engineered Arts: a humanoid platform designed for human-robot interaction research and public engagement. First revealed in December 2021 and debuted at CES 2022, Ameca went viral for its remarkably lifelike facial expressions. Now in its third generation (showcased at ICRA 2025), it is deployed at museums and institutions worldwide, including the Museum of the Future in Dubai and the National Robotarium in Edinburgh. It features grey rubber skin with a deliberately genderless design.
Pricing has not been publicly disclosed. See all Engineered Arts robots on the Engineered Arts page.
Spec Breakdown
Detailed specifications for the Ameca
Maximum Speed
Stationary (upper body only). The Ameca's standard platform does not locomote: actuation is concentrated in the upper body (face, head, arms, and torso), a configuration suited to its primary operating environments and safety requirements.
The Ameca's intelligence backbone is compatible with OpenAI GPT models and also supports human telepresence. This AI platform powers the robot's decision-making, perception processing, and autonomous behavior. The sophistication of the AI stack directly impacts how well the robot handles unexpected situations and adapts to new environments.
Ameca Sensor Suite
The Ameca integrates 4 sensor types, forming the perceptual foundation that enables autonomous operation.
This sensor configuration enables the Ameca to perceive its environment and operate autonomously in its intended use cases. Multiple sensor modalities provide redundancy and more robust perception than any single sensor type alone.
Explore sensor technologies: components glossary · full components directory
Ameca Use Cases & Applications
Research robots serve as platforms for advancing robotics science and engineering. They enable researchers to test theories about locomotion, manipulation, perception, and human-robot interaction in controlled and real-world environments.
Capabilities That Enable Real-World Use
The Ameca offers 7 distinct capabilities, each contributing to the robot's practical utility.
These capabilities work together with the robot's 4 onboard sensor types and its AI platform, which is compatible with OpenAI GPT models and also supports human telepresence, to deliver practical, real-world performance.
Ecosystem Integration
The Ameca integrates with the following platforms and ecosystems, extending its utility beyond standalone operation.
This ecosystem compatibility enables the Ameca to work as part of a broader automation setup rather than operating in isolation.
Ameca Capabilities
7
Capabilities
4
Sensor Types
AI
Compatible with OpenAI GPT m…
Lifelike Facial Expressions
The Ameca's lifelike facial expression capability is its signature feature and the primary reason for its global recognition. Using dozens of individually controlled actuators beneath a specially engineered face covering, the robot can produce a remarkably wide range of human-like expressions — from subtle eyebrow raises and lip movements to complex emotional displays combining multiple facial regions. This level of expressiveness is achieved through a combination of mechanical engineering (miniature actuators with fine positional control), material science (flexible face coverings that deform naturally), and animation expertise (expression choreography that follows the principles of human facial dynamics). The result is a robot that elicits genuine emotional responses from human observers.
Natural Conversation
Natural conversation capability enables the Ameca to engage in fluid, contextual dialogue with humans. Unlike simple command-response systems, natural conversation involves understanding context, maintaining dialogue history, generating appropriate responses, and timing speech to match conversational flow. Engineered Arts achieves this through integration with large language models and speech processing systems that handle both speech recognition and synthesis. The combination of conversational AI with the Ameca's expressive face and gestures creates an interaction experience that feels substantially more engaging than talking to a disembodied voice assistant — the robot's physical presence and non-verbal cues add communication channels that voice-only systems lack.
Gesture Recognition
Gesture recognition allows the Ameca to interpret human body language and hand movements as communication signals. Using its camera systems and computer vision algorithms, the robot can detect and interpret pointing gestures, waves, nods, shakes, and other non-verbal cues that form a natural part of human communication. This capability is particularly important for interactive and research applications where natural communication extends beyond spoken language. Gesture recognition complements the Ameca's conversation capabilities by providing additional context about human intent and emotional state, enabling more nuanced and appropriate responses.
Articulated Arms and Hands
The Ameca's articulated arms and hands provide physical manipulation capability that extends its utility beyond pure social interaction. With multiple degrees of freedom in each arm and individually actuated fingers, the robot can reach for objects, perform demonstrative gestures, and interact physically with its environment. For a research and engagement platform, articulated manipulation enables demonstrations of human-robot handshake protocols, collaborative object manipulation studies, and physical computing interaction patterns that are central to current human-robot interaction research.
Human-Robot Interaction Research
As a dedicated human-robot interaction (HRI) research platform, the Ameca provides researchers with a sophisticated testbed for studying how people perceive, respond to, and interact with humanoid robots. The platform's combination of expressive face, articulated body, sensory systems, and AI integration makes it suitable for studies spanning psychology, computer science, engineering, and design. Research areas include emotional response measurement, trust calibration, non-verbal communication, persuasive robotics, and long-term interaction dynamics. Engineered Arts supports the research community through documented APIs and a development framework that enables custom experiment design.
Telepresence Operation
Telepresence operation allows a remote human operator to control the Ameca's movements, speech, and expressions in real time. This capability serves two purposes: it enables human-quality interaction at remote locations (museums, events, conferences) without requiring on-site AI sophistication, and it provides a fallback for situations where autonomous AI responses are inadequate. The blend of autonomous and telepresence modes is particularly valuable — the robot can handle routine interactions autonomously while a human operator takes over for complex or sensitive conversations. This hybrid approach ensures consistently high interaction quality across a wider range of situations.
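The hybrid handoff described above can be sketched as a simple arbitration rule. This is a hypothetical illustration only: the `Mode` names, `choose_mode` function, and confidence threshold are invented for this sketch, and Engineered Arts' actual handoff logic is not public.

```python
from enum import Enum

class Mode(Enum):
    AUTONOMOUS = "autonomous"
    TELEPRESENCE = "telepresence"

def choose_mode(confidence: float, operator_available: bool,
                threshold: float = 0.7) -> Mode:
    """Hand off to a human operator when the autonomous stack is unsure.

    Hypothetical arbitration sketch: low-confidence situations go to a
    remote operator when one is available; otherwise the robot stays
    autonomous (possibly in a degraded, scripted fashion).
    """
    if confidence < threshold and operator_available:
        return Mode.TELEPRESENCE
    return Mode.AUTONOMOUS

print(choose_mode(0.92, operator_available=True))   # Mode.AUTONOMOUS
print(choose_mode(0.40, operator_available=True))   # Mode.TELEPRESENCE
print(choose_mode(0.40, operator_available=False))  # Mode.AUTONOMOUS (degraded)
```

The key design choice is that telepresence is opt-in per interaction, so routine visitor exchanges never consume operator time.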
Public Engagement / Exhibition
The Ameca excels in public engagement and exhibition contexts — museums, science centers, corporate events, and trade shows. Its striking appearance and lifelike movements naturally draw attention and create memorable experiences for visitors. In exhibition settings, the robot can be programmed with context-specific knowledge (museum facts, product information, event details) and interact with a continuous stream of visitors throughout the day. The combination of visual impact, conversational ability, and expressive range makes the Ameca one of the most effective crowd-engagement robots available, explaining its deployment at prominent venues worldwide.
Connectivity & Integration
How the Ameca communicates with your network, smart home devices, cloud services, and companion apps.
Network & Communication Protocols
Ameca Technology Stack Overview
The Ameca by Engineered Arts integrates 7 distinct technology components across sensing, connectivity, intelligence, and interaction layers. The physical platform is stationary (upper body only), providing the foundation on which this technology stack operates.
Perception — 4 Sensor Types
The perception layer is built on Binocular Eye Cameras, Chest Camera, Embedded Microphones, Facial Recognition. These work in concert to give the robot a detailed understanding of its operating environment. This multi-sensor approach provides redundancy and enables the robot to function reliably even when individual sensors encounter challenging conditions such as low light, reflective surfaces, or cluttered spaces.
Connectivity — 2 Protocols
Intelligence — GPT-Compatible AI & Human Telepresence
The intelligence layer, compatible with OpenAI GPT models and with human telepresence as a fallback, serves as the computational brain: it processes sensor data, makes interaction decisions, and orchestrates the robot's autonomous behaviors. The quality of this AI platform directly influences how well the robot handles novel situations, adapts to changes in its environment, and improves its performance over time through learning.
Who Should Consider the Ameca?
Target Audience
Research robots are acquired by universities, government labs, and corporate R&D departments. They serve as experimental platforms for developing new algorithms, testing locomotion strategies, and advancing the field of robotics. Some are also used for educational purposes.
Key Considerations
Open-source software compatibility (ROS/ROS 2), sensor modularity, programmability, available SDK/API quality, community support, and published research papers using the platform are key factors. Documentation quality and the ability to modify both hardware and software are essential for research use.
Pricing
Availability
Active. The Ameca's current status is Active. Check with Engineered Arts for the latest availability details.
Ameca: Strengths & Trade-offs
Engineering compromises and where this research robot excels
What the Ameca does well
Solid sensor coverage
The Ameca integrates 4 sensor types, providing good perceptual coverage for its intended applications. This sensor complement covers the essential modalities needed for effective research operation while keeping complexity manageable.
Broad capability set
With 7 distinct capabilities, the Ameca is designed as a versatile platform rather than a single-task device. This breadth means the robot can handle varied scenarios and workflows, reducing the need for multiple specialized robots and increasing its utility across different situations.
What to consider carefully
Undisclosed pricing
Engineered Arts has not published a public price for the Ameca. While common for enterprise-class robotics, the absence of transparent pricing can complicate budgeting and comparison shopping. Prospective buyers will need to engage directly with the manufacturer for quotes, which may vary by configuration and volume.
Note: This strengths and trade-offs assessment is based on the Ameca's documented specifications as tracked in the ui44 database. Real-world performance depends on deployment conditions, firmware maturity, and environmental factors. For the most current information, check the Engineered Arts manufacturer page or visit the official product page. Use the comparison tool to evaluate these trade-offs against competing robots in the same category.
How Research Robot Technology Works
Understanding the engineering behind this category
Research robots serve a fundamentally different purpose than commercial or consumer models. They are platforms for discovery — enabling scientists and engineers to test theories, develop algorithms, and push the boundaries of what robots can do. The technology in research robots prioritizes openness, flexibility, and access to raw data over consumer-friendly packaging or commercial reliability. Understanding this distinction is important for anyone considering a research robot platform.
Navigation & Mobility
Research robots typically expose their navigation systems at a much lower level than commercial products. Researchers can access raw sensor data, modify SLAM algorithms, implement custom path planners, and test novel navigation approaches. ROS (Robot Operating System) and ROS 2 compatibility is standard, providing a common framework for sharing navigation modules across the research community. This openness enables rapid iteration — a researcher can swap between different SLAM implementations, test new obstacle avoidance strategies, or develop entirely novel navigation paradigms without being locked into a vendor's proprietary stack.
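The swappable-module idea can be illustrated with a minimal plugin registry. This is a generic Python sketch of the pattern, not ROS code: the registry, the `localize` helper, and the toy "SLAM" backends are all invented for illustration.

```python
from typing import Callable, Dict, List, Tuple

Scan = List[Tuple[float, float]]

# Invented registry mapping config-selectable names to backends,
# mirroring how research stacks swap navigation modules by name.
SLAM_REGISTRY: Dict[str, Callable[[Scan], Tuple[float, float]]] = {}

def register_slam(name: str):
    """Decorator that registers a localization backend under a name."""
    def wrap(fn):
        SLAM_REGISTRY[name] = fn
        return fn
    return wrap

@register_slam("grid")
def grid_slam(scans: Scan) -> Tuple[float, float]:
    # Toy pose estimate: the scan centroid stands in for a real grid SLAM.
    xs = [p[0] for p in scans]
    ys = [p[1] for p in scans]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

@register_slam("scan_match")
def scan_match_slam(scans: Scan) -> Tuple[float, float]:
    # Toy alternative backend: a first-point anchor stands in for matching.
    return scans[0]

def localize(backend: str, scans: Scan) -> Tuple[float, float]:
    """Select a backend by name, as a vendor-neutral config might."""
    return SLAM_REGISTRY[backend](scans)

scans = [(0.0, 0.0), (2.0, 0.0), (1.0, 3.0)]
print(localize("grid", scans))        # (1.0, 1.0)
print(localize("scan_match", scans))  # (0.0, 0.0)
```

Swapping implementations is then a one-line config change rather than a code rewrite, which is exactly the iteration speed the paragraph above describes.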
The Role of AI
Research robots serve as physical testbeds for AI algorithms that may eventually appear in commercial products years later. Reinforcement learning, imitation learning, few-shot task learning, and human-robot interaction studies all require robot platforms that can execute AI-generated commands in the physical world. The gap between simulation (where training is cheap and fast) and reality (where physics is unforgiving) makes physical robot platforms essential for validating AI approaches. Research robots must support rapid deployment of new AI models without extensive integration work.
Sensor Fusion & Perception
Research platforms prioritize sensor modularity and data access. Standard mounting interfaces allow researchers to attach custom sensors alongside built-in ones. Raw sensor data streams (not just processed results) are accessible for developing novel perception algorithms. Precise time-stamping and synchronization across sensor streams enable accurate multi-modal fusion research. Many research robots include more sensors than strictly necessary for any single application, providing researchers with rich datasets for developing and testing new algorithms.
Power & Battery Management
Research robots balance operational runtime with practical lab use. Sessions of one to four hours are typical, with quick charging between experiments. Some research setups use tethered power for long-running experiments where battery limitations would interrupt data collection. Power monitoring and logging capabilities help researchers understand the energy costs of different behaviors and algorithms — important for developing efficient approaches that will eventually run on battery-constrained commercial systems.
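The power logging described above reduces to integrating sampled power draw over time. A minimal sketch assuming uniformly spaced samples; the function and the 50 W figure are illustrative, not from any Ameca specification.

```python
def energy_wh(power_samples_w, dt_s):
    """Integrate uniformly spaced power samples (watts) into watt-hours.

    Toy rectangular integration; real platforms log pack voltage and
    current, but the bookkeeping is the same idea.
    """
    return sum(power_samples_w) * dt_s / 3600.0

# One minute of a hypothetical 50 W behavior, sampled once per second:
print(round(energy_wh([50.0] * 60, dt_s=1.0), 3))  # 0.833 Wh
```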
Safety by Design
Research environments present unique safety challenges because robots are constantly being programmed with untested behaviors. Hardware safety limits (joint speed caps, force limits, emergency stops) must be robust regardless of software commands. Safety-rated monitored stop and speed monitoring ensure the robot cannot exceed safe operating parameters even when running experimental code. Collaborative operation standards apply when researchers work alongside the robot during experiments. Many labs implement layered safety with physical barriers for high-speed testing and open-area operation restricted to validated, lower-risk behaviors.
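The principle that hardware limits must hold regardless of software commands can be sketched as a clamp layer sitting below the experimental stack. The limit values and names here are invented for illustration; real robots enforce this in firmware or a safety PLC, not in application Python.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class JointLimits:
    """Hardware-side caps that hold no matter what experiment code commands."""
    max_speed: float   # rad/s
    max_torque: float  # N*m

def enforce(cmd_speed: float, cmd_torque: float,
            lim: JointLimits, estop: bool):
    """Clamp commands into the safety envelope; an e-stop zeroes everything.

    Experimental code upstream can request anything; nothing outside the
    envelope ever reaches the actuators.
    """
    if estop:
        return 0.0, 0.0
    speed = max(-lim.max_speed, min(cmd_speed, lim.max_speed))
    torque = max(-lim.max_torque, min(cmd_torque, lim.max_torque))
    return speed, torque

lim = JointLimits(max_speed=1.5, max_torque=10.0)
print(enforce(4.0, 25.0, lim, estop=False))  # (1.5, 10.0)
print(enforce(4.0, 25.0, lim, estop=True))   # (0.0, 0.0)
```

Because the clamp is stateless and sits at the bottom of the stack, a bug in untested experimental code can degrade behavior but cannot exceed the envelope.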
What's Next for Research Robots
Research robot platforms are becoming more accessible and capable. Cloud robotics enables remote experiment execution and shared datasets. Digital twins and high-fidelity simulators reduce the need for physical hardware time while improving sim-to-real transfer. Standardized benchmarks and open datasets enable fair comparison of results across labs. The democratization of robotics research — through lower-cost platforms, open-source software, and cloud infrastructure — is expanding who can contribute to advancing the field.
The Ameca by Engineered Arts incorporates many of these technology pillars. For a detailed look at the specific sensors and components used in the Ameca, see the sensor analysis and connectivity sections above, or browse the complete components glossary for explanations of every technology used across the robotics industry.
Ameca in the Research Market
How this robot compares in the research landscape
Engineered Arts has not publicly disclosed pricing for the Ameca, which is typical for enterprise-focused robotics platforms that offer customized solutions and direct-sales relationships.
The Ameca's 4 sensor types provide solid perceptual coverage for its intended use cases. This mid-range sensor suite balances cost with capability, covering the essential modalities needed for research applications.
Being currently available for purchase gives the Ameca a practical advantage over competitors still in development or prototype stages. Buyers can evaluate the actual product rather than relying on spec-sheet promises that may change before release.
Head-to-Head Comparisons
Side-by-side specs, capability overlap analysis, and key differentiators.
For the full picture of Engineered Arts' portfolio and market strategy, visit the Engineered Arts manufacturer page.
Owning the Ameca: Setup, Maintenance & Tips
Practical guide from day one through years of ownership
Initial Setup
Research robot setup combines hardware assembly with software environment configuration. Unpack and assemble the platform following the manufacturer's documentation. Install the development framework — typically ROS or ROS 2 — and verify sensor connectivity. Calibrate all sensors using the manufacturer's tools and procedures. Set up the simulation environment (Gazebo, Isaac Sim, or equivalent) alongside the physical platform for parallel development. Establish version control for your experiment code and configuration. Document the initial calibration values and system state as your baseline for future reference. Plan network and computing infrastructure to handle the data rates your sensors will generate.
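Documenting the initial calibration and system state can be as simple as writing a structured baseline file. A minimal sketch with illustrative field names (this is not a standard format, and the calibration values are made up):

```python
import json
import os
import platform
import sys
import tempfile
import time

def baseline_record(calibration: dict, path: str) -> dict:
    """Write initial calibration values plus environment info as a JSON baseline.

    Captures enough context (interpreter, OS, timestamp) that a future lab
    member can tell what state the platform was in on day one.
    """
    record = {
        "recorded_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "python": sys.version.split()[0],
        "platform": platform.platform(),
        "calibration": calibration,
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2, sort_keys=True)
    return record

path = os.path.join(tempfile.gettempdir(), "baseline.json")
rec = baseline_record(
    {"camera_fx": 615.2, "camera_fy": 614.8, "imu_gyro_bias": [0.001, -0.002, 0.0]},
    path,
)
print(sorted(rec["calibration"]))  # stable key order for later diffing
```

Checking the file into version control alongside experiment code makes later calibration drift detectable by a simple diff.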
Ongoing Maintenance
Research robots need maintenance that preserves the precision required for valid experimental results. Regularly verify sensor calibration — drift in camera intrinsics or IMU biases can invalidate experiment data. Maintain clean workspace conditions to protect optical sensors. Document any hardware modifications or maintenance performed, as these can affect experimental reproducibility. Update software dependencies carefully, documenting versions used for each experiment. Joint and actuator wear in research robots that perform repetitive tasks should be monitored and factored into experimental design.
Software Updates & Long-Term Support
Research robot software updates require careful management to maintain experiment reproducibility. Document the exact software versions used for each experiment. Test updates in a separate environment before applying to your experiment platform. Contribute bug fixes and improvements back to the community when using open-source frameworks. Be aware that ROS and other framework updates may require code changes in your custom packages — budget time for integration testing after major framework updates.
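Recording exact versions per experiment can be reduced to a content-addressed manifest: hash the dependency versions and configuration so any later run can be checked against the original. A sketch with invented version strings and config keys:

```python
import hashlib
import json

def experiment_manifest(deps: dict, config: dict) -> str:
    """Return a short, stable ID for one experiment's software environment.

    Serializing with sorted keys makes the hash deterministic: identical
    versions and config always yield the same tag, and any drift changes it.
    """
    payload = json.dumps({"deps": deps, "config": config}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

tag = experiment_manifest(
    {"ros2": "humble", "opencv": "4.9.0"},   # example pinned versions
    {"controller": "pid", "rate_hz": 100},   # example experiment config
)
print(tag)  # same inputs always reproduce the same tag
```

Storing the tag with each dataset lets you verify, months later, whether two runs really used the same software environment.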
Maximizing Longevity
Research robots often have longer productive lives than commercial products because they can be upgraded and repurposed. Extend your investment by maintaining clean mechanical and electrical systems, documenting all modifications for future lab members, and keeping spare parts for common wear items. When specific components become obsolete, community forums and lab networks can be valuable sources for replacements. Consider the platform's modularity when planning future research directions — a platform that can accept new sensors and actuators adapts to evolving research questions.
For Engineered Arts-specific support resources and documentation, visit the Engineered Arts page on ui44 or check the manufacturer's official website via Engineered Arts' product page.
Frequently Asked Questions
What is the Ameca?
How much does the Ameca cost?
Is the Ameca available to buy?
What sensors does the Ameca have?
What AI does the Ameca use?
How does the Ameca compare to the ASIMO?
Does the Ameca work with smart home systems?
How current is the Ameca data on ui44?
Data Integrity
All Ameca data on ui44 is verified against official Engineered Arts sources, including spec sheets, product pages, and press releases. Last verified: 2026-02-23. Official source: Engineered Arts product page. If you find outdated or incorrect information, please let us know — accuracy is our top priority.
Explore More on ui44
Manufacturer
Category
Explore more research robots
See how the Ameca stacks up — compare specs, browse the research category, or search the full database.