Guide: Component intelligence

Components Glossary

Understand the technology stack inside every robot on ui44, from perception hardware to cloud-dependent voice systems. This guide turns raw component labels into buying context you can actually use.

  • Tracked: 960 directory entries
  • Families: 4 component groups
  • Guide map: 960 tracked kind instances
  • Best next step: Compare robots once you know the stack

  • Sensors: 562 components
  • Connectivity: 143 components
  • AI (Artificial Intelligence): 204 components
  • Voice Assistants: 51 components

Decode spec sheets faster

Separate meaningful component trade-offs from marketing shorthand before you shortlist anything.

Spot the right comparisons

Know when sensors, connectivity, AI, or voice features actually change the buying decision.

Build a premium shortlist

Use the glossary as a field guide, then pivot directly into the component directory or compare flow.

Why Sensors Matter

Sensor quality is arguably the single most important differentiator between a capable robot and a frustrating one. A robot vacuum with basic bump sensors will randomly bounce off furniture, missing areas and getting stuck in corners. The same robot with LiDAR, cameras, and ultrasonic sensors can build an accurate map of your home, navigate efficiently in straight lines, detect and avoid obstacles (including small objects like cables and pet toys), and clean every reachable square foot. For humanoid robots, the gap is even more dramatic — the sensor suite determines whether the robot can safely walk through a room, pick up objects, or interact with people without risk of injury. When comparing robots, pay close attention to the sensor count and types listed on each detail page. More sensors generally mean better spatial awareness, but the specific combination matters more than raw count.

How Sensors Work

Most robots use a technique called sensor fusion — combining data from multiple sensor types into a unified perception model. A robot might use LiDAR to measure distances to walls and large objects, cameras to visually identify what those objects are, ultrasonic sensors for close-range detection of transparent surfaces (which LiDAR cannot see), IMU gyroscopes to track its own orientation and movement, and infrared cliff sensors to detect drop-offs like stairs. Each sensor type has strengths and limitations; by combining them, the robot builds a richer, more reliable understanding of its environment than any single sensor could provide. The robot's onboard processor runs SLAM (Simultaneous Localization and Mapping) algorithms that use this fused sensor data to both build a map and track the robot's position within it, enabling systematic navigation rather than random movement.
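The fusion step described above can be sketched in a few lines. This is an illustrative toy, not any robot's real firmware: each sensor reports a distance estimate with a variance, and an inverse-variance weighted average combines them so the more trustworthy reading dominates. The `fuse_distances` function and the sample readings are assumptions for illustration.

```python
def fuse_distances(readings):
    """Inverse-variance weighted fusion of distance estimates.

    readings: list of (distance_m, variance) tuples from different sensors.
    A lower variance means a more trusted sensor, so it gets a higher weight.
    """
    weights = [1.0 / var for _, var in readings]
    fused = sum(d * w for (d, _), w in zip(readings, weights)) / sum(weights)
    return fused

# LiDAR is precise (low variance); ultrasonic is noisier but can see glass.
lidar = (2.00, 0.0001)
ultrasonic = (1.95, 0.01)
print(fuse_distances([lidar, ultrasonic]))  # close to the LiDAR reading
```

The fused value sits almost on top of the LiDAR estimate because its variance is 100x smaller, which is exactly why adding a noisy-but-complementary sensor does not degrade the precise one.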

Taxonomy

Types of Sensors

Designed for quick spec-sheet scanning, not filler taxonomy.

LiDAR / Laser Distance Sensors

Light Detection and Ranging sensors emit laser beams and measure the time they take to bounce back, building precise 2D or 3D maps of the environment. LiDAR provides millimeter-accurate distance measurement and works in complete darkness.
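The time-of-flight principle behind LiDAR is simple arithmetic: the pulse travels to the target and back, so distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name is ours, not a vendor API):

```python
C = 299_792_458  # speed of light in m/s

def tof_distance_m(round_trip_s):
    # The pulse covers the distance twice (out and back), hence the divide by 2.
    return C * round_trip_s / 2

# A roughly 13.3-nanosecond round trip corresponds to about 2 meters.
print(tof_distance_m(13.34e-9))
```

The nanosecond scale of those round trips is why LiDAR needs fast, specialized timing hardware rather than a general-purpose microcontroller loop.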

Cameras (RGB, Depth, Stereo, Infrared)

Vision sensors range from basic RGB cameras for visual recognition to structured-light depth cameras that measure distance to every point in the frame. Stereo cameras use two lenses to calculate depth through parallax.

Ultrasonic Sensors

Emit high-frequency sound waves and measure the echo to detect nearby objects. Particularly useful for detecting transparent surfaces like glass walls and mirrors that LiDAR beams pass through.

IMU / Gyroscope / Accelerometer

Inertial Measurement Units combine gyroscopes and accelerometers to track the robot's orientation, tilt, and movement. Essential for balance in bipedal and quadruped robots.

Infrared / Cliff Sensors

Infrared emitters and receivers detect edges and drop-offs, preventing robots from falling down stairs or off elevated surfaces. Also used for wall-following and proximity detection.

Force / Torque Sensors

Measure the forces applied to robot joints and end-effectors, enabling precise manipulation and safe physical interaction with humans. Critical for humanoid robots.

Touch / Pressure / Tactile Sensors

Detect physical contact, pressure distribution, and surface texture. Used in companion robots for responsive petting interactions and in manipulation robots for grip control.

Buyer signals

  • Look for LiDAR or structured-light sensors if navigation accuracy matters — these provide the most precise distance measurements.
  • Camera-based visual SLAM is increasingly capable and more affordable, though it can struggle in low light.
  • For outdoor robots like lawn mowers, RTK GPS provides centimeter-level positioning accuracy.
  • Check whether the robot includes cliff sensors if you have stairs, and obstacle-avoidance sensors if you have pets or children.
  • For security and companion robots, look for microphone arrays rather than single microphones.

FAQ

What is the most important sensor for a cleaning robot?

LiDAR is the most impactful sensor for cleaning robot performance. It enables systematic navigation patterns instead of random bumping, accurate mapping for room-by-room cleaning, and reliable return-to-base behavior. Camera-based navigation is a capable and more affordable alternative, though it can struggle in very dark rooms.

Do more sensors always mean a better robot?

Not necessarily. The quality of sensor integration (sensor fusion) matters more than raw sensor count. A robot with five well-integrated sensors that complement each other will outperform one with ten sensors whose data is poorly combined. That said, robots with more sensor types generally have better spatial awareness and fewer blind spots.

Can robot sensors work in the dark?

It depends on the sensor type. LiDAR and ultrasonic sensors work identically in darkness because they use their own light or sound emissions rather than ambient light. RGB cameras need light to function, though infrared cameras and structured-light depth cameras have their own illumination and work in complete darkness.

Why Connectivity Matters

Connectivity directly affects your daily experience with a robot. A robot with strong Wi-Fi connectivity delivers responsive app control, real-time status updates, and reliable remote access when you are away from home. Bluetooth enables direct device pairing without a network. Smart home integration (Alexa, Google Home, HomeKit) lets you control the robot with voice commands and include it in automated routines — for example, triggering a cleaning cycle when you leave the house.

How Connectivity Works

Most home robots maintain at least two types of connection simultaneously. A persistent Wi-Fi connection handles primary data exchange — sending status updates to the cloud, receiving commands from the app, downloading software updates, and streaming map data. Bluetooth often handles initial setup, local device pairing, and beacon-based positioning. For robots with cloud AI features, the Wi-Fi connection carries voice recordings to speech recognition servers and returns processed commands.
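The dual-connection arrangement above implies a fallback order when one link drops. A hedged sketch of that selection logic, with entirely hypothetical function names (no real robot SDK is being quoted):

```python
def send_command(command, wifi_up, ble_paired):
    """Pick a transport for a command based on which links are available."""
    if wifi_up:
        return f"cloud:{command}"   # normal path: app -> cloud -> robot
    if ble_paired:
        return f"ble:{command}"     # local fallback: app -> robot directly
    return "queued"                 # nothing reachable: store and retry later

print(send_command("start_clean", wifi_up=False, ble_paired=True))
```

This is also why Bluetooth pairing during setup matters even on a Wi-Fi robot: it gives the app a direct channel when the network or the cloud service is down.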

Taxonomy

Types of Connectivity

Designed for quick spec-sheet scanning, not filler taxonomy.

Wi-Fi (2.4 GHz / 5 GHz / Wi-Fi 6)

The primary connectivity method for most home robots. Enables app control, cloud AI features, software updates, and remote monitoring. Wi-Fi 6 provides faster speeds and better performance in congested environments.

Bluetooth / BLE

Used for initial device setup, direct phone pairing, and short-range communication. BLE is particularly important for companion robots, providing constant-on connectivity with minimal battery drain.

Cellular (4G LTE / 5G)

Network-independent connectivity for outdoor robots, delivery robots, and commercial platforms that operate beyond Wi-Fi range. 5G enables high-bandwidth, low-latency communication for real-time teleoperation.

Zigbee / Z-Wave / Matter

Low-power mesh networking protocols for smart home integration. Matter is a newer universal standard backed by Apple, Google, Amazon, and Samsung that promises cross-ecosystem compatibility.

Ethernet / USB

Wired connections used primarily in commercial and industrial robots where reliability is critical. Ethernet provides the most stable, highest-bandwidth connection.

Custom Radio / UWB / LoRa

Specialized radio protocols: UWB for centimeter-accurate indoor positioning, LoRa for long-range low-power communication, and custom radio links for boundary wire communication.

Buyer signals

  • Check that the robot supports your Wi-Fi band — many older robots only support 2.4 GHz, which can be congested in apartment buildings.
  • Dual-band (2.4 + 5 GHz) is preferable for faster, more reliable connections.
  • If smart home integration matters, verify compatibility with your specific ecosystem (Alexa, Google Home, HomeKit, SmartThings) before purchasing.
  • For outdoor robots, check whether the robot relies on Wi-Fi or has its own positioning system.
  • If privacy is a concern, look for robots that can operate with local-only connectivity without requiring a cloud account.

FAQ

Can I use a home robot without Wi-Fi?

Many robots require Wi-Fi for initial setup and cloud-dependent features, but some can operate in a limited mode without it. Basic cleaning functions often work offline once the robot is configured, but you lose app control, scheduling, mapping updates, and voice assistant integration.

Does my robot work with Alexa or Google Home?

Smart home compatibility varies by manufacturer and model. Check the connectivity section on each robot's detail page on ui44 for confirmed integrations. Most major cleaning robot brands support both Alexa and Google Assistant.

Is 2.4 GHz or 5 GHz Wi-Fi better for robots?

2.4 GHz has longer range and better wall penetration, which is important for robots that move throughout a home. 5 GHz offers faster speeds with less interference but shorter range. Dual-band robots that support both provide the best experience.

Why AI Matters

AI capability is what separates a robot from a remote-controlled appliance. Without AI, a robot can only follow pre-programmed instructions in predictable environments. With capable AI, a robot can handle the unpredictable, dynamic nature of real homes — navigating around a toy that was not there yesterday, understanding when you say 'clean the kitchen but avoid the dog bowl,' and learning that you prefer vacuuming to happen while you are at work.

How AI Works

Robot AI operates at multiple levels. At the lowest level, control algorithms manage motors and actuators in real-time, keeping the robot balanced and moving smoothly. Above that, perception AI processes sensor data — running object detection neural networks on camera feeds, interpreting LiDAR point clouds into navigable maps, and fusing data from multiple sensors into a coherent world model. Navigation AI plans efficient paths through known spaces while adapting in real-time to obstacles.
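The layering described above can be caricatured as three nested functions: perception turns raw sensor data into a model, navigation plans against that model, and control executes one step. This is a deliberately toy sketch under our own assumptions — real robots run each layer at its own rate on dedicated hardware:

```python
def perceive(raw_scan):
    # Toy "perception": flag any range reading under 0.3 m as an obstacle.
    return [r < 0.3 for r in raw_scan]

def plan(obstacles):
    # Toy "navigation": steer toward the first unblocked direction.
    for heading, blocked in enumerate(obstacles):
        if not blocked:
            return heading
    return None  # fully blocked: no valid heading

def control_step(raw_scan):
    # Toy "control": turn the plan into a single motion command.
    heading = plan(perceive(raw_scan))
    return f"turn_to:{heading}" if heading is not None else "stop"

print(control_step([0.2, 0.25, 1.5, 2.0]))  # index 2 is the first clear heading
```

The point of the layering is isolation: you can swap a better perception model into the top layer without touching the control loop underneath.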

Taxonomy

Types of AI (Artificial Intelligence)

Designed for quick spec-sheet scanning, not filler taxonomy.

Navigation AI / SLAM

The algorithms that build maps, plan paths, and navigate through environments. Modern navigation AI uses neural networks to identify furniture, room boundaries, and obstacles rather than treating everything as generic blocked areas.

Computer Vision / Object Recognition

Neural networks trained to identify objects in camera feeds — from recognizing shoes and cables on the floor to identifying specific people for personalized interactions. State-of-the-art models recognize hundreds of object types in real-time.

Natural Language Processing (NLP)

AI systems that understand spoken and written language. Modern NLP enables robots to interpret conversational commands rather than requiring specific keyword phrases. Large language models are increasingly integrated into companion robots.

Reinforcement Learning / Adaptive Behavior

AI approaches where robots learn through trial and error, improving their behavior over time. Used in locomotion, manipulation, and personalization (learning user preferences and schedules).

Edge AI Processors

Dedicated hardware for running AI models locally on the robot. Chips like NVIDIA Jetson, Google Coral TPU, and Qualcomm Robotics platforms provide real-time vision and navigation without cloud dependency.

Cloud AI Platforms

Remote server infrastructure for AI capabilities beyond onboard hardware. Cloud platforms excel at complex language understanding, large-scale learning, and model updates. The trade-off is latency and internet dependency.

Buyer signals

  • Look for robots that specify their AI platform or processor — names like NVIDIA Jetson, Qualcomm RB5, or custom AI chips indicate serious onboard processing capability.
  • Robots that depend entirely on cloud AI will be less responsive and non-functional without internet.
  • Check whether the manufacturer provides regular AI model updates, as AI capability improves significantly through software updates.
  • For privacy-sensitive applications, prioritize robots with strong edge AI that minimize data sent to the cloud.
  • Be cautious of marketing claims about 'AI-powered' features — verify what the AI actually does versus what is simply automated behavior.

FAQ

What does 'AI-powered' actually mean for a home robot?

It varies enormously by manufacturer. At the basic end, it might mean the robot uses a neural network for object detection. At the advanced end, it means large language models for conversation, computer vision for scene understanding, and adaptive algorithms that learn your preferences. Check specific AI components listed on each robot's page.

Do robots get smarter over time through updates?

Some do. Manufacturers with active software development teams push AI model updates that can genuinely improve performance — better object recognition, more efficient navigation patterns, new capabilities. This requires the manufacturer to invest in ongoing AI development.

Should I worry about privacy with AI-powered robots?

It is a reasonable concern. Robots with cameras and microphones collect sensitive data. Check whether the robot processes data locally (edge AI) or sends it to cloud servers. Look for manufacturers with clear privacy policies, data encryption, and options to delete stored data.

Why Voice Assistants Matter

Voice interaction removes the friction of app-based control. Instead of unlocking your phone, opening an app, and tapping buttons, you speak a command and the robot responds. This is particularly valuable for accessibility — elderly users, people with mobility limitations, and anyone whose hands are occupied can control their robot by voice alone.

How Voice Assistants Work

Voice assistant systems involve several processing stages. A microphone array captures audio while noise-cancellation algorithms filter out background sounds. Wake-word detection runs continuously on a low-power local processor. When triggered, the audio is processed through automatic speech recognition (ASR) to convert sound into text. Natural language understanding (NLU) interprets the meaning and intent. The robot then executes the action and generates a response through text-to-speech (TTS) synthesis.
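The stages above can be sketched as a tiny pipeline. Every stage here is a stand-in stub under our own naming — production systems run trained models for wake-word detection, ASR, and NLU — but the control flow (gate on the wake word, transcribe, extract intent, respond) mirrors the description:

```python
WAKE_WORD = "hey robot"

def asr(audio_text):
    # Stand-in ASR: pretend the audio is already transcribed, just normalize it.
    return audio_text.lower()

def nlu(text):
    # Stand-in NLU: crude keyword-based intent extraction.
    if "clean" in text:
        return {"intent": "start_cleaning"}
    return {"intent": "unknown"}

def handle(audio_text):
    text = asr(audio_text)
    if not text.startswith(WAKE_WORD):
        return None  # wake-word gate: everything else is ignored locally
    intent = nlu(text)["intent"]
    return "tts:started cleaning" if intent == "start_cleaning" else "tts:sorry"

print(handle("Hey robot, please clean the kitchen"))
```

The wake-word gate at the top is the privacy-relevant part: audio that fails it never reaches the later (often cloud-hosted) stages.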

Taxonomy

Types of Voice Assistants

Designed for quick spec-sheet scanning, not filler taxonomy.

Amazon Alexa Integration

The most widely integrated voice platform in home robotics. Enables voice control of robot functions, inclusion in Alexa routines, and access to thousands of Alexa skills.

Google Assistant Integration

Similar capabilities to Alexa, with particularly strong natural language understanding and Google ecosystem integration. Works with Google Home routines and provides access to Google's knowledge graph.

Apple Siri / HomeKit

Deep iOS integration with Siri Shortcuts for custom voice commands. HomeKit compatibility allows control through the Apple Home app. Prioritizes on-device processing for privacy.

Custom Voice AI / LLM-Powered

Purpose-built voice systems developed by the manufacturer, increasingly powered by large language models for natural, open-ended conversation optimized for the robot's specific capabilities.

Multilingual Speech Recognition

Voice systems capable of understanding and responding in multiple languages, sometimes within the same conversation. Quality varies — some offer full conversational ability while others only support basic commands.

Offline Voice Processing

Voice recognition that runs entirely on the robot's local hardware without sending audio to cloud servers. Faster response times, works without internet, and better privacy. Typically more limited in vocabulary.

Buyer signals

  • If you already use a smart home ecosystem (Echo, Nest, HomePod), choose a robot that integrates with that platform for seamless voice control and routine automation.
  • Check whether voice commands require an internet connection or work offline — this matters if your internet is unreliable.
  • For companion robots where conversation quality matters, look for LLM-powered voice systems rather than simple command-response engines.
  • Multi-language support is important if your household speaks multiple languages.
  • Check whether the robot has a far-field microphone array (can hear you from across the room) versus a close-range microphone.

FAQ

Can I control any robot with voice commands?

Not all robots support voice control. Check the voice assistant components listed on each robot's detail page on ui44. Robots with Alexa, Google Assistant, or Siri integration support voice commands through those platforms.

Which voice assistant is best for robot control?

Amazon Alexa and Google Assistant are the most widely supported and offer the best smart home integration. Apple Siri/HomeKit is less common but provides the most privacy-focused approach. Custom LLM-powered systems offer the most natural conversation but lack ecosystem integration.

Do voice-controlled robots always listen?

Robots with wake-word detection have microphones that are always processing audio locally to detect the trigger phrase. Audio is only sent to cloud servers after the wake word is detected. Many robots have physical mute buttons that electrically disconnect the microphone.

How Components Shape Robot Performance

Integration matters more than isolated specs

The integration principle

A robot's capability is not the sum of its parts; it is the quality of the system design. A strong component on paper can still underperform when the surrounding software, tuning, or connectivity stack is weak.

  1. Start with perception: navigation quality and spatial awareness begin with the sensor stack, not the marketing tagline.
  2. Stress-test connectivity: reliable control, remote alerts, and smart-home behavior depend on a stable connectivity stack.
  3. Check the intelligence layer: AI quality determines whether the hardware turns into smooth autonomy or a frustrating spec-sheet mismatch.
  4. Only then evaluate voice: voice and assistant layers matter most after core autonomy, sensing, and reliability are already credible.

Continue exploring

Turn glossary knowledge into live product research.

Use the directory for component-level detail pages, open the 30-day trends view to see which technologies are gaining traction, and move into compare once you know which stacks deserve side-by-side scrutiny.

Research posture

  • Prioritize the component families that change safety, navigation, or reliability first.
  • Use component pages to validate whether a brand's claims map to real tracked hardware.
  • Use compare for the final shortlist, once the glossary gives you the right lens.