Sensors for Robotics 2023-2043 : Technologies, Markets, and Forecasts
- Acronyms
AGV: Automated guided vehicles
AMR: Autonomous mobile robots
CCD: Charge-coupled device
CMOS: Complementary metal oxide semiconductor
Cobot: Collaborative robot
CPS: Capacitive proximity sensors
DVL: Doppler velocity log
EKF: Extended Kalman filter
EoAT: End-of-arm tooling
FLS: Forward-looking sonar
FoV: Field of view
GPS: Global positioning system
HRI: Human-robot interaction
IFM: Intelligent flying machines
IMU: Inertial measurement unit
IoT: Internet of things
IR: Infrared
LBL: Long baseline
LCTF: Liquid-crystal tunable filter
LiDAR: Light detection and ranging
MEMS: Microelectromechanical system
MWIR: Mid-wave infrared
NIR: Near infrared
OEM: Original equipment manufacturer
PAC: Perimeter access control
Radar: Radio detection and ranging
ROI: Return on investment
RPAS: Remotely piloted aircraft system
RTK: Real-time kinematics
SLAM: Simultaneous localization and mapping
SMD: Surface mount device
SONAR: Sound navigation and ranging
ToF: Time of flight
UAV: Unmanned aerial vehicles
UGV: Unmanned ground vehicle
VRT: Variable rate technology
VTOL: Vertical take-off and landing
- Overview of the report
Sensors in robots are used for a variety of tasks, ranging from measuring force and detecting objects to navigation, localization, collision detection, and mapping. With recent advances in sensor technologies and software, many sensors can serve multiple purposes. For instance, cameras paired with computer vision systems can be used for collision detection as well as for navigation and localization. The chart below summarizes the commonly used sensors split by application. This report groups the tasks into four main themes: navigation and localization, collision and proximity detection, force and torque measurement, and others. It is worth noting that this report mainly focuses on sensors equipped directly on robots and largely ignores sensors used inside components (e.g., servo motors, controllers, etc.), such as current sensors and optical encoders.
- Are 3D sensors getting increasingly popular or heading nowhere? (2)
The data gathered by 3D sensors typically have lower resolution than data from conventional 2D sensors such as cameras. In the case of LiDAR, a standard sensor discretizes the vertical space into lines (the number of lines varies), each containing several hundred detection points. This produces approximately 1,000 times fewer data points than a standard HD picture contains, meaning that resolution can be significantly compromised. Furthermore, the further away an object is, the fewer samples land on it, so the difficulty of detecting objects rises steeply with distance from the sensor.
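The order of magnitude cited above can be sanity-checked with simple arithmetic. The scanner geometry below (16 lines, 200 points per line) is an illustrative assumption, not a specification from this report; real sensors vary widely:

```python
# Illustrative comparison of LiDAR scan density vs. a full-HD camera frame.
# The scanner geometry below is an assumed example, not a real sensor spec.
lidar_lines = 16          # vertical channels (varies by sensor)
points_per_line = 200     # detection points per line ("several hundred")
lidar_points = lidar_lines * points_per_line   # 3,200 points per scan

hd_pixels = 1920 * 1080   # ~2.07 million pixels per full-HD frame

ratio = hd_pixels / lidar_points
print(f"LiDAR points per scan: {lidar_points}")
print(f"HD pixels per frame:   {hd_pixels}")
print(f"~{ratio:.0f}x fewer data points per LiDAR scan")
```

With these assumed values the ratio is in the hundreds; with fewer lines or sparser lines it approaches the ~1,000x figure quoted above, so the exact multiple depends heavily on the sensor chosen.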
- Company Profile Access – IDTechEx Online Portal
The purchase of this report provides access to a selection of relevant company profiles available on the IDTechEx portal. Further profiles and updates are available through the IDTechEx subscription service. Please email research@idtechex.com for more information.
− Aidin Robotics
− Airskin
− Anybotics
− Audite Robotics
− ClearPath Robotics
− Clearview Imaging
− Ecovacs
− F&P Personal Robotics
− Franka Emika
− Inivation
− Interlink Electronics
− LuxAI
− Mov.ai
− Neura Robotics
− Omron
− OnRobot
− Pal Robotics
− Peratech
− Qineto
− Robotnik
− SICK
− Tacterion
− TE Connectivity
− Techman Robot
− Universal Robots
− Velodyne
− VitiBot
− Vitirover
− Yujin Robot
- Typical sensors used for robots
As a simple categorization, the sensors used in robots can be divided into two categories: proprioceptive and exteroceptive. Internal data such as joint speed, torque, position, and force are measured by proprioceptive sensors (such as motor encoders and gyroscopes). These sensors are typically used for robotic control. Exteroceptive sensors collect information about the robot's surroundings and sense environmental parameters, such as the distance and speed of a moving or stationary object, light intensity, temperature, chemicals, and more. This type may include tactile sensors, force and torque sensors, proximity sensors, range sensors, vision sensors, and others, used for robot guidance, obstacle identification, monitoring, etc. Exteroceptive sensing can be further categorized as extrinsic or intrinsic. More details can be found in the chart on the right.
- Sensors by applications
Although different sensors are typically used together to conduct certain tasks, it is worth classifying these sensors based on their use cases and application scenarios. Notably, there is no universal classification for this. In this report, IDTechEx specifically focuses on robotic sensors based on the following applications.
- Navigation and mapping sensors
Autonomy, as one of the key features of robots, refers to the ability of machines/robots to work independently without human control or intervention. Autonomy covers many concepts, including autonomous mobility, autonomously identifying and manipulating objects, and many others. In recent years, autonomous mobility has gained significant momentum, particularly in the area of autonomous mobile robots (AMRs), automated guided vehicles (AGVs), unmanned aerial vehicles (UAVs), and some self-driving agricultural robots.
Although these robots are utilized for different tasks and purposes, they all need a robust autonomous mobility system. Autonomous mobility requires navigation, localization, and mapping. With the increasing demand for autonomous mobility, navigation, localization, and mapping sensors are becoming increasingly important. Typical navigation and mapping sensors include cameras (2D RGB cameras or 3D stereo cameras), ultrasonic sensors, LiDAR, radar, GPS, and IMUs.
- The emergence of 3D cameras/3D robotic vision
The use of 3D automated vision in robotic work cells is on the rise. This technology allows a robot to recognize an object's position, size, depth, and color. Using vision components, industries such as logistics, food processing, life science, and manufacturing are exploring ways to automate their processes.
It is worth noting that there is no "one size fits all" solution because the integration of vision sensors depends on a number of factors, including application, equipment, product, environment/workspace, and budget. Therefore, IDTechEx believes that there is no 'standard solution' when it comes to setting up real-time 3D imaging in a robotic system. However, there are indeed several standard techniques, although in reality they all need to be tailored to specific tasks. These techniques are introduced as follows:
Laser triangulation – a laser scanner sweeps a light beam across the object being scanned. As the object passes through the laser line, a camera positioned at a known angle captures an image of the line, distorted by the object's profile. More details are explained on this slide.
Structured light – a projector casts a pattern of thin bands of light onto an object. Cameras at different angles observe how the pattern curves over the object's surface, and this deformation is used to develop a 3D image of the object.
Time of Flight (ToF) – a camera emits light toward the object; the light reflects off the object back to the image sensor. The distance of the object is calculated from the time delay between the transmitted and the received light – more details can be found in the report.
Stereo vision – the robotic system uses two cameras to record 2D views of the same object from two different angles. The software then uses the known positions of the two cameras and compares corresponding points in the two flat images to identify the variations and produce an image with depth information. This is particularly useful in autonomous mobile robots (AMRs) and automated guided vehicles (AGVs).
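Two of the techniques above reduce to simple distance formulas: ToF derives range from the round-trip delay (d = c·Δt/2), and stereo vision derives depth from disparity under the pinhole camera model (Z = f·B/d). A minimal sketch, with all numeric values chosen purely for illustration:

```python
# Illustrative depth formulas behind ToF and stereo vision.
# All parameter values below are assumed examples, not sensor specifications.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(delay_s: float) -> float:
    """Range from a time-of-flight round trip (emit -> reflect -> receive)."""
    return C * delay_s / 2.0

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo disparity: Z = f * B / d (pinhole camera model)."""
    return focal_px * baseline_m / disparity_px

# A 20 ns round trip corresponds to roughly 3 m of range.
print(f"ToF:    {tof_distance(20e-9):.2f} m")
# 700 px focal length, 12 cm baseline, 28 px disparity -> 3 m depth.
print(f"Stereo: {stereo_depth(700, 0.12, 28):.2f} m")
```

The stereo formula also shows why a wider camera baseline improves depth accuracy at long range: for a fixed depth, a larger B produces a larger, easier-to-measure disparity.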
- Torque sensors – introduction
Force sensors play an important part in collision detection and force measurement. Force sensors can be installed in different positions on a robot (e.g., at the joints of collaborative robots, or on the grippers/end-effectors of industrial/collaborative robots) to enable them to manipulate parts with high precision and accuracy.
Force sensors can detect the force and moment (x, y, z, yaw, pitch, and roll) applied to a robot from external sources. Hence, according to FANUC, one of the biggest manufacturers of industrial robots globally, force sensors can be applied in a robotic system to control velocity and force when objects are fit, aligned, buffed, trimmed, or assembled, thereby improving product quality and process integrity. In essence, force sensors are integrated to create intelligent robots that can "feel", enabling the handling of parts with varying textures, the most demanding mechanical assembly, and material removal operations.
- Piezoresistive vs. Piezoelectric vs. Capacitive technologies
Capacitive sensors can operate over a wide range of temperatures and tolerate short-term overpressure conditions. They are relatively insensitive to temperature changes: the temperature coefficient of sensitivity of a capacitive sensor is roughly 10 times better than that of a piezoresistive pressure sensor. Meanwhile, the power consumption of capacitive sensors is low.
Piezoresistive sensors are less expensive than capacitive sensors. They are also highly resistant to pressure changes, shocks, and vibrations. They can be applied over a wide range of pressures (up to 20,000 psi), and many of these sensors with thin-film resistors are much more resistant to higher temperatures and overpressures. Piezoelectric sensors can tolerate very high temperatures (e.g., some materials can be used at up to 1,000 °C). These sensors are usually self-powered, giving them very low power consumption. Although they can be used over a wide range of pressures (e.g., 0.1 psi to 10,000 psi), their maximum pressure tolerance is lower than that of piezoresistive sensors.
- Comparison of proximity sensors
The diagram on the right compares different proximity sensors. Most proximity sensors are relatively small (footprint smaller than 40,000 mm³), with maximum sensing distances shorter than 2,000 mm (2 meters).
Ultrasonic sensors typically have the largest sensing distance of the proximity sensors, whereas capacitive proximity sensors have the shortest. The trade-off between footprint and maximum sensing distance determines the ideal application scenarios for these sensors. For instance, sensors with long sensing distances and a large footprint (e.g., ultrasonic sensors) can be widely used in tasks that need long-range detection, such as underwater robots. Conversely, sensors with relatively short detection distances and small footprints are more suitable for tasks in limited space, such as collaborative robots on production lines.
- What are industrial robots and what does the current market look like?
Industrial robot categories | Market trend
Industrial robots are generally composed of three basic parts: the main body, the driving system, and the control system. The main body comprises the base and the actuator, including the arm, the wrist, and the hand; some robots also have a walking mechanism. Most industrial robots have 3-6 degrees of freedom (DoF) of movement, of which the wrist usually has 1-3. The drive system includes a power device and a transmission mechanism; a reducer and a servo motor make the actuator produce the corresponding movements. Regarding the classification of industrial robots, IDTechEx has yet to see a globally specified standard. However, they can be divided according to load weight, control method, degree of freedom, structure, and application field. By configuration, they can be divided into multi-joint robots, rectangular coordinate robots, SCARA robots, parallel robots, and collaborative robots. The results of classification according to applications are as follows.
Japan, Germany, and Switzerland remain strong in industrial robots, because reputable industrial robot companies are mainly located in these well-developed countries. However, the emerging markets in China and South Korea are catching up very quickly. Industrial robots are also transitioning to intelligent and modular designs. With the increasing complexity of tasks, industrial robots are required to 'perceive' their environments and accurately identify and inspect complex situations, transitioning from 'pre-programming', 'on-site control', and 'remote control' to self-learning and independent working. To fulfil the requirements of these tasks, typical sensors used in industrial robotic arms include vision sensors, force and torque sensors, and photoelectric sensors, along with a few sensors that are used indirectly on the robotic arms, such as IMUs, voltage sensors, optical encoders, and many others. In this section, we will primarily focus on vision sensors (cameras), force and torque sensors, and photoelectric sensors used in industrial robotic arms.
- Sensors for AGV and AMR – overview
Mobile robots primarily refer to automated guided vehicles (AGVs), autonomous mobile robots (AMRs), and many others. As the name indicates, autonomous driving, also commonly known as autonomous mobility, is one of the key functions of mobile robots. To achieve autonomous driving, object detection, collision detection, navigation, and localization are all important.
Multiple sensors are often combined to achieve autonomous navigation, object detection, and collision avoidance. Typical sensors include cameras (e.g., RGB cameras, IR cameras, etc.), laser scanners, LiDARs, radars, force sensors, GPS, ultrasonic sensors, and many others.
The chart shows an overview of the typical sensors and typical functions needed for a fully functional mobile robot.
- Cobot – functions and typical sensors
Collaborative robots, also known as cobots, refer to robots that work side-by-side with human operators. Cobots can be used in many industries and tasks, such as material handling, picking and placing, quality inspection, assembly, and a few others.
Unlike traditional industrial robots, there is no physical separation between cobots and human operators; therefore, safety is always a priority for cobots. In order to ensure safe collaboration, proximity and collision detection sensors are commonly used. Examples include force and torque sensors, tactile sensors, and vision sensors.
Aside from safety requirements, accurate control of the cobot's position, along with the ability to control the force exerted on the object, also plays an important role. Torque and force sensors are usually equipped to measure the exerted force, while cameras and IMUs are often used to determine the position of the cobot. Cameras, together with computer vision technology, can detect the distance between the target object and the robotic arm, thereby informing the movements; IMUs are used to determine the posture of the cobot.
- Overview of the sensors in drones (1)
Drones have become increasingly popular over the past several years. Depending on the purpose, they can be equipped with a wide suite of sensors; LiDAR, thermal cameras, RGB cameras, and IMUs are just a few examples. Below is a chart showing different sensors for various applications.
- Overview of sensors for service robots
A service robot, by definition, refers to a robot that frees humans by performing useful tasks for them. Service robots can exhibit full or partial autonomy; to achieve this, they carry a series of sensors onboard.
- Cleaning robots – overview of tasks and sensors
As one of the top applications of service robotics, cleaning robots have been investigated for many years. Recently, due to COVID, cleaning robots gained a lot more momentum. The International Federation of Robotics (IFR) reports that there are at least 50 more cleaning robot manufacturers today than two years ago.
Cleaning robots refer to all robots capable of cleaning and disinfecting their surrounding environments, regardless of the method used. The methods can be categorized into two types: mechanical cleaning and non-mechanical cleaning (often referred to as killing microorganisms). Typical mechanical cleaning methods include physical wiping or scrubbing, whereas non-mechanical methods usually involve ultraviolet radiation or spraying disinfectants.
The chart on the right shows a few key criteria of cleaning robots, along with their key enabling sensors.
- Sensors in social robots – overview
Social robots are designed to interact with humans. Interaction, as one of the core features of social robots, needs the support of many sensors, including voice-detection sensors (microphones), motion-detection sensors, touch sensors, cameras, and many others.
Aside from sensors used for interaction purposes, safety sensors also play an important role. Many social robots are designed for children, whose actions are unpredictable; it is therefore crucial to ensure that potential risks and injuries can be mitigated during human-robot interaction. Safety requirements can be divided into physical safety and emotional safety. With regard to physical safety (i.e., obstacle avoidance, an emergency system if kids try to disassemble the robot, etc.), sensing and object-detection systems are usually fitted to a social robot to make sure it can perceive the environment and identify objects. The sensing systems and types of sensors equipped in mobile social robots are very similar to those used in mobile delivery robots. Typical sensing, navigation, and localization technologies include cameras, LiDAR, SLAM (simultaneous localization and mapping), obstacle avoidance, machine vision, and many others.
By contrast, emotional safety primarily requires intelligent software systems, so that robots can correctly perceive and understand users and make responses that do not make users feel emotionally uncomfortable. This report therefore does not discuss this part in detail; more information can be found in IDTechEx's latest research on Service Robots 2022-2032.
- Overview of common sensors in different applications – market size (USD billions)
By 2043, the total market size of sensors used in the robotics industry will exceed US$80 billion, with force and torque sensors for cobots and AMRs accounting for the largest share of the market.