A robot with no way to sense its position or environment is simply an automaton that performs movements blindly. That is changing, thanks to a trend toward adding vision, torque, and other sensors that make robots more aware of their surroundings. Companies seeking greater productivity, throughput, safety, and quality are driving a surge in automation, smart manufacturing, and robotization. Let’s take a look at some of the technologies and key companies in this space.
One of the most fundamental sensors for a robot reports the position of each of its joints. This feedback lets the control system calculate, via forward and inverse kinematics, the movement of each joint of the arm and of the end-of-arm tool (EoAT) in 3D space. Combined with precise actuators and controllers, position sensors enable highly repeatable, accurate motion. Advances in MEMS sensor technology have further improved positioning precision by accurately sensing tilt, rotation, acceleration, shock, and vibration.
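As a concrete illustration of how joint feedback becomes a tool position, the sketch below computes the EoAT location of a simple two-link planar arm from its joint encoder angles using forward kinematics. The link lengths are arbitrary placeholder values, not taken from any particular robot.

```python
import math

def forward_kinematics(theta1, theta2, l1=0.4, l2=0.3):
    """Compute the (x, y) position of a 2-link planar arm's
    end-of-arm tool from joint encoder angles (radians).
    Link lengths l1 and l2 are illustrative, in metres."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# With both joints at zero, the arm is fully extended along the x-axis.
x, y = forward_kinematics(0.0, 0.0)
```

Inverse kinematics runs the same geometry in reverse: given a desired EoAT position, the controller solves for the joint angles that achieve it, then uses the position sensors to verify each joint reached its target.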
Force/torque sensors give robots a sense of touch. They detect the amount of force being exerted, providing a level of awareness that keeps the robot, the product, and the EoAT safe. These sensors provide feedback that enables robots to perform a wider range of sensitive tasks with precise control. They also aid in robot maintenance by detecting wear in the joints and tooling.
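The simplest use of force feedback is a contact-force limit: motion halts the moment a reading exceeds a safe threshold. The sketch below shows the idea with simulated samples; the 25 N limit is an illustrative value, not a standards-derived one.

```python
def safe_to_continue(force_newtons, limit_newtons=25.0):
    """Return False when measured contact force exceeds the limit,
    signalling the controller to halt motion. The 25 N default is
    an illustrative value only."""
    return force_newtons <= limit_newtons

# Simulated force/torque samples (N); the last one trips the limit.
readings = [3.1, 4.0, 4.2, 31.7]
for f in readings:
    if not safe_to_continue(f):
        print(f"Stop: contact force {f} N exceeds limit")
        break
```

A real controller would also track the force trend over time, since a slow rise in friction torque at a joint is an early sign of mechanical wear.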
Robots, especially collaborative robots (cobots), can work safely near humans and other robots if they can sense the close proximity of a person or object and react rapidly to stop movement. Researchers in South Korea have created a proximity sensor that measures impedance. The sensor creates a wide-angle magnetic field and senses changes in that field to detect nearby objects. The sensors have been commercialized and are in use on the Universal Robots UR10 and the Neuromeka Indy 7.
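The detection principle can be sketched as a drift check against a calibrated baseline: when an object enters the field, the measured impedance shifts, and the controller commands a stop. The threshold and units below are illustrative; a real sensor's interface would come from its datasheet.

```python
def object_nearby(baseline_impedance, measured_impedance, threshold=0.05):
    """Flag a nearby object when the measured field impedance drifts
    more than `threshold` (fractional change) from the no-object
    baseline. Values are illustrative placeholders."""
    drift = abs(measured_impedance - baseline_impedance) / baseline_impedance
    return drift > threshold

def cobot_step(baseline, measurement):
    """Return the motion command for one control cycle:
    'stop' if something is in the field, else 'continue'."""
    return "stop" if object_nearby(baseline, measurement) else "continue"
```

Because the check runs every control cycle, the stop latency is bounded by the loop rate rather than by any higher-level planning.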
Even with all of the sensors mentioned above, a robot is still working in the dark unless it has a vision system for visible light or other parts of the spectrum. Visible-light vision is well suited to pick-and-place tasks, inspection and quality checks, detecting the presence of an item for processing, sensing a nearby human or robot, assembling parts, precision farming, customer service, autonomous mowers and vacuums, and personal care robots. Intel RealSense cameras, for example, add depth perception, allowing robots to see and understand the world in 3D.
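Depth perception turns each pixel of a depth image into a 3D point. The sketch below uses the standard pinhole camera model for that conversion; the intrinsic parameters (fx, fy, cx, cy) come from camera calibration, and the numbers shown are generic placeholders, not values from any specific RealSense model.

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a depth-image pixel (u, v) with depth in metres into
    a 3D point in the camera frame via the pinhole camera model.
    fx/fy are focal lengths in pixels; (cx, cy) is the principal point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# A pixel at the principal point maps straight down the optical axis.
point = deproject(320, 240, 1.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
# → (0.0, 0.0, 1.5)
```

Running this over every valid pixel yields the point cloud that grasp planners and obstacle detectors consume.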
Light detection and ranging (lidar) sensors are becoming more popular as their capabilities increase and their costs decrease. They are useful for mapping the 3D space around a robot and are now widely used to help mobile robots navigate autonomously. Velodyne, the company that invented 3D lidar, has released new sensors for applications ranging from mobile warehouse robots to autonomous cars and drones, with higher resolution for identifying objects and greater range for detecting objects farther away. This is especially useful for the new generation of autonomous cars and trucks starting to appear on our roads. Lightweight, low-cost lidar sensors are also increasingly used on drones to create 3D maps of vegetation, terrain, mining sites, and more.
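At its core, a lidar scan is a list of range measurements at known beam angles. The sketch below converts a planar scan into Cartesian points in the sensor frame, which is the first step in building a map or an obstacle grid. The field layout mirrors a typical scan message but is a generic assumption, not tied to any particular sensor's driver.

```python
import math

def scan_to_points(ranges_m, angle_min, angle_increment):
    """Convert a planar lidar scan (one range per beam, in metres)
    into 2D Cartesian points in the sensor frame. angle_min is the
    angle of the first beam; angle_increment is the step between beams."""
    points = []
    for i, r in enumerate(ranges_m):
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

A mapping pipeline then transforms these points by the robot's estimated pose and accumulates them over time into a consistent 2D or 3D map.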
A plethora of sensors
SICK AG, based in Germany, offers a broad portfolio of sensors that includes capacitive and magnetic proximity sensors, lidar, and vision systems, as well as sensors that detect distance, dust, fluids, gas, inertia, motor position and speed, and even traffic. Sensors from other companies discern airflow, electric current, displacement, heat, time, humidity, infrared light, magnetism, position, pressure, proximity, and temperature.
Integrated sensor arrays
TDK InvenSense released a development kit that includes an InvenSense inertial measurement unit (IMU), a capacitive barometric pressure sensor, and a multimode digital microphone. The kit also includes Chirp ultrasonic time-of-flight (ToF) sensors, a Micronas motor controller, an angle sensor, and pressure sensors.
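Kits like this invite sensor fusion: an IMU's gyroscope drifts over time while its accelerometer is noisy but drift-free, so a common first step is to blend the two. The complementary filter below is a minimal, generic sketch of that idea, not code for any specific TDK part.

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse a gyroscope rate (rad/s) with an accelerometer tilt
    estimate to track pitch angle. alpha weights fast but drifting
    gyro integration against the slow but absolute accel reading."""
    accel_angle = math.atan2(accel_x, accel_z)  # tilt inferred from gravity
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Each control cycle feeds the previous estimate back in, so the gyro dominates short-term response while the accelerometer continuously pulls the estimate back toward the true tilt.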
The future of robot sensors
Expect to see more and varied sensors incorporated into robotic systems, giving them better awareness of the world along with greater safety, productivity, and the ability to perform an ever-increasing array of tasks. As with most electronic systems, more sensors will be combined into modules for specific use cases to simplify the integration of sensor data, reduce costs, meet weight and power requirements, and ease installation and maintenance. The future holds robots that are safer, more aware, more capable, and more user friendly.