The future of sensor technology: emerging trends and innovations
Modern devices fitted with sensors eliminate the need to push buttons manually, operating autonomously based on sensor inputs.
Sensors are integral to almost every electronic device that assists humans in day-to-day activities. They can monitor vital human body signs and detect abnormalities. In automobiles, they can help with autonomous driving by recognizing traffic signs, detecting obstacles, and warning of lane departures. In the future, sensors will enable simple user interactions with all smart devices. This article gives insights into emerging trends and futuristic applications of sensors in areas like automotive, healthcare, and industrial robotics, and into their advancements in operation and safety.
Sensor perception, like natural human senses
Designers create sensing solutions to simplify human interaction with devices. Different sensors are combined with state-of-the-art software to create a picture of the real world. By fusing several smart sensors into one intelligent system, sensing solutions enable people to interact with devices effortlessly.
Figure 1: Sensing solutions mimicking human senses
Smart ear/microphone
Human ears can distinguish up to 400,000 sounds, ten octaves, and 7,000 tone pitches. MEMS microphones give smart ears to audio and voice-controlled devices. The new generation of microphones features ultra-low self-noise (high SNR), extremely low distortion (THD) even at high sound pressure levels (SPL), very tight part-to-part phase and sensitivity matching, a flat frequency response with a low low-frequency roll-off (LFRO), and ultra-low group delay. These microphones offer selectable power modes and come in tiny packages.
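As a rough illustration of how these data-sheet figures relate, the sketch below converts an SNR value into an equivalent noise floor and dynamic range, assuming the common convention of a 94 dB SPL, 1 kHz reference tone; the specific numbers used are hypothetical, not taken from any particular microphone.

```python
# Back-of-the-envelope helpers, assuming the common convention that MEMS
# microphone SNR is specified against a 94 dB SPL, 1 kHz reference tone.
REFERENCE_SPL_DB = 94.0

def equivalent_input_noise(snr_db: float) -> float:
    """Microphone noise floor expressed in dB SPL."""
    return REFERENCE_SPL_DB - snr_db

def dynamic_range(snr_db: float, acoustic_overload_point_db: float) -> float:
    """Usable range between the noise floor and the acoustic overload point."""
    return acoustic_overload_point_db - equivalent_input_noise(snr_db)

# Hypothetical example: a 70 dB SNR microphone with a 130 dB SPL overload point
# has a 24 dB SPL noise floor and about 106 dB of usable dynamic range.
print(equivalent_input_noise(70.0), dynamic_range(70.0, 130.0))
```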
Nose/CO2 sensors
The human nose has over 400 receptors that discern more than 10,000 fragrances and odors. Carbon dioxide sensors based on photoacoustic spectroscopy (PAS) technology deliver a compact, fully functional device, overcoming the challenges of existing CO2 gas detectors. The PAS CO2 sensor module integrates a PAS transducer, a microcontroller, and a MOSFET on the PCB.
Eyes (ToF 3D image sensors/radar sensors)
Time-of-flight (ToF) image sensors enable electronic devices to acquire an accurate 3D map of the scene in front of the device. The surroundings, objects, and people are transformed into the digital space in real time. Algorithms use that data to measure distances and sizes, track motion, and convert objects' shapes into 3D models. Designers can seamlessly integrate the most miniature 3D ToF camera modules into their products, accurately measuring short- and long-range depth with minimal power consumption.
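As a simple illustration of the principle behind direct ToF measurement (not any particular sensor's interface), the distance to a target follows from the round-trip time of a light pulse:

```python
# Illustrative only: the basic direct time-of-flight relationship.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Distance to a target from the measured round-trip time of a light pulse."""
    # The pulse travels to the target and back, so halve the total path.
    return SPEED_OF_LIGHT * t_round_trip_s / 2.0

# A round trip of roughly 6.67 ns corresponds to a target about 1 m away.
print(f"{distance_from_round_trip(6.67e-9):.3f} m")
```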
Radar sensors offer many advantages over passive infrared (PIR) technology in motion detection applications, including greater accuracy and more precise measurement of detected objects, paving the way for new speed detection and motion sensing capabilities. They can operate in harsh environmental conditions such as rain, snow, fog, and dust. Advanced models are sensitive enough to capture breathing and heartbeat, as the radar can detect the presence of vital signs.
Touch (touch sensors)
Touch sensors use several sensing methods, such as resistive film, capacitive, infrared, ultrasonic surface acoustic wave, and electromagnetic induction. A touchscreen is the assembly of an input device (the touch panel) and an output device (the display). Discover the sensors Farnell offers, tailored to meet different requirements and specifications: Microphones, Flow sensors, ToF 3D image sensors, Radar sensors, and Touch sensors.
Automotive sensors for reliable mobility
Manufacturers typically equip modern cars with an array of sensors. These sensors gather information and display it on the dashboard, such as tire pressure, fuel level, and engine temperature. Other sensors maintain vehicle efficiency by monitoring the position of motor components, wheel speed for traction control or antilock braking systems, and internal and external air temperature to keep the interior comfortable, among many other parameters. Manufacturers can incorporate several sensors into their vehicle designs to make their cars safer, more reliable, more efficient, and more comfortable.
Today’s sensors also provide increasingly detailed information. The vision-based technologies used for automated driving solutions require high bandwidth and fast response times for safety and responsiveness. Modern automobiles incorporate several new technologies for smarter mobility, including:
LiDAR (Light Detection and Ranging)
LiDAR scanners are critical components in prototype systems for autonomous vehicles and in current systems for adaptive cruise control (ACC), collision avoidance, traffic sign recognition, blind spot detection, and lane departure warning. LiDAR-based systems cannot function without their sensors; they are the "eyes" of the system.
Figure 2: Application of LiDAR sensors in cars
LiDAR technology requires independent sensor systems equipped with safety and environmental features, as shown in Figure 2. For example, units should be rated for operating temperatures from -40 to 125 °C (-40 to 257 °F) to withstand the heat of other system components and the external environment. Sensors must possess a sufficiently high signal-to-noise ratio to detect the return signal against any background interference. Since optical detectors must handle varying environmental light levels, the sensors within these detectors should possess a wide dynamic range. To learn more about LiDAR technology, click here.
Product category: LiDAR sensors
Sensors realize autonomous navigation in electric vertical takeoff and landing (eVTOL) aircraft
Battery-powered eVTOLs are a new class of aircraft, as shown in Figure 3. They are a cross between an electric passenger vehicle and an oversized drone, with an interior like a luxury automobile. Aviation experts expect eVTOLs to fly like helicopters, with sensor-enabled control systems familiar to veteran pilots. Inertial sensors are required to monitor flight control and flight characteristics such as pitch, roll, and angular rate (a minimal attitude-estimation sketch follows Figure 3). Sensors can also measure propeller thrust, vibration, strain, and load.
Product category: Inertial sensors, Accelerometers, Gyrometers
Figure 3: Electric vertical takeoff and landing (eVTOL) aircraft
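To illustrate how accelerometer and gyroscope readings can be fused into pitch and roll estimates, here is a minimal complementary-filter sketch; the axis conventions, sample rate, and blend factor are assumptions chosen for illustration, not values from any flight-control product.

```python
import math

# Minimal complementary filter for pitch/roll from a generic IMU (assumed axes).
ALPHA = 0.98   # weight given to the integrated gyro angle (assumed)
DT = 0.01      # sample period in seconds, i.e. 100 Hz (assumed)

def update_attitude(pitch, roll, gyro_dps, accel_g):
    """One filter step; gyro rates in deg/s, accelerations in g."""
    gx, gy, gz = gyro_dps
    ax, ay, az = accel_g

    # Short-term estimate: integrate the angular rates.
    pitch_gyro = pitch + gy * DT
    roll_gyro = roll + gx * DT

    # Long-term reference: gravity direction from the accelerometer.
    pitch_acc = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll_acc = math.degrees(math.atan2(ay, az))

    # Blend: the gyro tracks fast motion, the accelerometer removes drift.
    pitch = ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_acc
    roll = ALPHA * roll_gyro + (1.0 - ALPHA) * roll_acc
    return pitch, roll

# Example step: aircraft level, accelerometer reporting a small forward tilt.
print(update_attitude(0.0, 0.0, gyro_dps=(0.0, 0.0, 0.0), accel_g=(-0.1, 0.0, 1.0)))
```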
Sensors for industrial robot automation and human safety
Robotic applications need accurate and precise sensing elements integrated into equipment to handle various challenges effectively. For example, sensing systems must identify the presence of humans in collaborative applications, preventing possible collisions between the robot and nearby workers. Sophisticated sensor technologies are a core requirement for achieving this. Torque sensors measure the mechanical torque at the rotational joints of a collaborative robot (cobot), detect fault or overload conditions, and prevent injuries and potential cobot failures. In addition, sensors are used in equipment such as light curtains, which stop machines when people enter critical areas, and digital ambient light and proximity sensors (APDs) can serve as artificial eyes in laser scanners. Sensors can also monitor a robot's surroundings for object detection, load capacity, and grip forces, helping to ensure safe, reliable, and efficient operation in the workplace. Pyroelectric infrared motion sensors feature a compact, high-sensitivity, circuit-integrated package, a diverse lens lineup, and low-current-consumption variants. Integrating more sensors into robots will help increase uptime and optimize maintenance schedules. To learn more about industrial sensors, click here.
Product category: Torque sensors, Digital ambient light and proximity sensors (APDs), Pyroelectric infrared motion sensors
Figure 4: Industrial automation unit with robots and sensors
In industrial automation, basic sensors monitor the machine's operation and other routine tasks. Additionally, modern manufacturing includes sophisticated sensors (position and speed, ToF and radar, pressure, CMOS image, and current sensors, as well as MEMS microphones) to streamline the production process. Assembly lines in smart factories can detect attributes such as the distance, size, shape, composition, and surface of an object being manufactured and accurately track its movement. Although sensor capabilities are now low-cost, there is one reason to hesitate before integrating a wide range of sensors into robots to monitor their health: keeping track of all these signals together is challenging. Even so, a temperature increase, juddering motion, or higher power consumption may portend a bearing or gearbox failure (a minimal monitoring sketch follows). Sensors also allow monitoring of electric power consumption and can trigger actions that make factories more energy efficient; if equipment fails electrically due to a short circuit, the power supply is interrupted to avoid damage to the factory.
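As a minimal sketch of the kind of rule-based condition monitoring described above, the snippet below flags a hypothetical robot joint whose temperature, vibration, or power draw exceeds an assumed threshold; a real system would derive its limits from baseline data or a trained anomaly-detection model.

```python
from dataclasses import dataclass

@dataclass
class JointReading:
    temperature_c: float     # gearbox temperature
    vibration_rms_g: float   # RMS vibration ("judder") from an accelerometer
    power_w: float           # electrical power drawn by the joint motor

# Hypothetical alert thresholds, for illustration only.
THRESHOLDS = {"temperature_c": 80.0, "vibration_rms_g": 1.5, "power_w": 450.0}

def health_flags(reading: JointReading) -> list[str]:
    """Return the names of any signals that exceed their alert thresholds."""
    return [name for name, limit in THRESHOLDS.items()
            if getattr(reading, name) > limit]

flags = health_flags(JointReading(temperature_c=85.2, vibration_rms_g=0.9, power_w=430.0))
if flags:
    print("Possible bearing/gearbox wear, check:", flags)
```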
Advancing automotive image sensing technology
Present-day vehicle systems alert drivers when departing from a lane, detect nearby objects or cars in blind spots, and maintain speed and distance in highway cruise mode. Automotive image sensors enable these safety features.
Many sensor designs use dual-gain pixel technology and operate in high dynamic range (HDR) mode to enhance Advanced Driver Assistance System (ADAS) features in automotive applications. Modern image sensors use split-pixel technology, based on large and small sub-pixels, to produce HDR images: the sensor area dedicated to a single pixel is split into two parts, a larger photodiode that covers most of the area and a smaller photodiode that uses the remainder. However, split pixels can suffer from image quality degradation, higher dark noise, reduced performance, and other drawbacks, especially at higher temperatures.
Figure 5: Comparison between split pixel and multi-exposure techniques
An alternative to the split-pixel approach is multi-exposure, where additional space is allocated to pixels to accommodate potential overflows from large signals or charges. This method is like catching raindrops in a bucket but with a larger basin to hold water if the bucket overflows. The "bucket" signal can be read easily and with high accuracy, achieving excellent low-light performance, while the overflow basin contains everything that overflowed, extending the dynamic range and the ability to capture bright objects and scenes in true color. Figure 5 compares the two exposure techniques, and a simplified merge sketch is shown below.
Product category: Image sensors
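To make the multi-exposure idea concrete, here is a simplified sketch that merges a long ("bucket") and a short ("overflow") exposure into one HDR frame; the saturation level and exposure ratio are assumed values, and real automotive sensors perform this combination on-chip with calibrated gains.

```python
import numpy as np

SATURATION = 4095        # 12-bit pixel full scale (assumed)
EXPOSURE_RATIO = 16.0    # long exposure is 16x the short exposure (assumed)

def merge_hdr(long_exp: np.ndarray, short_exp: np.ndarray) -> np.ndarray:
    """Use the low-noise long exposure where valid, else the scaled short one."""
    long_exp = long_exp.astype(np.float64)
    short_exp = short_exp.astype(np.float64)
    # Where the long exposure clipped, recover the value from the short
    # exposure, rescaled to the long exposure's radiometric scale.
    return np.where(long_exp < SATURATION, long_exp, short_exp * EXPOSURE_RATIO)

frame = merge_hdr(np.array([[1000, 4095]]), np.array([[70, 300]]))
print(frame)  # [[1000. 4800.]] -- the clipped pixel is reconstructed
```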
Conclusion
Sensors are set to revolutionize measurement systems, giving them the intelligence to self-monitor, transmit status diagnoses to the operating system, and create a reliable network of measurement and calibration data. Predictive maintenance for machines and devices will become increasingly efficient, accessible, and affordable, improving uptime. In the future, maintenance will be driven by sensor data that indicates when work is actually needed rather than by personnel following a fixed timetable. Aerial vehicles will navigate autonomously, and connectivity will be wireless over long distances with an integrated power supply. Sensors will self-learn over their entire lifespan without maintenance, modifications, or calibration. They will also provide better insights into human behavior, shaping expectations concerning air quality, travel, automobile maintenance, lifestyle, insurance, energy consumption, and more. New LiDAR systems will equip autonomous vehicles with real 'vision.'