Embedded Sensors and Intelligence on the New Edge

With a move to edge architecture underway, embedded sensors are gaining new attention in IoT.

Much of what comes together in the Internet of Things depends on the first node in the network, which is most often a sensor. Transformational Industry 4.0 analytics and automation require even better input, which has kept vendors busy trying to bring smarter embedded sensors to market.

The role of smart sensors is drawing even closer scrutiny now, as an industry-wide architectural shift moves more processing to the edge, reducing reliance on “cloud-only” processing. As sensors proliferate, it becomes impractical to handle all sensor information in the distant cloud, so sensors and the edge nodes they reside on need to get smarter.

This is among the drivers leading Allied Market Research to project that the global IoT sensor market will grow from $12.37 billion to $141.8 billion by 2030, achieving a CAGR of 28% over that period.

As in the past, today’s embedded sensors must measure temperature, humidity, pressure, proximity and a wide assortment of other phenomena. Miniature accelerometers, magnetometers and other devices have come to form complex sensor fusions that combine disparate data sources.

But now included among options are edge AI devices that handle machine learning at the source, while using far less electrical power than cloud processing alternatives. Though still nascent, lidar and radio waves are also on tap for IoT sensor measurements.

The portion of the world covered with sensors is expanding. At the same time, the limits on what is possible are shrinking:

  • Ruggedized IoT sensors connected to IoT gateways are being used to measure and monitor grape crops for wine in California’s Napa Valley, as part of Cisco Systems’ Industrial Asset Vision platform.
  • Bosch Sensortec sensors with onboard AI act as ‘digital noses’ to detect gases, particulates and – a matter of growing concern – airborne viruses.
  • The city of Suffolk, Virginia is re-imagining traffic signaling services using Iteris’s ClearMobility platform, which in turn uses Vantage Apex smart sensors that couple high-definition video and four-dimensional radar sensors with integrated AI algorithms.
  • NevadaNano’s MPS Mini pairs an on-chip chemical sensor array with an on-chip molecular property spectrometer to detect combustible gases.

Smart Sensors Starting to Learn

“A typical smart sensor usually has four main sections – the sensor itself, an analog-to-digital conversion function, a computational – or microcontroller – unit, and a communication engine, which today can be either wireless or wired,” according to Raymond Yin, director of technical content at Mouser Electronics and host of Mouser’s “The Tech Between Us” podcast. The variations are many, he cautioned.

For example, many smart sensors have multiple individual sensor types tailored to a specific application. There is also variation in the function of the integrated microcontroller unit (MCU). Some integrated MCUs are merely state machines that control the data conversion process and communication, while others fully run sensor fusion algorithms, he said.
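
That four-stage pipeline is easy to picture in code. The sketch below walks samples through it – sense, digitize, compute, communicate – with every function a stub standing in for hardware (the names adc_read, filter_sample and radio_send are invented for illustration, not a vendor API), so it compiles and runs on a desktop:

    /* Sketch of the four-stage smart sensor pipeline Yin describes:
       sense -> A/D conversion -> MCU computation -> communication.
       All hardware is stubbed so the example runs on a host machine. */
    #include <stdint.h>
    #include <stdio.h>

    /* Stages 1-2: sensing element plus ADC, stubbed with a fixed sample. */
    static uint16_t adc_read(void) { return 2048; /* mid-scale, 12-bit */ }

    /* Stage 3: MCU computation - here, a running average of samples. */
    static uint16_t filter_sample(uint16_t raw)
    {
        static uint32_t acc = 0;
        static int n = 0;
        acc += raw;
        n++;
        return (uint16_t)(acc / n);
    }

    /* Stage 4: the communication engine, stubbed as a console print. */
    static void radio_send(uint16_t value)
    {
        printf("tx: %u\n", (unsigned)value);
    }

    int main(void)
    {
        for (int i = 0; i < 3; i++)           /* three sample cycles */
            radio_send(filter_sample(adc_read()));
        return 0;
    }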

As an example, Yin cited the LSM6DSO32XTR iNEMO Inertial Module from STMicroelectronics, which integrates accelerometers with a gyroscope and temperature sensor, and includes a machine learning core that aids in detecting activities such as walking, running and driving.
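
To make that concrete, here is a toy, host-runnable classifier in the spirit of the decision-tree logic such on-chip machine learning cores run. The feature (variance of acceleration magnitude over a window) and the thresholds are invented for the sketch; they are not ST’s actual configuration:

    #include <stdio.h>

    /* Classify a window of acceleration magnitudes (in g) with a tiny
       decision tree over the window's variance. Thresholds are made up. */
    static const char *classify(const float mag[], int n)
    {
        float mean = 0.0f, var = 0.0f;
        for (int i = 0; i < n; i++) mean += mag[i];
        mean /= (float)n;
        for (int i = 0; i < n; i++) var += (mag[i] - mean) * (mag[i] - mean);
        var /= (float)n;

        if (var < 0.05f) return "stationary"; /* barely any motion    */
        if (var < 0.50f) return "walking";    /* moderate oscillation */
        return "running";                     /* large, rapid swings  */
    }

    int main(void)
    {
        float window[] = { 1.0f, 1.2f, 0.8f, 1.4f, 0.7f, 1.3f };
        puts(classify(window, 6)); /* prints "walking" for this window */
        return 0;
    }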

Power Conservation With Purpose

Manuel Tagliavini, principal analyst covering MEMS and sensors at Omdia, said a smart sensor can be defined as an electronic component that is not only able to read and store physical measurements – such as acceleration, light, flow, humidity, and so on – but is also able to perform more complex operations that could have different purposes.

“In a few words,” he said, “being able to perform operations through an advanced ASIC or an embedded MCU is what lets a sensor be named as ‘smart.’”

Increasingly, as sensors move further to the edge of the Internet of Things, those purposes revolve around power conservation.

“People have to think about ‘sleep’ functions that keep the entire sensor component, and maybe other systems linked to it, in a low power mode until something in the physical world happens,” Tagliavini said via an email message.

The obvious goal is to save on electrical power and, for portable devices, battery power, both of which drain quickly when systems continually report back to the cloud. Battery-less sensors that harvest energy from the ambient environment are gaining use in some applications.
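
The wake-on-event pattern Tagliavini describes looks roughly like the sketch below. The calls are hypothetical stand-ins for vendor-specific power-management and interrupt APIs, simulated here with a counter so the example runs on a host:

    #include <stdbool.h>
    #include <stdio.h>

    static int fake_ticks = 0;

    /* Stand-in for halting the MCU - and the sensor front end - in their
       lowest-power state until an external interrupt fires. */
    static void deep_sleep_until_event(void)
    {
        fake_ticks++; /* real hardware would stop the core here */
    }

    /* Stand-in for the sensor's wake-on-motion interrupt line. */
    static bool motion_event_pending(void)
    {
        return fake_ticks >= 5; /* pretend motion occurs on tick 5 */
    }

    int main(void)
    {
        while (!motion_event_pending()) /* MCU and radio stay dark */
            deep_sleep_until_event();
        puts("woke: sample, report once, then go back to sleep");
        return 0;
    }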

Always On at the Edge

In mission-critical applications on the edge, power consumption concerns go beyond the sensor to include the processor. For processors intended for machine learning, the concerns are acute.

In cloud data centers, rich electrical power supplies are a given. That’s not the case on the IoT edge.

Such considerations now play out in designs for next-generation AI edge chips. That’s shown, for example, in recent offerings such as the NDP102 Neural Decision Processor from Syntiant Corp.

It’s intended to apply AI processing to audio and other input, ranging from the wake words we use to rouse Siri or Alexa smart appliances to the tilt angle of an oscillating punch press ready to fail on a factory floor.

“We’re doing quite a lot around sensors and vibration, condition-based monitoring and healthcare,” Kurt Busch, CEO of Syntiant, told IoT World Today. He said vibrations and temperature events that signal machine maintenance issues are best detected and acted on before a costly downtime failure occurs.

Importantly, he noted, the Syntiant neural processor is designed to work at under 100 microwatts of power consumption in always-on sensor applications. Syntiant has been among a handful of companies working to conduct neural processing in the analog, rather than digital, domain, with speed and power conservation as prime goals.
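
Some back-of-the-envelope arithmetic shows why that figure matters. The sketch below computes the ideal runtime of a 100-microwatt always-on load from a single CR2032 coin cell (a nominal 225 mAh at 3 V – standard datasheet figures, not Syntiant’s numbers), ignoring self-discharge, radio bursts and converter losses:

    #include <stdio.h>

    int main(void)
    {
        double capacity_mah = 225.0; /* nominal CR2032 capacity */
        double voltage_v    = 3.0;   /* nominal cell voltage    */
        double load_uw      = 100.0; /* always-on power budget  */

        double energy_mwh = capacity_mah * voltage_v;        /* 675 mWh */
        double hours      = energy_mwh / (load_uw / 1000.0); /* 6,750 h */

        printf("%.0f hours (~%.1f months)\n", hours, hours / 730.0);
        return 0;
    }

Roughly nine months of continuous listening from a coin cell, in the ideal case – which is why every microwatt counts in always-on designs.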

Taking a Page From the Fitbit

Consumer devices such as the iPhone, AirPods and Fitbit have played a large role in sensor feature advances and price reductions. The gold-rush-style push for assisted and autonomous vehicles may do the same, or more, especially by promoting sensor fusions that mix sensing techniques, according to Omdia’s Tagliavini.

Assisted and still-to-come autonomous driving methods require multiple and diverse sensor measurements for obvious safety reasons, he said.

That means “collecting and computing the data read from a radar, a lidar, an inertial unit and a GPS is a critical activity that requires reliability, redundancy and timely results,” he said. Advances here will be felt in the wider IoT universe.
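
A deliberately simplified illustration of that fusion problem: inverse-variance weighting of two independent one-dimensional position estimates, say from GPS and lidar odometry. Production automotive stacks run full Kalman-style filters across many axes; the readings and variances below are invented for the sketch:

    #include <stdio.h>

    /* Combine two estimates, weighting each by the inverse of its
       variance, so the more trustworthy reading dominates. */
    static double fuse(double x1, double var1, double x2, double var2)
    {
        double w1 = 1.0 / var1, w2 = 1.0 / var2;
        return (w1 * x1 + w2 * x2) / (w1 + w2);
    }

    int main(void)
    {
        double gps_m   = 102.0, gps_var   = 9.0;  /* noisy but absolute */
        double lidar_m = 100.5, lidar_var = 0.25; /* precise, may drift */

        /* Result lands near the lower-variance lidar reading (~100.5 m). */
        printf("fused position: %.2f m\n",
               fuse(gps_m, gps_var, lidar_m, lidar_var));
        return 0;
    }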

Still, smart sensor advances in mission-critical industrial applications – ones that update at longer intervals than ever-changing consumer apps do – may take longer to disseminate, he advised.

Gauging Sensor Requirements

In the face of a blast of new technologies, the basic trade-offs remain familiar. The challenge of moving to the edge and deploying machine learning does little to alter the basic system decisions that have always influenced sensor system design, Mouser’s Raymond Yin said. The questions – encoded as a simple spec check in the sketch after the list – continue to be:

  • Do sensors meet resolution and accuracy requirements?
  • Are sensor results consistent and reliable enough for operational requirements?
  • Do sensors provide requisite data to meet system goals?
  • Do sensors meet power, size and timing requirements?
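
Taken together, the questions amount to comparing a requirements sheet against a candidate part’s datasheet. A hypothetical sketch of that comparison, with every field and number invented for illustration:

    #include <stdbool.h>
    #include <stdio.h>

    typedef struct {
        double resolution_bits;  /* effective resolution             */
        double accuracy_pct;     /* worst-case accuracy, % of range  */
        double power_uw;         /* active power draw                */
        double sample_period_ms; /* slowest acceptable update period */
    } sensor_spec;

    /* A part fits when it meets or beats every requirement. */
    static bool meets(const sensor_spec *part, const sensor_spec *req)
    {
        return part->resolution_bits  >= req->resolution_bits
            && part->accuracy_pct     <= req->accuracy_pct
            && part->power_uw         <= req->power_uw
            && part->sample_period_ms <= req->sample_period_ms;
    }

    int main(void)
    {
        sensor_spec requirement = { 12.0, 1.0, 500.0, 10.0 };
        sensor_spec candidate   = { 14.0, 0.5, 300.0,  5.0 };
        puts(meets(&candidate, &requirement) ? "fits" : "does not fit");
        return 0;
    }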

Similarly, specifications for the connectivity and compute portions of the overall system need to be determined based on application or use case, he said.

The role embedded sensors and AI play in emerging IoT sectors is an evolving one. Sensor advances in imaging, MEMS, lidar, Wi-Fi, UWB, radar and elsewhere clearly are plentiful – as are diverse machine learning cores that work to “make sense of the sensors.”

How system designers align these technologies with profitable use cases is likely to define the next IoT era’s ultimate success.