Artificial intelligence (AI) is fairly ubiquitous in the embedded and industrial IoT spaces. In many instances, it’s actually machine learning (ML) that’s being used. Although these two terms are often used interchangeably, they are not the same thing.
Very simply, ML is a form (or subset) of AI, which can be quite large in scope. AI enables a machine to perform tasks on its own, or independently, whereas ML takes the inputs from sensors and learns from past data to optimize its performance.
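To make that concrete, here is a minimal sketch, with made-up numbers, of the "learns from past data" half of that definition: a sensor calibration whose gain and offset are fitted from previously logged readings rather than hard-coded by an engineer.

```c
/* Minimal illustration of "learning from past data": fit a sensor
 * calibration value = gain * raw + offset from previously logged pairs of
 * (raw reading, reference value). All numbers here are invented. */
#include <stdio.h>

int main(void)
{
    /* Past data: raw ADC readings and the reference values they should map to. */
    const double raw[] = { 100, 200, 300, 400 };
    const double ref[] = { 10.5, 20.4, 30.6, 40.5 };
    const int n = 4;

    double gain = 0.0, offset = 0.0;
    const double lr = 1e-5;   /* learning rate */

    /* Simple gradient descent on the squared calibration error. */
    for (int epoch = 0; epoch < 20000; epoch++) {
        for (int i = 0; i < n; i++) {
            double err = (gain * raw[i] + offset) - ref[i];
            gain   -= lr * err * raw[i];
            offset -= lr * err;
        }
    }

    printf("learned calibration: value = %.4f * raw + %.4f\n", gain, offset);
    return 0;
}
```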
Cloud vs. the Edge When It Comes to AI
One current question is whether the AI and/or ML algorithms should be executed at the Edge of the IIoT or in the Cloud. A few years ago, this question likely would have elicited a chuckle, as the compute power at the Edge was not close to what was needed to run AI algorithms. But that’s changed, thanks to the latest round of microprocessors from the likes of Intel, NVIDIA, AMD, and others.
Performing AI at the Edge is a regular function these days. While the debate continues, and there are still valid reasons to handle your AI in the Cloud, the current trend is to perform AI as close to the data as possible, which means at the point of the sensor, aka the Edge.
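As a rough sketch of that pattern (the sensor, model, and publish functions below are hypothetical placeholders rather than any particular product’s API), the Edge node keeps the raw samples local, runs the classification itself, and only pushes the verdict upstream:

```c
/* Sketch of the "AI at the Edge" pattern: classify sensor data locally and
 * forward only the result, rather than streaming raw samples to the Cloud.
 * read_vibration_sample(), classify_locally() and publish_event() are
 * hypothetical placeholders for the application's own sensor, model and
 * network code. */
#include <stdio.h>
#include <stdbool.h>

#define WINDOW 64

/* Placeholder: in a real system this would read from an ADC or a fieldbus. */
static double read_vibration_sample(void) { return 0.12; }

/* Placeholder: in a real system this would call into an inference library. */
static bool classify_locally(const double *window, int n)
{
    double energy = 0.0;
    for (int i = 0; i < n; i++)
        energy += window[i] * window[i];
    return energy / n > 0.5;        /* flag a "fault" if vibration energy is high */
}

/* Placeholder: only small, meaningful events ever leave the Edge node. */
static void publish_event(const char *msg) { printf("cloud <- %s\n", msg); }

int main(void)
{
    double window[WINDOW];

    for (int i = 0; i < WINDOW; i++)
        window[i] = read_vibration_sample();    /* raw data stays at the Edge */

    if (classify_locally(window, WINDOW))
        publish_event("fault detected");        /* send the verdict, not the data */

    return 0;
}
```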
For example, the WINSYSTEMS PX1-C441 single board computer (SBC) combines substantial compute power with small size, a rugged design, and an extended operating temperature range to handle those “AI at the Edge” applications. Built to the PC/104 form factor, the SBC is designed around the latest-generation Intel Apollo Lake-I dual- or quad-core SoC processors as well as the popular PCIe/104 OneBank expansion.
The PX1-C441 includes up to 8 Gbytes of soldered-down LPDDR4 system memory and a non-removable eMMC device for solid-state storage of an operating system (OS) and applications. In addition, the board supports M.2 and SATA devices.
Keep Safe with Micro AI
A new AI concept, known as Micro AI, allows AI algorithms to run on many legacy MCUs, and thereby on legacy Edge embedded computers. This can potentially reduce the overall cost for the OEM, who can go to market with an Edge computer like the PX1-C441 and be confident that all the necessary applications will still run without any software modifications.
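What does an AI workload that fits a legacy MCU actually look like? As an illustrative sketch, with invented weights and inputs, think of a small quantized model evaluated with integer-only math and static buffers, so it stays within the flash and RAM budget of an older part:

```c
/* Illustrative sketch of a "Micro AI"-sized workload: a single int8 dense
 * layer evaluated with integer math and static buffers, no dynamic memory,
 * no floating point. The weights, biases, and input values are made up. */
#include <stdint.h>
#include <stdio.h>

#define N_IN  4
#define N_OUT 2

/* Quantized weights and biases would normally be exported from a trained
 * model; these values are placeholders. */
static const int8_t  weights[N_OUT][N_IN] = {
    {  12,  -7,  30,   5 },
    { -20,  14,  -3,  22 },
};
static const int32_t bias[N_OUT] = { 100, -50 };

/* Integer-only dense layer: score[j] = bias[j] + sum_i weights[j][i] * in[i] */
static int infer(const int8_t in[N_IN])
{
    int32_t best_score = INT32_MIN;
    int best_class = 0;

    for (int j = 0; j < N_OUT; j++) {
        int32_t acc = bias[j];
        for (int i = 0; i < N_IN; i++)
            acc += (int32_t)weights[j][i] * in[i];
        if (acc > best_score) {
            best_score = acc;
            best_class = j;
        }
    }
    return best_class;
}

int main(void)
{
    int8_t sensor_features[N_IN] = { 10, -4, 25, 8 };  /* placeholder input */
    printf("predicted class: %d\n", infer(sensor_features));
    return 0;
}
```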
Hardware-assisted Micro AI is also being used as a security measure to thwart some of the emerging cyber-attacks, such as malware and side-channel attacks, at the hardware level.
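One plausible shape for that kind of monitoring, sketched here under our own assumptions rather than as any vendor’s implementation, is a lightweight statistical baseline over a hardware telemetry counter that flags readings far outside normal behavior:

```c
/* Hedged sketch of lightweight anomaly detection over hardware telemetry.
 * read_telemetry_counter() is a hypothetical placeholder; a real system
 * might sample performance counters, current draw, or bus activity. */
#include <stdio.h>
#include <stdlib.h>

/* Placeholder telemetry source: a noisy baseline with one injected spike. */
static double read_telemetry_counter(int t)
{
    double baseline = 1000.0 + (rand() % 20);
    return (t == 40) ? baseline + 500.0 : baseline;   /* simulated "attack" */
}

int main(void)
{
    double mean = 1000.0, var = 100.0;   /* baseline seeded from known-good runs */
    const double alpha = 0.02;           /* slow adaptation rate */

    for (int t = 0; t < 60; t++) {
        double x = read_telemetry_counter(t);
        double d = x - mean;

        if (d * d > 16.0 * var) {        /* more than 4 standard deviations out */
            printf("t=%d: anomalous activity (%.0f), raise an alert\n", t, x);
        } else {
            mean += alpha * d;           /* keep tracking "normal" behavior */
            var  += alpha * (d * d - var);
        }
    }
    return 0;
}
```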
Micro AI is a subset of full-blown AI, so the processes that can be performed are limited. However, if that’s what your application requires, then it may be just what you’re looking for. In other words, you get AI functionality without having to change any hardware.
What remains to be defined is exactly how the different ranges of AI processing will impact Edge computing. It’s likely that the algorithm developers will adapt to the existing hardware, which is a good thing for embedded computing providers, as they will then have the ability to offer some form of AI processing on their systems with little to no hardware modifications.
As always, you’ll need to check with your embedded computing supplier to ensure that the hardware matches your application. This is where it pays to work with an experienced supplier, one with a large partner ecosystem, particularly on the software and algorithm side. This will help ensure that those algorithms are running properly “out of the box.” WINSYSTEMS fits that bill, thanks to its highly qualified partner network. And it has the expertise to ensure that the hardware and software solutions are optimally integrated to properly run those algorithms with minimal tweaking.