By George T Hilliard
Fog computing has been around for a while, and its definition has drifted somewhat from the original, thanks to the latest available products and how much compute power is now available at the edge of the IoT. I wrote a blog on this topic a while back called Cloud, Fog and Edge Computing – What’s the Difference? It’s been quite popular, as people seem to be struggling with where to put their resources.
The simple and somewhat outdated definition of each of those components is that the Edge is the place where data is taken in by some sort of sensor; the Cloud is the endpoint where calculations occur on that data; and the Fog is the middleman, where some decisions can be made that don’t require the intense compute power of the Cloud.
Today, that’s changed, because the amount of compute power residing at the Edge rivals that of the Cloud of a few years ago. The capabilities of the Cloud have obviously kept pace, but there are many cases where decisions can now be made at the Edge without a round trip to the Cloud. The two main reasons for doing so are to remove the latency that comes with moving data over potentially long distances and to avoid the cost of sending and receiving data over what is often a cellular connection.
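To make that concrete, here is a minimal sketch of Edge-first decision-making with a batched Cloud upload. Everything in it, from the threshold to the sensor reader to the upload function, is a hypothetical placeholder rather than any particular product’s API; the point is only that the latency-critical decision never leaves the device, while non-urgent data travels in bulk to keep cellular costs down.

```python
# Minimal sketch of edge-first decision-making with a cloud fallback.
# SENSOR_THRESHOLD, read_sensor(), and send_to_cloud() are hypothetical
# stand-ins; a real node would use its platform's sensor and transport APIs.

import random
import time

SENSOR_THRESHOLD = 75.0   # hypothetical limit, e.g. degrees C
CLOUD_BATCH = []          # readings held back to amortize cellular cost

def read_sensor() -> float:
    """Stand-in for a real sensor driver."""
    return random.uniform(60.0, 90.0)

def act_locally(reading: float) -> None:
    """Decision made at the Edge: no round trip, so no network latency."""
    print(f"Edge decision: {reading:.1f} exceeds threshold, taking local action")

def send_to_cloud(batch: list) -> None:
    """Batched upload keeps the number of (costly) cellular sends low."""
    print(f"Uploading {len(batch)} readings for long-term analytics")
    batch.clear()

for _ in range(30):                  # sample loop; a real node runs forever
    value = read_sensor()
    if value > SENSOR_THRESHOLD:
        act_locally(value)           # latency-critical path stays local
    CLOUD_BATCH.append(value)
    if len(CLOUD_BATCH) >= 10:       # non-urgent data goes up in bulk
        send_to_cloud(CLOUD_BATCH)
    time.sleep(0.1)
```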
That said, where does this leave the Fog? Is it even a necessary component? The answer is a common one: it depends. It depends on how your system is constructed; it depends on what you’re trying to achieve; it depends on the amount of security you want to build into your system; it depends on your design budget.
According to the OpenFog Consortium, fog computing is “a system-level horizontal architecture that distributes resources and services of computing, storage, control and networking anywhere along the continuum from the Cloud to the Edge.” The term was originally coined by Cisco in an attempt (and a good one) to sell those intermediate server-type products.
The Fog offers three key attributes: it’s a horizontal architecture; it provides the Cloud-to-Edge continuum; and it can operate at a system level. Let me explain what each of these attributes means. The horizontal nature means that multiple applications can be supported and the fog essentially “normalizes” the data. Hence, you can have a video camera sending a huge amount of data on the same network as a temperature sensor that pings far less frequently and sends just a few bytes at a time (a sketch of that normalization follows below).
The continuum enables services to be distributed closer to the Edge, thereby bringing “near-Cloud” performance to the Edge. And finally, by operating at the “system” level, the fog can extend the Edge into places it might not otherwise have reached.
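Returning to that first attribute, here is a minimal sketch of what “normalizing” heterogeneous device data at a fog node could look like. The envelope fields and device names are illustrative assumptions, not a standard schema; the idea is simply that a chatty video camera and an infrequent temperature sensor end up looking identical to whatever consumes the messages upstream.

```python
# Sketch of the "normalizing" role a fog node plays: heterogeneous devices
# are wrapped in one common envelope before being routed upstream. The
# envelope fields and device names here are illustrative assumptions.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class FogMessage:
    device_id: str      # which Edge device produced the data
    kind: str           # "video", "temperature", ...
    timestamp: float    # seconds since the epoch
    payload_bytes: int  # size, so upstream links can be budgeted
    payload: object     # the normalized reading, or a reference to bulk data

def normalize_camera_frame(frame: bytes) -> FogMessage:
    # Bulk data stays local on the fog node; only metadata travels upstream.
    return FogMessage("cam-01", "video", time.time(), len(frame),
                      payload={"stored_at": "/fog/cache/cam-01/latest"})

def normalize_temperature(celsius: float) -> FogMessage:
    return FogMessage("temp-07", "temperature", time.time(), 8,
                      payload={"celsius": celsius})

# Both device types now look identical to whatever consumes the messages.
for msg in (normalize_camera_frame(b"\x00" * 65536),
            normalize_temperature(22.5)):
    print(json.dumps(asdict(msg)))
```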
Compute systems employed for fog computing must be efficient, meaning they offer reasonable compute capability yet require little power and produce little heat. The WINSYSTEMS’ ITX-F-3800 single-board computer (SBC), for example, fits that bill. Measuring just 84 by 55 mm (3.3 by 2.2 in.), the board conforms to the Femto-ITX form factor. It’s based on Intel’s Atom E3825 processor, which provides more than adequate CPU and graphics performance.
The ITX-F-3800 includes 2 Gbytes of DDR3L SDRAM soldered directly to the board and running at 1066 MT/s. A host of I/O is also included, such as Ethernet, audio, video, and USB. The board supports Microsoft’s Windows 10 variants, including Windows 10 IoT Enterprise and Windows 10 IoT Core, as well as Linux and many popular commercial real-time operating systems (RTOSs).