Finding solutions to keep process data flowing

Skkynet’s Andrew Thomas shares how organisations can manage data during a network outage

Real-time data is more valuable than ever. Enterprises are waking up to the importance of the information coming from their production lines and processes, especially for helping IT departments to gain deeper insights into how to improve efficiency and reduce costs.  

Andrew Thomas, CEO of data and industrial internet of things solution provider Skkynet, believes there are a few things that can be done to ensure that process data keeps flowing.  

“Moving production data from operations to management in real time requires networking, often using a demilitarized zone (DMZ) to isolate networks,” he says. “The data moves from the operations technology (OT) side to the DMZ, and from there it is passed to the IT department.”  
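The OT-to-DMZ-to-IT flow Thomas describes can be sketched as a chain of hops. The function names and dictionary records below are illustrative assumptions for the sake of the example, not Skkynet's implementation:

```python
# Minimal sketch of the OT -> DMZ -> IT relay: each zone hands the
# reading to the next, and the DMZ only passes data through, so the
# two networks never connect directly.

def relay(value, hops):
    """Pass a reading through each network zone in order."""
    for hop in hops:
        value = hop(value)
    return value

def ot_collect(v):
    return {"zone": "OT", "value": v}      # reading originates on the plant floor

def dmz_pass(rec):
    return {**rec, "zone": "DMZ"}          # DMZ relays; it never originates data

def it_receive(rec):
    return {**rec, "zone": "IT"}           # IT side consumes the reading

print(relay(71.3, [ot_collect, dmz_pass, it_receive]))
```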

But networks are not always 100 per cent reliable. If a network goes down, data can be lost, causing gaps in trend lines and hiding potential problems in operations systems. “To handle this inevitable problem, there are at least two viable solutions: redundancy and store-and-forward,” says Thomas.  

“Redundancy means establishing two separate data pathways, so that if one goes down, the other will be available to maintain the data connection. It is implemented in industrial applications using duplicate sets of hardware and software, with connections running over separate networks.”  
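Failover between two independent paths can be sketched as follows. The callables standing in for network links are purely illustrative assumptions; a real system would duplicate hardware and networks, as Thomas notes:

```python
def send_with_failover(payload, paths):
    """Attempt each data path in order; return on the first success.

    `paths` is a list of callables, each representing one independent
    network pathway (a hypothetical abstraction for illustration).
    """
    errors = []
    for send in paths:
        try:
            return send(payload)
        except ConnectionError as exc:
            errors.append(exc)  # this path is down; try the next one
    raise ConnectionError(f"all {len(paths)} redundant paths failed: {errors}")

# Simulated paths: the primary link is down, the secondary delivers.
def primary(data):
    raise ConnectionError("primary link down")

def secondary(data):
    return f"delivered via secondary: {data}"

print(send_with_failover("temp=71.3C", [primary, secondary]))
```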

However, there are drawbacks to this approach. “Extending redundancy beyond the plant floor is costly,” says Thomas. “A fully redundant system requires completely identical but separate hardware, software and networks. True redundancy means using two separate data paths.” 

This is where organisations may see value in a store-and-forward approach. “This simply means storing the data temporarily on the sending side when there is a network outage, and then forwarding it along after the connection is restored,” he explains. “At each node in the chain – from OT to DMZ to IT – a data historian receives and stores the data, then forwards it whenever the network is available.”  
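A minimal store-and-forward buffer might look like the sketch below. The class and the `transmit` callable are hypothetical, standing in for a historian node's send logic rather than any specific product:

```python
from collections import deque

class StoreAndForward:
    """Buffer outgoing readings locally while the link is down, then
    flush them in order once it comes back (illustrative sketch)."""

    def __init__(self, transmit):
        self.transmit = transmit   # callable that sends one record downstream
        self.buffer = deque()      # local store for records not yet confirmed sent

    def send(self, record):
        self.buffer.append(record)  # always store first, so nothing is lost
        self.flush()                # then try to forward whatever is queued

    def flush(self):
        while self.buffer:
            try:
                self.transmit(self.buffer[0])
            except ConnectionError:
                return              # link still down; keep buffering
            self.buffer.popleft()   # confirmed sent; drop from the store
```

During an outage, records simply accumulate in the buffer; the next successful `send` or `flush` drains them in their original order, so trend lines show no gaps.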

Thomas also highlights that the two approaches can be used together, making part of the system redundant, while also implementing store-and-forward technologies when necessary. “The important point to remember is that there are proven ways to keep your process data flowing even over a networked system,” he says. 

This article was originally published in the Autumn 2022 issue of Technology Record.

©2024 Tudor Rose. All Rights Reserved. Technology Record is published by Tudor Rose with the support and guidance of Microsoft.