Asset Management in Industrial Applications
Today’s industrial companies face a unique challenge: managing assets in remote parts of the world. These assets, which range from production equipment like offshore oil rigs to industrial internet of things (IIoT) devices such as sensors and communication hubs, are key components of 21st-century businesses. The remote and distributed nature of these assets presents a serious challenge for technicians, engineers, and IT departments:
How do we collect data from, update, and protect these remote assets?
If a remote device goes offline or stops responding, how do we fix it without sending a technician on a repair journey? How do we maintain an effective cybersecurity posture despite the limited resources of these embedded devices? And, going beyond these basic necessities, how do we develop and utilize a data pipeline that continues adding value both internally and to our customers?
The Basics of Remote Asset Management
The guiding principle for our work is always to be able to update the device. If we can achieve this central goal, then the rest of our work can follow. If we can’t update, then we’re stuck.
So, how do we remotely update firmware and other programs with confidence? Yes, we need to deploy over-the-air (OTA) updates to patch security vulnerabilities and optimize processes, but the truth is that most IoT devices rarely, if ever, receive these necessary updates because the manufacturers are afraid that a single software bug could completely wipe out their devices. They believe the risk is too great.
That’s why we use technology to mitigate that risk. One proven strategy is A/B firmware slots, which ensures we always have a functional firmware image. For instance, let’s say our device is running from slot A and receives an OTA firmware update. It writes the new code onto slot B and then tries to reboot into that updated version. If it successfully boots, it can run some basic diagnostics, such as checking network connectivity, and, once everything checks out, slot B becomes the new default firmware. However, if it encounters any problems with B, it automatically reverts to slot A.
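To make the failover behavior concrete, here is a toy simulation of that A/B slot logic in Python. It is an illustrative model only, not a real bootloader: the `boot_ok` flag stands in for the reboot-and-diagnostics step, and the class and method names are our own invention.

```python
class ABSlotDevice:
    """Toy simulation of A/B firmware slots with automatic rollback."""

    def __init__(self):
        self.slots = {"A": "v1.0", "B": None}  # firmware image per slot
        self.active = "A"                      # known-good default slot

    def apply_ota_update(self, new_version, boot_ok=True):
        """Write the update to the inactive slot; promote it only if it boots."""
        target = "B" if self.active == "A" else "A"
        self.slots[target] = new_version       # flash the inactive slot
        if boot_ok:                            # stand-in for reboot + diagnostics
            self.active = target               # updated slot becomes the default
        # otherwise self.active is unchanged: automatic rollback to the old slot
        return self.active
```

The key design point is that the running firmware is never overwritten in place, so a failed update can never leave the device without a bootable image.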
We find this feature in Google’s Android and ChromeOS, as well as open-source, IoT-focused platforms like Nerves and NervesHub. Other proprietary options, such as AWS IoT Core, also enable secure cloud-to-device connectivity and OTA updates.
Besides patching security vulnerabilities to protect these assets, these technologies also allow developers to apply Agile principles throughout the device’s lifecycle. For instance, when we know that updates aren’t going to kill our devices, then we’re free to implement continuous integration/continuous delivery (CI/CD) pipelines for process optimization.
The final piece of the puzzle here is access controls. Organizations need to be able to control who has access to these devices. This is especially important for succession planning within IT departments. All the major cloud providers have tooling for this use case, and there’s also a rich open-source ecosystem that includes Pluggable Authentication Modules (PAM), which comes preinstalled on most Linux distributions.
Other technologies to consider include OpenSSH for remote terminal access via secure shell, Transport Layer Security (TLS, the successor to SSL) to secure and verify the connection between a remote asset and the cloud, and Active Directory for companies in the Microsoft ecosystem. The key is to find the right security solutions for your unique use cases and to build security into your systems architecture from the very beginning of the design process.
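As one small example of building verification in from the start, a device can refuse any cloud connection whose certificate doesn’t check out. The sketch below uses Python’s standard-library `ssl` module; the `ca_bundle` path is a hypothetical stand-in for a CA bundle shipped with the firmware, and real deployments would layer mutual TLS and certificate pinning on top of this.

```python
import ssl
from typing import Optional

def make_verified_context(ca_bundle: Optional[str] = None) -> ssl.SSLContext:
    """Build a TLS context that rejects unverified cloud endpoints."""
    context = ssl.create_default_context()
    if ca_bundle:
        # Trust only the CA bundle shipped with the device firmware.
        context.load_verify_locations(cafile=ca_bundle)
    context.verify_mode = ssl.CERT_REQUIRED  # certificate chain must verify
    context.check_hostname = True            # and match the server's hostname
    return context
```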
Developing a Comprehensive Data Strategy
The points outlined above are the bare minimum that today’s connected industries need to do to survive. Of course, we want to go beyond that; we want to thrive. This is where data science enters the picture.
Data scientists develop solutions and methods for collecting data from remote assets, such as cellular data-plan usage and production data from industrial equipment. They then aggregate, clean, and analyze this data to deliver valuable insights that inform business decision-making.
For instance, a major use case that we’re seeing industries across the board adopt is predictive analytics for preventative maintenance. Essentially, data scientists use sensor data from remote assets to build machine learning (ML) models that predict when a device is likely to fail. This lets us plan ahead and build contingencies; we can send someone to fix the machine, deploy another device, or even decide to prioritize something entirely different.
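The modeling details vary widely by asset and industry, so as a toy illustration of the underlying idea, here is a simple statistical check rather than a trained ML model: it flags sensor readings that drift far from recent normal behavior. The window size and z-score threshold are arbitrary assumptions for the example.

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=10, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the
    trailing window's mean; a crude stand-in for a predictive model."""
    flags = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]          # recent "normal" behavior
        mu, sigma = mean(baseline), stdev(baseline)
        z = abs(readings[i] - mu) / sigma if sigma else 0.0
        flags.append((i, readings[i], z > threshold))
    return flags
```

A flagged reading wouldn’t trigger an automatic repair on its own; it feeds the kind of informed, premeditated decision-making described above.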
The takeaway is that this technology lets us make an informed, premeditated decision. That puts us in a lot better position than having to react in the moment when something breaks.
Again, we want to stress that this is something that we need to bake into our design from the beginning, not something that we can just bolt on later. Get the data science team involved early and often, and they’ll help your engineers figure out what data to collect and where to send it.
We’re just starting to see the impact of big data analytics for Industry 4.0. Improved automation, process optimization, and cost savings are just a few of the advantages that early adopters are realizing.
Ready to take control of your remote assets? Phoenix Contact is a leading provider of next-generation industrial equipment and services. Get in touch with us today.
Published by David Hoysan
Originally published at https://www.linkedin.com.