Water, water everywhere and not a drop to drink. When poet Samuel Taylor Coleridge wrote those words in 1798, he wasn't making an analogy about data in a smart factory. But he could have been. The line is invoked today whenever you are surrounded by something yet cannot benefit from it, and that is overwhelmingly true of data in many manufacturing operations.
Factories full of smart devices, automation systems, and software applications are awash in oceans of data. Organizations intuitively know there is value in that data, and that they should be able to leverage it to overcome challenges like supply chain disruptions, labor shortages, and skyrocketing energy prices. By harnessing the data, companies believe they can achieve a competitive advantage and continue to deliver value to customers despite adverse conditions.
Let’s start with what doesn’t work. Some companies have tried an approach that can best be summarized as: capture everything, dump it all in a data lake, and point some artificial intelligence (AI) and machine learning (ML) algorithms at it to look for insights that no one could have predicted. While appealing in its simplicity, this is often just magical thinking that results in enormous costs and little value. To continue the water analogy, the approach is like panning for gold in a rushing river. Occasionally a miner got lucky, but it was Levi Strauss who got rich, by selling tools to all of those who tried.
Instead, manufacturers today need a data strategy that thoughtfully and clearly lays out the answers to some key questions.
First, the strategy defines the sources of data. For most companies today there is no shortage of data sources, but more is not always better. Data storage is not free when you consider the scale of data generation in a smart factory. And in some cases, the real data point of interest cannot be directly measured but must instead be derived from simulation models and digital twins. Consider the temperature at the center of an oven. It might be impractical to measure directly, but with a physics-based digital twin, you can use easily and directly measured values as boundary conditions in a simulation model to accurately predict the temperature at the center of the oven.
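To make the idea concrete, here is a minimal sketch of that kind of derivation, not any vendor's actual twin: a one-dimensional steady-state heat-conduction model that predicts an interior temperature from two directly measured boundary temperatures. The grid size, tolerance, and temperatures are illustrative assumptions.

```python
def steady_state_profile(t_left, t_right, n=11, tol=1e-9, max_iter=100_000):
    """Jacobi iteration for steady-state 1D heat conduction.

    t_left / t_right are the directly measured boundary temperatures;
    the interior profile (including the hard-to-measure center point)
    is derived from the physics rather than read from a sensor.
    """
    t = [(t_left + t_right) / 2.0] * n
    t[0], t[-1] = t_left, t_right
    for _ in range(max_iter):
        new = t[:]
        for i in range(1, n - 1):
            # Each interior point relaxes toward the mean of its neighbors.
            new[i] = 0.5 * (t[i - 1] + t[i + 1])
        if max(abs(a - b) for a, b in zip(new, t)) < tol:
            break
        t = new
    return new

# Walls measured at 180 and 200 degrees; the center value is derived.
profile = steady_state_profile(180.0, 200.0)
center_temp = profile[len(profile) // 2]
```

The same pattern scales up: replace the toy relaxation loop with a full physics-based solver, and the measured boundary values still play the same role of anchoring the model to reality.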
When considering sources of data, there is real benefit in combining many different types of data that, together, provide a more complete picture of the production system for the connected enterprise. Most manufacturers think first of OT (operational technology) data: the data from sensors, devices, and control systems close to the production processes. Typically, these data points are captured at high frequencies matched to the speed of the production processes. This core data can be augmented with secondary data sources such as environmental data, power and utility data, video, point cloud data, and more.
There is a lot of value to be gained by augmenting this real-time data with operations management and manufacturing execution data such as supplier and material tracking, the workers involved and their qualifications, and quality system data such as non-conformances or approved deviations. Manufacturers are also tracking and managing the configuration of their production systems – the complete set of information that defines the machines, production software, and setting configurations in place at every moment in time.
Beyond data sources, the strategy defines the data model that makes sense for the specific production system. The model provides a structure for adding context and meaning to the raw data. Without this context, data is essentially useless. If I tell you the water temperature is 72 degrees but don't tell you the units, the time, the location, the rate of change, or the desired temperature, then that 72-degree data point is useless.
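One way to picture such a model is as a record that carries the raw value together with its context. The sketch below is a hypothetical minimum; the field names are invented for illustration and do not come from any particular historian or MES.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class ContextualizedReading:
    """A raw value plus the minimum context that gives it meaning."""
    value: float          # the raw measurement, e.g. 72.0
    unit: str             # without this, 72 is ambiguous (degF? degC?)
    asset: str            # where it was measured
    timestamp: datetime   # when it was measured
    setpoint: Optional[float] = None  # the desired value, if any

    def deviation(self) -> Optional[float]:
        """How far the measurement is from the desired value."""
        if self.setpoint is None:
            return None
        return self.value - self.setpoint

# The bare number 72 tells you nothing; this record tells a story:
# the cooling loop is running 4 degF above its setpoint.
reading = ContextualizedReading(
    value=72.0, unit="degF", asset="cooling-loop-2",
    timestamp=datetime(2023, 1, 15, 8, 30, tzinfo=timezone.utc),
    setpoint=68.0,
)
```

A real data model would add many more dimensions of context (rate of change, product, batch, operator), but the design choice is the same: context travels with the value instead of being reconstructed later.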
The manufacturing data strategy must include planned uses. There are many ways that companies can create value from trusted data. Some companies want to use the data to help their front-line manufacturing workers make better decisions and be more productive and safe. This might take the form of operator interfaces on machines, workflows or augmented reality experiences on mobile devices, or centralized dashboards that provide a shared source of visible truth of real-time manufacturing activity. Other companies want to make the data available and accessible to enterprise systems while abstracting out all of the complexity and domain expertise normally required to effectively use manufacturing data across the enterprise. Low-code or no-code application development environments promise the rapid creation of special purpose software programs that are unique to a specific company, plant, or even manufacturing cell without requiring the skills of scarce and expensive professional software developers. Many companies are also looking to take advantage of advanced data analytics and machine learning algorithms. The insights provided by such tools can be truly remarkable when the underlying data is contextualized and trustworthy. All of these uses of the production system data are new and substantial value creation opportunities.
Further, the strategy identifies the optimum locations for the data between the edge and the cloud, considering costs, performance, scalability, and accessibility. The organization must also decide how it will govern and manage that data to ensure its fidelity and security. After all, data that can't be trusted is worse than no data at all.
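One way to make the edge-versus-cloud decision tangible is to write it down as an explicit placement policy. The thresholds and tier names below are invented for illustration, not an industry standard; the point is that the tradeoffs become reviewable once they are stated as rules.

```python
def placement_tier(sample_hz: float, retention_days: int,
                   max_latency_ms: float) -> str:
    """Decide where a data stream should live (illustrative policy only).

    High-frequency or latency-critical streams stay at the edge, with
    aggregates forwarded upward; long-retention streams go to cheaper
    cloud archive storage; everything else lands in hot cloud storage.
    """
    if sample_hz >= 100 or max_latency_ms <= 50:
        return "edge"           # control-loop speeds: keep it local
    if retention_days > 365:
        return "cloud-archive"  # years of history: optimize for cost
    return "cloud-hot"          # general analytics and dashboards

# A 1 kHz vibration stream stays at the edge; energy totals kept for
# a ten-year audit trail go to archive; daily quality metrics go to
# hot cloud storage for dashboards.
```

In practice these rules would be driven by measured costs and service-level requirements rather than fixed constants, but encoding them makes the governance decision explicit and testable.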
Overall, the strategy puts data at the center of the production system and creates a powerful reservoir that can be tapped for many new sources of value. By holistically harnessing the combined power of data, advanced technologies such as AI and ML, and human expertise, manufacturers can optimize their entire operations. This delivers tangible benefits and business outcomes for the organization, with actionable insights across the manufacturing lifecycle – from the design of new production system elements to shipping the finished product.
Optimizing outcomes with the cloud
Beyond a data strategy, many companies are looking for additional ways to drive more value from manufacturing. Increasingly, that means implementing and using a set of operations management applications such as a manufacturing execution system (MES), quality management system (QMS), supply chain management system (SCM), and asset performance management system (APM). These applications provide a planning and execution environment that coordinates activities, creates manufacturing records, and provides high levels of visibility across the spectrum of manufacturing activities.
Historically, those applications have been heavyweight on-premises software deployments requiring the provisioning of on-site, server-class IT infrastructure, long deployment programs, heavy customization, and dedicated IT professionals to manage and maintain the installation. Those barriers to entry prevented many companies from fully embracing the value these applications provide.
But of course, in manufacturing as in every other part of the enterprise, the cloud changes everything. Using cloud-native, multitenant, highly available, and secure SaaS offerings for operations management, more and more companies can quickly implement and get value from these applications. Even companies that have deployed MES and related applications to some of their top-tier plants can now scale out those benefits across the rest of the manufacturing fleet.
About the Author:
Brian Shepherd is senior vice president of software and control at Rockwell Automation.