Earlier this week, I shared my views about the future of data in PLM. Check this out here. Data is a foundational block for every organization and the ultimate source of intelligence that manufacturing companies can use to improve their processes and products.
Data is an interesting thing. Everyone in a company needs the data and relies on it, but when it comes to data management, nobody wants to own it. Let me put it differently: nobody takes responsibility for holistic data management in the organization. The result is data silos, disorganized data, mistakes, a lot of data synchronization between different systems, and many other inefficiencies.
In my recent articles, I shared a perspective on the evolution of PLM data architecture over the last few decades.
The data architecture moved from files to networks to databases to cloud-hosted storage. What was remarkable over all these years of PLM development was the strong association between the database and the PLM system. The limitations of technologies and on-premise architecture were an obvious reason why it happened. SQL databases were the dominant technology for managing data, and PLM companies used the databases that had IT approval.
Another interesting aspect of PLM systems was the tight connection between data organization, PLM functions, and user experience. The flexibility of data schemas was a key element of PLM systems, and often the so-called "PLM editors" were applications exposing the data structure in one way or another. Fundamentally, there is nothing wrong with that, but it brought a few issues over time.
Files and Granularity
CAD files, the original source of most product data, are becoming very inefficient. Files are hard to manage, store, and share. The low granularity of files creates many difficulties when data must be changed partially, because the file boundary is artificial. Organizations need to escape from files, but doing so requires a different approach to data management.
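To make the granularity problem concrete, here is a minimal Python sketch. All item names and structures are hypothetical, not a real PLM schema: when the whole assembly lives in one file, changing a single value means rewriting and re-sharing the whole blob; when each part is an independent record, a partial change touches exactly one record.

```python
import json

# File-based approach: the whole assembly lives in one blob.
# Changing one part's mass means rewriting the entire file.
assembly_file = json.dumps({
    "assembly": "BIKE-001",
    "parts": [
        {"id": "FRAME-01", "mass_kg": 1.8},
        {"id": "WHEEL-01", "mass_kg": 0.9},
        {"id": "SEAT-01", "mass_kg": 0.4},
    ],
})

def update_mass_in_file(blob: str, part_id: str, mass: float) -> str:
    """Rewrite the entire file just to change one value."""
    data = json.loads(blob)
    for part in data["parts"]:
        if part["id"] == part_id:
            part["mass_kg"] = mass
    return json.dumps(data)  # the whole blob is regenerated

# Granular approach: each part is an independent record keyed by id.
# A partial change touches exactly one record, which is easy to
# version, lock, and synchronize on its own.
part_records = {
    "FRAME-01": {"mass_kg": 1.8},
    "WHEEL-01": {"mass_kg": 0.9},
    "SEAT-01": {"mass_kg": 0.4},
}

part_records["WHEEL-01"]["mass_kg"] = 0.95  # only one record changes
```

The artificial file boundary is visible in `update_mass_in_file`: the function must parse and re-serialize everything, even though only one number changed.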
PLM business models are heavily dependent on the data located in the systems and, unfortunately, rely on the ability to preserve the data, grow the scope of the data, and upsell additional applications. Ease of integration was always a problem for PLM and other enterprise systems. The hunt for data locked in the systems was a fundamental business strategy. Ownership of the data (data records) is still a significant business advantage.
It is technically impossible to put all the data into a single database. Although some companies have tried to implement it, such a strategy never really worked. A modern manufacturing environment brings the need to link data between multiple systems and organizations, including manufacturing companies, contractors, suppliers, service organizations, and customers. A single database is the wrong technology to make it happen.
Moving to a new type of data architecture raises the question of rethinking old paradigms. Data is now distributed across multiple services, and thanks to the nature of modern technologies, this distribution can actually improve data availability. Web services and REST APIs can make data easily available and accessible. But it requires more conceptualization and future openness.
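Here is a sketch of the linked-data idea behind REST-style availability. The two "services" are simulated in memory, and every URL, identifier, and field is made up for illustration: instead of copying supplier data into the PLM record and synchronizing it, the record references the other service's data by URL and resolves the link on demand.

```python
# Simulated services; in reality each dictionary would be a separate
# system exposing its records at stable REST URLs. All URLs and
# schemas below are hypothetical.
SERVICES = {
    "https://plm.example.com/items/FRAME-01": {
        "id": "FRAME-01",
        "description": "Bike frame",
        # Instead of duplicating supplier data, the item links to it.
        "supplier": "https://erp.example.com/suppliers/ACME",
    },
    "https://erp.example.com/suppliers/ACME": {
        "id": "ACME",
        "name": "ACME Tubing Co.",
        "lead_time_days": 14,
    },
}

def get(url: str) -> dict:
    """Stand-in for an HTTP GET against a service's REST API."""
    return SERVICES[url]

# Follow the link on demand: no synchronization job, no stale copy.
item = get("https://plm.example.com/items/FRAME-01")
supplier = get(item["supplier"])
```

The design choice worth noticing is that the link itself is data: the PLM item stays small and authoritative for its own attributes, while supplier attributes remain owned by the ERP side.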
Altogether, this brings me to the need for PLM data conceptualization and for disconnecting PLM data from a single database. To make it happen, organizations need to focus on two parallel activities: (1) rethinking data management architecture, including moving to modern data architecture, microservices, and databases, and accepting polyglot persistence models. Databases are no more than useful tools, and modern PLM systems can use multiple tools. (2) switching from "database" thinking to "data" thinking. What is important is to make data holistic, managed across multiple data services, databases, and tools (including cloud data services).
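The polyglot persistence idea can be sketched in a few lines. Both stores below are simulated with in-memory classes and all names are illustrative: one logical product data service routes item attributes to a document-style store and product structure to a graph-style store, each tool chosen for what it does best.

```python
class DocumentStore:
    """Stand-in for a document database holding item attributes."""
    def __init__(self):
        self._docs = {}
    def put(self, key, doc):
        self._docs[key] = doc
    def get(self, key):
        return self._docs[key]

class GraphStore:
    """Stand-in for a graph database holding product structure."""
    def __init__(self):
        self._edges = []
    def link(self, parent, child):
        self._edges.append((parent, child))
    def children(self, parent):
        return [c for p, c in self._edges if p == parent]

class ProductDataService:
    """One logical service; each kind of data lands in the right tool."""
    def __init__(self):
        self.docs = DocumentStore()
        self.graph = GraphStore()
    def add_item(self, item_id, attrs):
        self.docs.put(item_id, attrs)
    def add_usage(self, parent_id, child_id):
        self.graph.link(parent_id, child_id)
    def bom(self, parent_id):
        # Combine both stores to answer one question: structure from
        # the graph, attributes from the documents.
        return [(c, self.docs.get(c)) for c in self.graph.children(parent_id)]

svc = ProductDataService()
svc.add_item("BIKE-001", {"description": "Bicycle"})
svc.add_item("FRAME-01", {"description": "Frame"})
svc.add_usage("BIKE-001", "FRAME-01")
```

This is the "data" thinking rather than "database" thinking: the caller of `bom()` never knows or cares which storage tool answered which part of the question.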
What is my conclusion?
Data is a source of intelligence and decision making. To make it work, organizations need to move to a holistic approach to data management: moving from files and databases to a broader scope of data, focusing on integration of multiple applications, data services, and storage. Special attention must be paid to moving data out of legacy sources, databases, Excel spreadsheets, and files. It will bring a new level of data-driven systems and will replace old-fashioned PLM databases. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network-based platform that manages product data and connects manufacturers and their supply chain networks. My opinion can be unintentionally biased.