I received an email with this question from the PI DX Team, an outfit that stands behind a cross-industry community of manufacturing companies pursuing successful digital transformation. PI DX works with many industrial companies and vendors to enable cross-industry collaboration and to support innovation. The email invited me to an online event (ah… I miss real events) on a fascinating topic – Is Digital Thread Doomed Without Open Architecture? Check the link if you qualify and consider joining the event tomorrow. I’m going to attend.
Digital Thread (DT) is designed to support connectivity between organizations and product information across multiple lifecycle stages. The description of DT I took from the PI DX website is amazingly similar to many descriptions of PLM in general:
The digital thread is designed to track every process through a product’s life from initial development, manufacturing, servitization and is the ultimate communicative framework for collaboration.
The following passage brings a bit more pragmatism into the DT space, but again amazingly repeats one of the big goals of PLM: to integrate data from multiple organizations and supply chains:
Across organisations and their supply chains, IT architectures must integrate data from systems usually customised by role, product, vendor or organisation. This drastically hampers the ability to integrate and share data in real time- massively obstructing the digital thread.
Open is a cool and welcoming word. But what does “open architecture” actually mean? PI DX used the term, and it demands an explanation. What architecture can be considered open, and what are the criteria? Here is what Gartner says about open architecture:
Open architecture is a technology infrastructure with specifications that are public as opposed to proprietary. This includes officially approved standards as well as privately designed architectures, the specifications of which are made public by their designers.
I found a very pragmatic definition in the SEI blog post – When and Where To Be Closed. From a practical standpoint, the DoD defines an Open Software Architecture (OSA) by the following three criteria:
- It is modular, being decomposed into architectural components that are cohesive, loosely coupled with other components (and external systems), and encapsulate (hide) their implementations behind visible interfaces.
- Its key interfaces between architectural components conform to open interface standards (that is, consensus based, widely used, and easily available to potential users).
- Its key interfaces have been verified to conform to the associated open interface standards.
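The three OSA criteria above are abstract, so here is a minimal Python sketch of what the first one (modular components that hide their implementations behind visible interfaces) looks like in code. All names here – `PartRepository`, `InMemoryRepository`, `describe` – are hypothetical illustrations, not APIs from any real PLM product:

```python
from abc import ABC, abstractmethod

# A "visible interface" in the OSA sense: callers see only this contract,
# never the implementation behind it. (Hypothetical names for illustration.)
class PartRepository(ABC):
    """Encapsulates part storage behind an abstract interface."""

    @abstractmethod
    def get_part(self, part_number: str) -> dict:
        ...

class InMemoryRepository(PartRepository):
    """One interchangeable implementation; a vendor database could be
    swapped in without changing any calling code."""

    def __init__(self):
        self._parts = {"BOLT-01": {"name": "Hex bolt", "material": "steel"}}

    def get_part(self, part_number: str) -> dict:
        return self._parts[part_number]

def describe(repo: PartRepository, part_number: str) -> str:
    # Loosely coupled: this function depends only on the interface,
    # not on any concrete storage implementation.
    part = repo.get_part(part_number)
    return f"{part_number}: {part['name']} ({part['material']})"

print(describe(InMemoryRepository(), "BOLT-01"))  # BOLT-01: Hex bolt (steel)
```

The point of the sketch is the dependency direction: `describe` knows nothing about where the data lives, which is exactly the property that makes components interchangeable.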
The following Wikipedia article for Open Platform provides another good explanation:
In computing, an open platform describes a software system which is based on open standards, such as published and fully documented external application programming interfaces (API) that allow using the software to function in other ways than the original programmer intended, without requiring modification of the source code. Using these interfaces, a third party could integrate with the platform to add functionality. The opposite is a closed platform.
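The key phrase in the Wikipedia definition is "function in other ways than the original programmer intended, without requiring modification of the source code." A toy Python sketch of that idea, with entirely hypothetical names (`Platform`, `register_exporter`), not taken from any real system:

```python
# Minimal sketch of an "open platform": the host publishes a documented
# extension point, and third-party code adds functionality without
# touching the host's source code.

class Platform:
    def __init__(self):
        self._exporters = {}

    def register_exporter(self, fmt: str, fn):
        """Published API: third parties register new export formats."""
        self._exporters[fmt] = fn

    def export(self, fmt: str, data: dict) -> str:
        return self._exporters[fmt](data)

platform = Platform()

# Third-party code: adds an export format the original programmers never wrote.
platform.register_exporter(
    "csv", lambda d: ",".join(f"{k}={v}" for k, v in d.items())
)

print(platform.export("csv", {"part": "BOLT-01", "qty": 4}))  # part=BOLT-01,qty=4
```

A closed platform is the opposite: the list of export formats is hard-coded inside the vendor's product, and adding one means waiting for the vendor.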
I found multiple criteria to consider, but there is no strong consensus about what is open and what is not. It is more a combination of factors that can make a system more open than closed.
Now, I want to come back to the question – why is open architecture so rare in PLM? Any industrial company or manufacturing organization in the 21st century is looking at how to organize data across multiple systems, provide real-time data visibility, eliminate redundant data silos, and support seamless information and process flow. It is hard to find a person who will disagree with that statement, yet IT and PLM architectures still suffer from a lack of openness and transparency.
When it comes to PLM and related software, the biggest challenge is the way PLM infrastructure and systems encourage vertical integration instead of welcoming openness and interchangeability of services. Marc Halpern of Gartner called it the “black holes of vertical platforms.” Check my article – Can we move from black holes of vertical platforms to online data services? PLM vendors strongly encourage vertical integration, and it brings many challenges to manufacturing companies.
The PI DX invite made me think about this problem again and I’d like to share my top 3 reasons why open architecture is rare in PLM environments.
1- Business Model and Data Locking
Remember the famous movie Twins with Arnold Schwarzenegger and Danny DeVito? Money talks, bullshit walks. It means that cheap talk will get you nowhere, while money will persuade people to do as you like. How is this related to PLM openness? Very simple. The cheap talk is all the discussions about the openness of PLM architectures, APIs, transparency, participation in various standards organizations, or even joining ISO standardization groups. All these things are cheap because there is an elephant in the room: the PLM vendor’s business model. And the PLM business model means data locking. Otherwise, competitors and vendors of adjacent solutions would be able to capture the data and expand their solutions further, and nobody really wants that to happen. As a result, we have what we have – PLM systems have some APIs and some modularity, but the integrations are not simple, and they keep customers in the closed circle of tools provided by a single vendor, because it is a rare situation when a vendor benefits from giving the data away.
2- Incompatible and proprietary CAD formats and data
CAD is a big part of the PLM foundation. CAD systems were developed over a long time, and vendors were historically very protective of their data. Bluntly speaking, the CAD file was a means of defense, used to prevent the development of better interoperability standards and open architectures. Beyond file formats, CAD features (especially 3D design features) are incompatible between systems, which brings another level of complexity to the data.
3- High Level of Custom Solutions and Customizations
Customization has made a substantial contribution to the problem of openness. Almost every manufacturing organization in the world fundamentally believes in its unique processes and needs, and that belief can actually produce a good outcome: if a company decides to develop a system with an open architecture, the result is an open system. However, if the decision follows the path of customizing existing solutions, including building proprietary features and interfaces, the result can be messy – solutions built on multiple layers of incompatibilities, leading to complex data structures and the inability to map and transform them.
What is my conclusion?
Digital Thread is a complex system that has no chance of being sustainable when it relies on a single system or a single vendor’s software components. The industry has made a lot of progress in improving current architectures to make them more open, but, in my view, a leapfrog will be possible only with the invention of a business model that supports openness and data sharing. This model doesn’t exist (yet). Once the new model is found, it will create a foundation for software companies to build technologies, components, and solutions that are modular and interchangeable. The advertising business model, largely invented by Google, was the foundation of many consumer solutions over the last 20 years. The PLM industry is waiting for a similar Google-like effect in the enterprise to unlock the data and make systems open and transparent. Once it happens, we will see the open architecture of future PLM become a lasting reality. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network-based platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.