
Sunday quarantine time is the best moment to reflect on the history of PLM and future trends. Lionel Grealou gave me a perfect reason to do so with his article – Towards PLM 4.0: Hyperconnected Asset Performance Management Framework. The article is available on LinkedIn here as well as via Lionel’s personal blog Virtual+Digital here. Check the article and the comments; they are worth your time.

The common theme in the discussion and comments can be summarized in the following passage:

… most companies are at PLM 1.0 or PLM 2.0. Also agree that some companies have a PLM 3.0 or 4.0 vision but don’t understand the “how” to get there.

The comments from Jos Voskuil place the emphasis on the poor understanding of PLM by C-level people in the company. Jos’ comment, especially the part about AR/VR, is exactly about that:

The challenge I see in my environment is that in particular, the “HOW” becomes too complex for C-level to understand the full chain of dependencies that need to be addressed – people, processes and tools. I agree the C-suite often feels they do not need to understand the detailed dependencies and therefore they invest in AR/VR not realizing it is not a data flow, but a recreation of data (just an example)

In my view, this is a great explanation of why the current complexity of PLM and its value proposition is damaging PLM adoption.

I took 8 points of PLM 3.0 from Lionel’s article and shared my thoughts about why PLM is stuck and cannot move forward.

1- Connecting project and program management

The majority of program managers are not familiar with PLM infrastructure and are traditionally afraid to use or integrate with the PLM tool, which is considered purely “engineering”. On the other side, PLM vendors introduced their own project management tools, which increased the distance even more.

2- Functional integrations (mechanical + electrical and software)

The progress is mostly made via acquisitions and the merging of products. PLM companies acquired vendors to mix MCAD and ECAD/PCB tools (e.g. Siemens + Mentor Graphics, Autodesk + Eagle) and a few others, such as the integration between Altium and Solidworks into Solidworks PCB. I expect things will be getting better. The integration is the most critical part of such bundles in the future.
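To make the integration point more concrete, here is a minimal sketch in Python of the kind of cross-domain reconciliation an MCAD+ECAD bundle has to automate: merging mechanical and electronic part lists by part number and flagging overlaps for review. The data structures and part numbers are purely hypothetical and are not tied to any vendor API.

```python
# Hypothetical example: reconcile MCAD and ECAD part lists by part number.
# Data and part numbers are illustrative, not tied to any vendor API.

from dataclasses import dataclass

@dataclass
class BomLine:
    part_number: str
    description: str
    quantity: int

mcad_bom = [
    BomLine("ENC-100", "Plastic enclosure", 1),
    BomLine("PCB-200", "Main board (bare)", 1),
    BomLine("SCR-M3", "M3 screw", 4),
]

ecad_bom = [
    BomLine("PCB-200", "Main board (assembled)", 1),
    BomLine("R-10K", "10k resistor", 12),
    BomLine("U-MCU1", "Microcontroller", 1),
]

def merge_eboms(mcad, ecad):
    """Merge the two domain BOMs into one engineering BOM keyed by part number."""
    merged = {line.part_number: line for line in mcad}
    for line in ecad:
        if line.part_number in merged:
            # Same part seen by both domains: flag it for review instead of guessing.
            print(f"Review needed: {line.part_number} exists in MCAD and ECAD")
        else:
            merged[line.part_number] = line
    return list(merged.values())

for line in merge_eboms(mcad_bom, ecad_bom):
    print(line.part_number, line.quantity, line.description)
```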

3- Downstream integrations + SBOM

The love for PLM drops significantly outside of engineering/R&D. There is very little trust in PLM, and the accessibility and availability of the tools downstream are limited.

4- Embedding simulation in CAD

I can see tons of improvement here. The entire Digital Twin story fits here, and I expect the progress to continue. PLM companies keep acquiring simulation businesses, and we will see more tools available soon.

5- Data alignment between PLM-ERP-MES, smart factories, and IoT

PLM-ERP integration is usually a very complex process. It is an expensive and complex piece of the product infrastructure, which is very hard to define, manage, implement, and support. New cloud tools give some hope for easier integration. MES and IoT tools are actively promoted by PLM vendors, but these projects are very new, and only the future will show their value.
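To illustrate where the complexity hides, below is a minimal sketch of only the “easy” part of a one-way PLM-to-ERP item sync: mapping released item data into the field names an ERP expects. All field names, lifecycle states, and payloads here are assumptions for illustration; real integrations have to agree on units, revisions, states, and ownership for every attribute, which is exactly what makes them hard to define and support.

```python
# Hypothetical sketch of a one-way PLM -> ERP item sync.
# Field names, states, and payloads are assumptions for illustration only.

import json

# What a PLM system might export for a released item.
plm_item = {
    "number": "PRT-00042",
    "revision": "B",
    "title": "Bracket, aluminum",
    "lifecycle_state": "Released",
    "uom": "each",
}

# Field mapping between the two systems: this table is usually where the
# real cost of PLM-ERP integration hides (units, revisions, states, owners).
PLM_TO_ERP_FIELDS = {
    "number": "MaterialNumber",
    "revision": "Revision",
    "title": "Description",
    "uom": "BaseUnitOfMeasure",
}

def to_erp_material(item: dict) -> dict:
    """Translate a PLM item into an ERP material record, only if released."""
    if item.get("lifecycle_state") != "Released":
        raise ValueError("Only released items are pushed to ERP")
    return {erp_field: item[plm_field] for plm_field, erp_field in PLM_TO_ERP_FIELDS.items()}

print(json.dumps(to_erp_material(plm_item), indent=2))
```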

6- Integration with ALM

PLM vendors have acquired some ALM companies. At the same time, the integration remains very complex, and the domains stay heavily disconnected.

7- Cloud and hosting infrastructure

All major PLM vendors have announced cloud PLM, but for the most part, it is hosting of existing tools with a very questionable business model. New multi-tenant data management systems can solve these problems by providing flexible infrastructure and value for cloud PLM, but they are still too early in their development to compete with the mature functions of established PLM vendors. Give it some time.
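For readers wondering what “multi-tenant” buys compared to hosting single-tenant installs, here is a simplified sketch: every record carries a tenant identifier and every query is scoped by it, so one service and one schema serve many customers. The schema and data below are illustrative assumptions, not any vendor’s actual design.

```python
# Simplified sketch of tenant-scoped data access in a multi-tenant service.
# Schema and data are illustrative assumptions, not a specific vendor's design.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (tenant_id TEXT, number TEXT, description TEXT)")
conn.executemany(
    "INSERT INTO items VALUES (?, ?, ?)",
    [
        ("acme", "PRT-001", "Bracket"),
        ("acme", "PRT-002", "Housing"),
        ("globex", "PRT-001", "Gear"),
    ],
)

def items_for_tenant(tenant_id: str):
    # Every query is scoped by tenant_id: one schema, many isolated customers.
    rows = conn.execute(
        "SELECT number, description FROM items WHERE tenant_id = ?", (tenant_id,)
    )
    return rows.fetchall()

print(items_for_tenant("acme"))    # [('PRT-001', 'Bracket'), ('PRT-002', 'Housing')]
print(items_for_tenant("globex"))  # [('PRT-001', 'Gear')]
```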

8- Data intelligence

I hear a lot of debate and discussion about data intelligence. It is promising, but it is too early to see results; most of them come from very specific projects.

What is my conclusion?

There are many promising technologies, but integration remains the biggest problem for manufacturing companies adopting PLM 3.0. Companies struggle to expand upstream and downstream, existing vendors are careful about changes, and very few alternatives can be seen around. Cloud architecture, new data management, and cloud infrastructure can simplify many integration challenges and unlock PLM 3.0 for future business upstream and especially downstream. Just my thoughts…

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chain. My opinion can be unintentionally biased.
