Containerization is a term that has recently started to pop up in the roadmaps of PLM vendors. What is it? Why is it happening, and what should you know about it to avoid getting confused? Let me start with some background.
What is container technology?
A container in cloud computing is a technology that supports operating-system-level virtualization. It isolates resources, processes, and dependencies. The application code can be bundled with its configuration and all the dependencies needed to run in a container. Containers are used as building blocks to achieve operational efficiency, version control, better IT productivity, and consistency. There are many advantages to using containers. Here are some typical ones: (1) consistency across computing environments; (2) version control of your applications; (3) IT productivity. There are others.
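To make the "bundling" idea concrete, here is a minimal sketch of a container image definition. Everything in it is illustrative: the base image, file names, and environment variables are hypothetical, not taken from any real PLM product.

```dockerfile
# Hypothetical example: packaging a legacy Java-based PLM web application.
# Image names, file names, and settings are illustrative only.
FROM tomcat:9-jdk11

# Bundle the application code together with its configuration
COPY plm-app.war /usr/local/tomcat/webapps/ROOT.war
COPY config/plm.properties /usr/local/tomcat/conf/plm.properties

# Externalize environment-specific settings so the same image
# runs unchanged in dev, test, and production
ENV PLM_DB_HOST=db.example.internal \
    PLM_DB_PORT=1521

EXPOSE 8080
```

Once built, this image runs identically on a developer laptop, a test server, or a cloud cluster, which is exactly the consistency benefit described above.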
What problems do containers solve?
A container is a solution to the problem of getting software to run reliably when moved from one computing environment to another. What does "computing environment" mean here? Think about a PLM system developed back in the 2000s. Originally, it was supposed to be downloaded, installed on your desktops or servers by your IT department, and configured for your environment. Only after that was done did you actually start the PLM implementation stage: translating your company's business needs into the data and processes supported by the application.
Containers, IT and cloud
The modern cloud environment changed everything. Industrial companies and their IT departments are looking at how to optimize their processes, servers, infrastructure, and computing costs. Containerization is common practice for most IT departments these days. When it comes to PLM, IT wants systems that have been running for 20-25 years to support modern container infrastructure. That is the real reason PLM vendors are taking their old systems and containerizing them. You can see signs of every PLM vendor with an old product architecture doing so; check Aras, Oracle Agile PLM, Teamcenter, Windchill, and others. All PLM vendors are placing containerization on their roadmaps.
The container is an important instrument in cloud architecture and deployment. It helps organize a variety of computing environments and is widely used by cloud application providers as part of their SaaS infrastructure.
Containerizing Legacy App
Another interesting topic is the containerization of legacy applications. A good article on the IBM website gives an idea of the pros, cons, and approaches to containerizing legacy software. PLM vendors are spending resources on containerization, primarily driven by the demands of large customers looking to optimize their IT.
What problems cannot be solved by containers?
The magic word "containerization" can solve many IT problems, but it won't solve the problem of old PLM data architectures. Even if you take an existing app and containerize it, the same data management principles will stay. You can even containerize the database access and switch the application to a database-as-a-service approach. But it won't change the fundamentals: it will be the same PLM system with the same single database, now running in containers. The application itself won't change. The user experience will be the same, and most importantly, the data management and tenancy model won't change either. This means you will get the same system running on a modern cloud stack, but doing the same things it was doing 20 years ago.
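The point is easy to see in a deployment sketch. In this hypothetical Compose file (all service and image names are made up), the legacy application and its single database simply become two containers; the data model and single-tenant architecture are exactly what they were before.

```yaml
# Hypothetical sketch of a containerized legacy PLM system.
# Service names, image tags, and paths are illustrative only.
version: "3.8"
services:
  plm-app:
    image: legacy-plm:2004-containerized   # same application code, new packaging
    ports:
      - "8080:8080"
    environment:
      PLM_DB_HOST: plm-db                  # still one shared database per installation
    depends_on:
      - plm-db
  plm-db:
    image: enterprise-rdbms:19c            # the same single relational database
    volumes:
      - plm-data:/var/lib/db/data          # single-tenant data store, unchanged
volumes:
  plm-data:
```

Nothing in this file changes the application's tenancy model or data management; it only changes how the same two pieces are deployed.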
What is my conclusion?
Containerization is an important step to keep your PLM legacy alive and avoid getting fired as a result of IT modernization. Containerizing an existing PLM system frees you from dependence on complex hardware installations and IT migrations. You can run the PLM system in multiple environments for testing and easily spin up multiple installations when needed. However, don't expect containerization to solve the existing problems of PLM systems, such as scalability and collaboration. These aspects are inherent to the fundamental PLM system and data architecture, and they won't change as a result of moving to containers. Containers won't turn your two-decade-old PLM system into a modern SaaS application. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network-based platform that manages product data and connects manufacturers and their supply chain networks. My opinion can be unintentionally biased.
The post Old PLM Systems, Containerization, and SaaS Trajectories appeared first on Beyond PLM (Product Lifecycle Management) Blog.