Earlier today, my attention was caught by the PLM Stack article – Should you care about the programming language used in your PLM stack? Great question! Thanks, Yoann Maingon, for asking. He gives an excellent review of various programming languages and the purposes they can serve.
By analyzing the fit between languages and specific tasks, he concludes that PLM is a web application that calls for specific skills. Here is my favorite passage:
So how does a PLM application fit in these application type? We already said that today a PLM solution has to be a web solution. So it can actually be a good fit with multiple languages, the main need for PLM is to accept a large number of users and to integrate with various authoring tools. So scalability and API capabilities are good things to look for.
As much as I like the idea of a “favorite” language, the time of a single language is over. The article made me think about the evolution of technologies and tools used for PDM / PLM development over the course of the last 20-25 years.
1990s… Single Databases. Single Language or Framework
Back in the 1990s, the focus of PDM (this was before PLM) technologies was on the database. The debates of those days were between using a “standard” RDBMS or a proprietary database. Some PDM tools were still using file systems, sometimes because of cost and sometimes because of complexity. Relational databases won, and it happened mostly because of IT’s dominant position in this technology. When you need to sell a PDM solution to the IT department of a large organization, you won’t take any chances on a database that won’t be approved by IT.
The choice of programming language was less important in those days. The scalability of servers and applications was one of the most sensitive criteria when choosing one technology or tool over another. Most applications back then were two-tier.
2000s… Web. Polyglot Programming
As technology moved to the web, the differentiation of technologies increased. Web applications introduced three-tier or multi-tier architectures written in different languages. Server technologies were separated from the client, and especially from web/browser-dependent tools. It was the time when the single-language idea was eliminated. Interoperability between frameworks and various component technologies arrived, and language selection became less important.
Database technologies remained the same, with the dominance of RDBMSs like Oracle and SQL Server. Even for web applications, it was commonly accepted to use a single database as the foundation of a PLM system (and other enterprise systems).
2010s… Cloud. Polyglot Persistence.
The changes in the technology stack started to come from the side of global web applications and cloud technologies. Three things happened: (1) tons of new data management technologies were developed as a result of massive web development; (2) multiple databases began to be used at the same time (polyglot persistence); (3) cloud technologies eliminated the need for IT to be in charge of databases and servers.
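To make the idea of polyglot persistence concrete, here is a minimal, purely illustrative sketch (not taken from any real PLM product): fixed, queryable part fields go to a relational store (an in-memory SQLite database), while flexible, schema-less attributes go to a document-style store (a plain dict standing in for something like MongoDB). The `PartRepository` class and its fields are hypothetical names for illustration only.

```python
import sqlite3

class PartRepository:
    """Illustrative polyglot persistence: two stores behind one repository."""

    def __init__(self):
        # Relational side: fixed schema, good for structured queries.
        self.rdb = sqlite3.connect(":memory:")
        self.rdb.execute("CREATE TABLE part (id TEXT PRIMARY KEY, name TEXT)")
        # Document side: schema-less, per-part attributes
        # (a dict here; a document database in a real system).
        self.docs = {}

    def save(self, part_id, name, attributes):
        # Structured fields go to the relational table...
        self.rdb.execute("INSERT INTO part VALUES (?, ?)", (part_id, name))
        # ...ad-hoc attributes go to the document store.
        self.docs[part_id] = attributes

    def load(self, part_id):
        # Reassemble one logical record from both stores.
        row = self.rdb.execute(
            "SELECT name FROM part WHERE id = ?", (part_id,)
        ).fetchone()
        return {"id": part_id, "name": row[0], **self.docs.get(part_id, {})}

repo = PartRepository()
repo.save("P-100", "Bracket", {"material": "6061-T6", "finish": "anodized"})
print(repo.load("P-100"))
```

The point of the sketch is the design choice, not the code itself: each kind of data lives in the store best suited for it, and the application layer stitches the pieces back together.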
What is my conclusion?… And, most importantly, what trends should we expect in the PLM technology stack in the 2020s? Cloud is here, and it will lead the tech-stack choices for PLM applications. PLM applications are growing beyond a single company. The network paradigm is replacing the single-company paradigm. Polyglot persistence, multi-tenancy, and microservices architecture will take a dominant position in PLM development in the next decade. We will see the use of multiple databases, multiple languages, tools, and frameworks. A large number of PLM applications were developed 15-25 years ago, and the question of what PLM tech stack a company should be looking for is very important. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.