A few years ago, I wrote about how data can become a driver to improve user experience in PLM systems. Check out my article – PLM data driven user experience. My favorite example of a data-driven user experience comes from Amazon.
As Bezos sees it, the success of electronic retailers will depend on their ability to analyze each customer’s tastes and create unique experiences from the moment they walk in the virtual door. “If we have 4.5 million customers, we shouldn’t have one store,” he says. “We should have 4.5 million stores.” Amazon was one of the first Internet sites to customize pages for each registered customer, offering recommendations for books and music through a process called “collaborative filtering.” Using mathematical formulas, it arrives at predictions by comparing a customer’s previous purchases and stated preferences (click “Not for Me” to weed out lousy suggestions) with the preferences of other people who bought the same titles.
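The "collaborative filtering" process described above can be sketched in a few lines. This is a minimal, illustrative toy, not Amazon's actual system: the purchase data is made up, and the user-to-user cosine similarity over purchase sets is just one simple way to compare customers. Note the `not_for_me` parameter, a hypothetical stand-in for the "Not for Me" feedback mentioned in the quote.

```python
from collections import defaultdict
from math import sqrt

# Hypothetical purchase history: user -> set of purchased titles.
purchases = {
    "alice": {"book_a", "book_b", "book_c"},
    "bob":   {"book_a", "book_b", "book_d"},
    "carol": {"book_b", "book_c", "book_e"},
}

def similarity(u, v):
    """Cosine similarity between two users' purchase sets."""
    overlap = len(purchases[u] & purchases[v])
    return overlap / (sqrt(len(purchases[u])) * sqrt(len(purchases[v])))

def recommend(user, not_for_me=frozenset()):
    """Score titles the user does not own by how similar their buyers are."""
    scores = defaultdict(float)
    for other in purchases:
        if other == user:
            continue
        sim = similarity(user, other)
        # Candidate titles: owned by the similar user, not by us, not rejected.
        for title in purchases[other] - purchases[user] - not_for_me:
            scores[title] += sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("alice"))  # titles ranked by similar users' purchases
```

Real systems replace the purchase sets with ratings or implicit-feedback matrices and scale the neighborhood search, but the core idea — weight other customers' choices by how similar they are to you — is the same.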
Fast forward to 2019. My attention was caught by news from Siemens PLM about NX and machine learning applied to improve the user experience. Check this one – Siemens updates NX Software with Artificial Intelligence and Machine Learning to increase productivity.
Siemens announced today an expansion of the Digital Innovation Platform with the introduction of the latest version of NX software, which has been enhanced with machine learning (ML) and artificial intelligence (AI) capabilities. These new features can predict next steps and update the user interface to help users more efficiently use software to increase productivity. The ability to automatically adapt the user interface to meet the needs of different types of users across multiple departments can result in higher adoption rates, leading to a higher-quality computer-aided technology (CAx) system and the creation of a more robust digital twin.
Here is an interesting snippet about the new NX functionality, published by Al Dean in a Develop3D article.
Essentially, you begin a modeling or detailing activity in NX, then the system knows, through analysis and prediction (the machine learning part) what you’re most likely to do next, so brings up the most commonly used operations or commands.
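To make the idea concrete, here is a minimal sketch of one way "predict what you're most likely to do next" can work: a bigram frequency model over a command log. This is my own illustration with made-up command names; Siemens has not published how NX's ML actually works, and it is surely far more sophisticated than this.

```python
from collections import Counter, defaultdict

# Hypothetical log of commands a user executed, in order.
history = [
    "sketch", "extrude", "fillet",
    "sketch", "extrude", "chamfer",
    "sketch", "extrude", "fillet",
]

# Count which command tends to follow which (a simple bigram model).
transitions = defaultdict(Counter)
for prev, nxt in zip(history, history[1:]):
    transitions[prev][nxt] += 1

def predict_next(command, k=3):
    """Return the k commands most often observed after `command`."""
    return [cmd for cmd, _ in transitions[command].most_common(k)]

print(predict_next("extrude"))
```

A UI built on top of this would surface `predict_next(current_command)` as the most prominent toolbar items — exactly the "brings up the most commonly used operations" behavior described above, just in its simplest possible form.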
What is my conclusion? Web companies have been using data for decades, analyzing what users do and optimizing user experience and functions. It is time for this functionality to come to engineering and manufacturing software. User experience, part purchasing, contractor selection… these are only a few examples where I can see machine learning and AI shining in the next few years. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.