By Rainer Zeifang
Companies have more experience with artificial intelligence (AI) than you would expect. This was one of the insights gained from PROSTEP’s webinar on the benefits of AI in product development, which was attended by over 50 people. The participants see the greatest potential for AI in requirements management, process automation and information research.
The webinar focused on the question of how AI can be put to good use in product development. PROSTEP's two AI experts began by explaining to attendees the challenges that companies are trying to master with the help of AI. These include the increasing complexity of products and development processes, the resulting flood of data, and growing time and cost pressures. At the same time, an aging population and a growing shortage of trained specialists mean that repetitive tasks need to be automated. AI enables more knowledge-based, data-driven product development, which speeds up the launch of products onto the market and boosts competitiveness.
In addition to information research, the application areas in which companies expect AI to deliver the greatest benefit include process automation and requirements management. This was the conclusion drawn from one of the Mentimeter surveys conducted among the webinar attendees after PROSTEP's experts had presented a number of practice-oriented use cases from product development. They demonstrated how the quality of requirements can be checked with the help of a chatbot that combines company-specific criteria with generative language models (retrieval-augmented generation, or RAG). Quality control in requirements engineering is important to avoid development errors caused by incomplete or unclear requirements.
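To make the RAG pattern described above more concrete, the following Python sketch shows how a requirement could be checked against retrieved company-specific criteria. It is an illustration only, not PROSTEP's implementation: the criteria catalogue, the toy retrieval step and the call_llm stub are all invented for the example.

```python
# Minimal sketch of a RAG-style requirements quality check (illustrative only).
from difflib import SequenceMatcher

# Company-specific quality criteria; in practice these would be retrieved
# from a curated knowledge base rather than hard-coded.
CRITERIA = [
    "A requirement must be atomic: it describes exactly one verifiable property.",
    "A requirement must be unambiguous: avoid words like 'fast', 'easy', 'appropriate'.",
    "A requirement must state a measurable acceptance threshold where applicable.",
]

def retrieve_criteria(requirement: str, top_k: int = 2) -> list[str]:
    """Toy retrieval step: return the criteria most similar to the requirement text."""
    scored = sorted(
        CRITERIA,
        key=lambda c: SequenceMatcher(None, requirement.lower(), c.lower()).ratio(),
        reverse=True,
    )
    return scored[:top_k]

def call_llm(prompt: str) -> str:
    """Stub for the generative language model behind the chatbot."""
    return "(model response would appear here)"

def check_requirement(requirement: str) -> str:
    """Combine retrieved criteria and the requirement into a single review prompt."""
    context = "\n".join(retrieve_criteria(requirement))
    prompt = (
        "Check the following requirement against the company criteria.\n"
        f"Criteria:\n{context}\n\n"
        f"Requirement: {requirement}\n"
        "List any violations and suggest an improved wording."
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(check_requirement("The system should respond quickly to user input."))
```

In a production setup, the retrieval step would typically query an embedding-based index of the company's requirements guidelines, and the stub would be replaced by the actual language model endpoint.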
The chatbot not only checks the quality of the requirements but also optimizes their structure, which makes them easier to process further in requirements management systems. In another use case, the experts demonstrated how the optimized text requirements can be used to automatically generate a system model in SysML V2 format. The RAG-based chatbot is also able to derive test cases directly from the requirements and link them to those requirements so that they can later be used, for example, to perform impact analyses when changes are made.
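The benefit of linking generated test cases back to their requirements is essentially a traceability question. The sketch below illustrates the idea with a minimal data structure; the IDs, texts and the impacted_tests helper are invented for illustration and do not reflect any specific requirements management system.

```python
# Illustrative sketch: trace derived test cases back to their requirements
# so that a change to a requirement triggers an impact analysis.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str

@dataclass
class TestCase:
    test_id: str
    description: str
    derived_from: list[str] = field(default_factory=list)  # linked requirement IDs

def impacted_tests(changed_req_id: str, tests: list[TestCase]) -> list[TestCase]:
    """Return all test cases that trace back to the changed requirement."""
    return [t for t in tests if changed_req_id in t.derived_from]

reqs = [Requirement("REQ-001", "The battery shall charge to 80% within 30 minutes.")]
tests = [
    TestCase("TC-001", "Charge from 10% to 80% and verify elapsed time <= 30 min.",
             derived_from=["REQ-001"]),
    TestCase("TC-002", "Verify charging stops automatically at 100%.",
             derived_from=["REQ-002"]),
]

# If REQ-001 changes, only the linked test cases need to be reviewed.
for t in impacted_tests("REQ-001", tests):
    print(t.test_id, "->", t.description)
```

In practice, the chatbot would produce the test case texts, while the links themselves would be stored in the requirements management or ALM system so that impact analyses stay up to date as requirements evolve.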
However, the real challenge does not lie in developing a chatbot but in understanding how the processes that the chatbot is meant to optimize actually work. This is where PROSTEP and its subsidiary BHC come into play, as together they bundle the skills that are key to implementing and using AI efficiently. Our many years of experience in PLM strategy consulting and our in-depth knowledge of the automotive, aerospace, mechanical and plant engineering, and shipbuilding industries are particularly helpful in this context. PROSTEP can therefore give companies the best possible support not only in selecting and implementing PLM systems but also in embedding AI applications in their processes, methods and tools. Our services range from developing an AI strategy for suitable use cases, through the implementation of AI chatbots and agents for providing information and automating processes, to customized AI solutions and the automatic generation of test cases.
PROSTEP provides its customers with support – from the initial idea through to the successful implementation of AI solutions. The targeted use of AI means that PROSTEP is able to relieve the burden placed on specialists and provide consistent, up-to-date information. This speeds up processes considerably, thus making product development more agile and shortening the time-to-market over the long term.
During the Q&A session at the end of the webinar, one of the questions raised was how to ensure that personal data used to train AI does not end up in the cloud. PROSTEP's AI experts recommended anonymizing personal data, or deleting it entirely, as part of data preparation. Alternatively, smaller language models can be operated locally (on premises) to meet data protection requirements.
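As a rough illustration of the data preparation step mentioned above, the following sketch pseudonymizes obvious personal identifiers before text is used for training or sent to an external service. The regular expressions are deliberately simple examples; real projects would rely on a dedicated anonymization tool and reviewed entity lists.

```python
# Minimal sketch of pseudonymizing personal data during data preparation.
import re

# Simple example patterns for personal identifiers; not exhaustive.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s/-]{7,}\d"),
}

def pseudonymize(text: str) -> str:
    """Replace matched identifiers with placeholder tokens."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"<{label}>", text)
    return text

record = "Reported by jane.doe@example.com, reachable at +49 6151 12345."
print(pseudonymize(record))
# -> "Reported by <EMAIL>, reachable at <PHONE>."
```

Whether pseudonymization of this kind is sufficient, or whether data must be deleted or processed only on premises, remains a case-by-case decision that depends on the applicable data protection requirements.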