BPM & Large Language Models - Integration of the latest technology in your automated processes
Jul 26, 2024

Large Language Models such as ChatGPT by OpenAI have been in the spotlight since 2023 and offer a variety of use cases with impressive results. Their development is advancing rapidly: billions of US dollars are being invested in further refining these models, which is why we can expect more and more companies to put them to profitable use.
However, significant productivity gains and efficiency boosts in organizations can only be achieved when artificial intelligence is seamlessly embedded in robust, (semi-)automated business processes.
Large Language Models in Combination with Process Automation
We can think of a Large Language Model (LLM) as a giant brain that has learned from vast amounts of publicly available data. In addition, we can provide the model with internal company data that it should incorporate into its “answers.” In an automated process, we communicate with the LLM via an API. Recently, it has even become possible to constrain the output to a structured data format (such as JSON), which makes it easier to process programmatically within a workflow.
All of this can be implemented, for example, in a company’s own Microsoft Azure OpenAI environment, ensuring that data is processed in compliance with the GDPR and only in the designated regional data centers.
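To make this more concrete, here is a minimal Python sketch of such an API call against an Azure OpenAI deployment, using the official openai SDK with JSON mode enabled. The endpoint, key, API version, and the deployment name "gpt-4o" are placeholders you would replace with your own values; the same pattern also works against the public OpenAI API.

```python
import json
from openai import AzureOpenAI  # pip install openai

# Placeholder connection details for a company-owned Azure OpenAI resource
client = AzureOpenAI(
    azure_endpoint="https://my-company.openai.azure.com",  # assumption: your resource endpoint
    api_key="<read-from-a-secret-store>",
    api_version="2024-06-01",                              # assumption: any recent API version
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: the name of your deployment
    # JSON mode: the model is instructed to return a single, parseable JSON object
    response_format={"type": "json_object"},
    messages=[
        {
            "role": "system",
            "content": "Classify the message and answer as JSON with the keys "
                       "'is_price_inquiry' (boolean) and 'language' (ISO 639-1 code).",
        },
        {"role": "user", "content": "Hello, could you send me a quote for 50 units of item 4711?"},
    ],
)

result = json.loads(response.choices[0].message.content)
print(result)  # e.g. {'is_price_inquiry': True, 'language': 'en'}
```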
Using a BPM tool such as FireStart, you can combine the extensive general knowledge of Large Language Models with a company’s valuable internal data and the specialized expertise of employees in standardized processes. Only then can artificial intelligence be scaled and used in a value-adding way.

BPM Process incl. OpenAI activity
Possible Applications of LLMs in Business Processes
Classification
Compared to traditional machine learning and deep learning technologies, LLMs offer two major advantages in the realm of classification:
They are ready to use immediately and don’t need resource-intensive training with large amounts of data.
They are highly flexible - classes can be changed after the fact or even generated by the LLM itself.
Examples of Classification Types:
Topic Classification: Whether it’s text, images, video, or speech, Large Language Models deliver impressive results in classifying or categorizing content. This classification can then direct how the process continues.
Sentiment Analysis: Identifying the sentiment of a text based on categories such as positive/negative/neutral.
Language Detection: Automatically identifying various languages—for example, to route inquiries to the appropriate employees.
Intent Detection: Detecting intentions, such as recognizing a customer’s goal when interacting with a chatbot, in order to trigger specific processes (see the code sketch after this list).
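As a rough illustration of intent detection, the sketch below builds a zero-shot classifier on the chat completions API. The intent labels and the helper function are purely illustrative and not tied to any specific product; note that no training data is required and the label set can be changed at any time.

```python
from openai import OpenAI  # pip install openai

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

INTENTS = ["price_inquiry", "complaint", "order_status", "other"]

def classify_intent(message: str) -> str:
    """Zero-shot intent detection: no model training, and the label set is freely adjustable."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": "Classify the customer message into exactly one of these intents: "
                           + ", ".join(INTENTS)
                           + ". Answer with the intent name only.",
            },
            {"role": "user", "content": message},
        ],
    )
    label = response.choices[0].message.content.strip()
    return label if label in INTENTS else "other"  # guard against unexpected output

print(classify_intent("Could you send me your prices for 200 units of item 4711?"))
# expected: price_inquiry
```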
Text/Data Extraction
From documents, images, free-text input, and so on.
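For illustration, the sketch below sends a scanned document image to a vision-capable model and asks for selected fields back as JSON. The file name and field names are assumptions made for this example.

```python
import base64
import json
from openai import OpenAI

client = OpenAI()

# Assumption: a scanned order form saved locally as "order_form.jpg"
with open("order_form.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o",
    response_format={"type": "json_object"},
    messages=[{
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Extract customer_name, order_date and the list of ordered items "
                        "(item_number, quantity) from this form. Respond as JSON.",
            },
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)

print(json.loads(response.choices[0].message.content))
```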
Text/Data Generation
Data Analysis: LLMs can analyze, create, and modify structured data sets (e.g., CSV or Excel files). They can also generate data visualizations, which can, for instance, be attached to forms.
Text Generation: For example, summarizing texts, correcting grammar, translating content, creating image descriptions, etc. (see the sketch after this list).
Image and Video Generation.
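A text generation step can be as simple as a single prompt. The short sketch below summarizes a longer text and translates the summary into English in one call; the model choice and prompt wording are illustrative.

```python
from openai import OpenAI

client = OpenAI()

long_text = "..."  # e.g. the body of an incoming support email

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "system",
            "content": "Summarize the following text in three sentences and translate the summary into English.",
        },
        {"role": "user", "content": long_text},
    ],
)

print(response.choices[0].message.content)  # the English summary
```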
Practical Example: Automatic Detection of a Price Inquiry and Semi-Automated Quote Generation
In this real-world example, we use the GPT-4o model in combination with the OpenAI Assistants API for a multi-step classification. A PDF containing a list of product items with item numbers and descriptions is provided to the AI assistant.
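For orientation, setting up such an assistant with the OpenAI Python SDK could look roughly like the sketch below. The file name, instructions, and vector store name are assumptions, and since the Assistants API is still evolving, exact method names may differ between SDK versions.

```python
from openai import OpenAI

client = OpenAI()

# Upload the product list (assumption: a local file called "product_list.pdf")
product_file = client.files.create(file=open("product_list.pdf", "rb"), purpose="assistants")

# Put the file into a vector store so the assistant can search it
vector_store = client.beta.vector_stores.create(name="product-catalogue")
client.beta.vector_stores.files.create(vector_store_id=vector_store.id, file_id=product_file.id)

# Create the assistant that will classify incoming messages against the product list
assistant = client.beta.assistants.create(
    model="gpt-4o",
    instructions=(
        "Decide whether an incoming message is a price inquiry. If it is, extract the "
        "requested item numbers and quantities, matching them against the product list, "
        "and answer as a JSON object."
    ),
    tools=[{"type": "file_search"}],
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)
print(assistant.id)  # referenced later from the workflow's REST call
```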
The Challenge:
Detection of Price Inquiries (Classification – Intent Detection)
A company wants to automatically filter out price inquiries from a large volume of incoming messages across multiple channels (e.g., via a dedicated email address or a web form). The system should also detect which items and what quantities are being requested, and then notify the right person in the organization. If the total value of the price inquiry is over €10,000, it should be routed to the Sales Manager role; otherwise, it should go to the Key Account Manager.
Ensuring Accuracy and Ease of Editing
The employee in the relevant role should get a quick overview of the price inquiry and be able to modify the pre-filled quote data points in case of any misclassification by the AI. If everything is correct, a single click should automatically generate a quote and email it to the customer.

Process Example
The Solution:
Process Start
Whenever a new email arrives in the mailbox or a new web form submission is received, the workflow automatically starts.

External Form
Classification with the OpenAI Assistant
By using the FireStart low-code REST API activity, we can easily communicate with the assistant. We receive a JSON object in return, telling us whether it is a price inquiry. If it is, we also get a JSON array with the respective item numbers and quantities.
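The exact field names are defined by the assistant’s instructions; a response along the following lines (the names are illustrative) is straightforward to consume in the workflow.

```python
import json

# Illustrative response body from the assistant; the field names are whatever our instructions define
raw = """
{
  "is_price_inquiry": true,
  "items": [
    {"item_number": "4711", "quantity": 50},
    {"item_number": "0815", "quantity": 120}
  ]
}
"""

payload = json.loads(raw)
if payload["is_price_inquiry"]:
    for item in payload["items"]:
        print(item["item_number"], item["quantity"])
```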
Calculate Inquiry Value
We access current pricing data to determine the total value of the inquiry.
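The calculation itself is a simple sum of quantity times unit price; the sketch below uses made-up prices to illustrate it.

```python
# Illustrative unit prices; in the real process these would come from an ERP system or price database
UNIT_PRICES = {"4711": 99.90, "0815": 12.50}

items = [
    {"item_number": "4711", "quantity": 50},
    {"item_number": "0815", "quantity": 120},
]

total_value = sum(UNIT_PRICES[i["item_number"]] * i["quantity"] for i in items)
print(f"Total inquiry value: {total_value:.2f} EUR")  # 4995.00 + 1500.00 = 6495.00 EUR
```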
Notify the Right Person
Through a role model in FireStart that is connected to the company’s Active Directory, the right employee is notified depending on the value of the inquiry. We use the FireStart Form Builder and simply drag and drop the data points returned by the AI assistant into the form. The employee can view this form right in Microsoft Outlook. The fields in the table are editable, and the employee can still update the quote data if needed.
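Expressed as code, the routing rule from the example boils down to a single threshold check (the role names are taken from the scenario above); in FireStart this is modeled as a decision step in the process.

```python
def responsible_role(total_value: float) -> str:
    """Route the inquiry based on the EUR 10,000 threshold from the example."""
    return "Sales Manager" if total_value > 10_000 else "Key Account Manager"

print(responsible_role(6495.00))    # Key Account Manager
print(responsible_role(25000.00))   # Sales Manager
```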

Confirm the Quote
If the employee confirms the quote via a single button click, it is automatically sent to the requesting individual via email.

Implementing AI-Powered Processes with FireStart
In just a few steps, you can set up AI-driven processes (as shown in the example above) with FireStart and optimize them. Our BPM tool offers seamless integration with OpenAI. It helps you accelerate processes in your company and use your employees’ resources more efficiently. Book a demo and see for yourself how your BPM can benefit from AI.