---
license: apache-2.0
language:
- en
pipeline_tag: text2text-generation
tags:
- bpmn
- Business Process
---
# T5-Small Finetuned for Purchase Order Workflow Business Processes

## Model Description
This model is a fine-tuned version of T5-Small, designed to generate BPMN (Business Process Model and Notation) diagrams from textual descriptions of Purchase Order Workflow business processes. It uses language modeling to transform natural-language process descriptions into BPMN models, supporting the modernization and automation of business processes in this domain. The model is a proof of concept and is not yet ready for real-life applications.
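As a quick illustration of the intended usage, the sketch below shows how the model might be called through the transformers `pipeline` API; the repository id and the example sentence are placeholders, not values taken from this card.

```python
# Minimal usage sketch (assumption: the fine-tuned checkpoint is published on the
# Hugging Face Hub; replace the placeholder repo id with the actual one).
from transformers import pipeline

bpmn_generator = pipeline(
    "text2text-generation",
    model="<user>/t5-small-bpmn-purchase-order",  # hypothetical repo id
)

description = (
    "The purchasing department receives a purchase request, checks the available "
    "budget, and either approves or rejects the order."
)
print(bpmn_generator(description, max_length=256)[0]["generated_text"])
```

The exact prompt format and output serialization (e.g., BPMN XML versus a textual intermediate representation) depend on how the model was trained and are not specified here.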
## Key Features
- Language Model Base: T5-Small, a compact encoder-decoder model well suited to text-to-text generation tasks.
- Specialization: Fine-tuned specifically for BPMN generation in Purchase Order Workflows, improving accuracy and relevancy in business process modeling.
- Dataset: Trained on the "MaD: A Dataset for Interview-based BPM in Business Process Management" dataset, described in the research article available on IEEE Xplore.
## Applications
- Business Process Management: Automates the generation of BPMN diagrams, which are crucial for documenting and improving Purchase Order Workflow Business Processes.
- AI Research and Development: Provides a research basis for further exploration into the integration of NLP and business process management.
- Educational Tool: Assists in teaching the concepts of BPMN and AI's role in business process automation, particularly in the context of Purchase Order Workflows.
## Configuration
- Pre-trained Model: Google's T5-Small
- Training Data: The "MaD: A Dataset for Interview-based BPM in Business Process Management" dataset, used for training and validation.
- Hardware Used:
  - CPU: Apple M1 Max with 10 cores (8 performance + 2 efficiency), 3.20 GHz
  - GPU: Integrated, 32 cores
  - RAM: 64 GB LPDDR5
  - Storage: 2 TB SSD
  - OS: macOS 12.7.1 (Monterey)
- Training Script: Finetuning T5-Small BPMN
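The linked training script is the authoritative reference for how the model was produced. As a rough illustration only, the sketch below shows what a T5-Small fine-tuning loop over (description, BPMN) text pairs could look like; the example pairs, column names, and hyperparameters are assumptions, not taken from this card.

```python
# Hedged fine-tuning sketch (not the actual training script referenced above).
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
    T5Tokenizer,
)

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Placeholder pairs; in practice these would come from the MaD dataset.
pairs = [{"text": "The buyer sends a purchase order to the supplier ...",
          "bpmn": "<bpmn:process> ... </bpmn:process>"}]
dataset = Dataset.from_list(pairs)

def preprocess(example):
    # Tokenize the input description and the target BPMN serialization.
    model_inputs = tokenizer(example["text"], truncation=True, max_length=512)
    labels = tokenizer(text_target=example["bpmn"], truncation=True, max_length=512)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, remove_columns=dataset.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="t5-small-bpmn", num_train_epochs=3),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```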
## Installation and Requirements
The model can be downloaded from the Hugging Face model hub. Using it requires Python 3.6 or newer, the Hugging Face transformers library, and a machine with enough computational capacity to run inference with the T5 architecture.
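For completeness, the snippet below sketches explicit loading and generation with the T5 classes, including device placement; the repository id is a placeholder and the generation settings are illustrative assumptions.

```python
# Explicit loading sketch (assumption: placeholder repo id; adjust to the published checkpoint).
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "<user>/t5-small-bpmn-purchase-order"  # hypothetical id
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id).to(device)

text = "The supplier confirms the order and schedules the delivery."
inputs = tokenizer(text, return_tensors="pt", truncation=True).to(device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=256, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```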
## Contributors
- Pascal Poizat: Hugging Face Profile - Training hardware
- Omar El Fachati: Hugging Face Profile - Training script