---
license: mit
language:
  - en
pipeline_tag: text2text-generation
tags:
  - bpmn
  - Business Process
widget:
  - text: >-
      The following text is about the purchase order workflow. It starts with
      recording the basic details. After recording the basic details, getting
      approval from manager needs to be done. Once getting approval from manager
      occurs, creating the purchase order should be done. When creating the
      purchase order is completed, getting approval from supplier should be
      done. When getting approval from supplier is completed, confirming the
      delivery date needs to be done. The process is now completed.
    example_title: purchase order workflow
---

# TXT2BPMN Model

## Model Description

TXT2BPMN is a fine-tuned version of the T5-Small model, designed for the extraction of BPMN (Business Process Model and Notation) diagrams from textual descriptions. This AI-driven approach leverages advanced language modeling techniques to transform natural language descriptions into BPMN models, facilitating the modernization and automation of business processes.

## Key Features

- Language Model Base: T5-Small, known for its efficiency and efficacy in understanding and generating text.
- Specialization: Fine-tuned specifically for BPMN generation, improving accuracy and relevance in business process modeling.
- Dataset: Trained on 30,000 pairs of text descriptions and corresponding BPMN diagrams, providing diverse exposure to a wide range of business process scenarios.

## Applications

- Business Process Management: Automates the generation of BPMN diagrams, which are crucial for documenting and improving business workflows.
- AI Research and Development: Provides a basis for further research into the integration of NLP and business process management.
- Educational Tool: Assists in teaching BPMN concepts and the role of AI in business process automation.

## Configuration

- Pre-trained Model: Google's T5-Small
- Training Data: The "MaD: A Dataset for Interview-based BPM in Business Process Management" dataset was used for training and validation.
- Hardware Used: Intel Core i7 with 24 GB RAM and Intel HD Graphics 630.

## Usage

The model is intended for use by developers, researchers, and business analysts interested in automating BPMN generation from textual descriptions. It can be integrated into business software systems or used in standalone applications for process modeling.

## Model Performance

The model has shown promising results in terms of syntactic integrity and logical fidelity in the generated BPMN diagrams. Detailed performance metrics and evaluations are included in the full project documentation.

## Installation and Requirements

The model can be accessed and installed via the Hugging Face model hub. Requirements for using this model include Python 3.6 or newer and access to a machine with adequate computational capabilities to run inference with the T5 architecture.

## How to Use

For detailed instructions on how to integrate and use the TXT2BPMN model in your projects, please refer to the official documentation available in the model's repository on Hugging Face.
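As a minimal sketch, the model can be loaded through the Hugging Face `transformers` text2text-generation pipeline declared in this card's metadata. The repository id `fachati/TXT2BPMN` and the `max_length` value below are assumptions based on this card, not values confirmed by the official documentation:

```python
def build_input(description: str) -> str:
    # The widget example on this card feeds the raw process description
    # to the model, so no task prefix is added here.
    return description.strip()


def generate_bpmn(description: str, max_length: int = 512) -> str:
    # Deferred import so build_input stays usable without transformers installed.
    from transformers import pipeline

    # "fachati/TXT2BPMN" is an assumed repository id; check the model page.
    generator = pipeline("text2text-generation", model="fachati/TXT2BPMN")
    result = generator(build_input(description), max_length=max_length)
    return result[0]["generated_text"]
```

For example, `generate_bpmn("The following text is about the purchase order workflow. ...")` would return the model's serialized BPMN output for that description.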

## Contributions

Developed by Omar El Fachati as part of a Master's thesis. The project was supervised by academic and industry professionals.

## Contact Information

For further assistance or collaboration opportunities, contact me on LinkedIn.