Update README.md
README.md
sdk: static
pinned: false
---

# ArtyLLaMA: Empowering AI Creativity in the Open Source Community 🦙🎨

ArtyLLaMA is an innovative chat interface for Open Source Large Language Models, now leveraging the power of Ollama. It features dynamic content generation and display through an "Artifacts-like" system, making AI-assisted creativity more accessible and interactive.
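To make the "Artifacts-like" idea concrete, here is a purely illustrative sketch of how artifacts could be pulled out of a model reply. It assumes the model is prompted to wrap each artifact in an `<artifact>` tag, which is a hypothetical convention and not necessarily what ArtyLLaMA actually does:

```python
import re

# Purely illustrative: suppose the model is asked to wrap each artifact in
# <artifact type="..."> ... </artifact> tags (a hypothetical convention, not
# necessarily ArtyLLaMA's real delimiting rules). This helper pulls those
# blocks out of the reply so the UI can render them apart from the chat text.
ARTIFACT_RE = re.compile(
    r'<artifact type="(?P<type>[^"]+)">(?P<content>.*?)</artifact>',
    re.DOTALL,
)

def extract_artifacts(reply: str) -> list[dict]:
    """Return one {'type', 'content'} dict per artifact block in the reply."""
    return [match.groupdict() for match in ARTIFACT_RE.finditer(reply)]

reply = (
    "Here is a small page for you.\n"
    '<artifact type="html"><h1>Hello from ArtyLLaMA</h1></artifact>'
)
for artifact in extract_artifacts(reply):
    print(artifact["type"], "->", artifact["content"].strip())
```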
## Model Description

ArtyLLaMA is not a model itself, but a framework that allows users to interact with various language models through Ollama. It provides a user-friendly interface for generating creative content, code, and visualizations using state-of-the-art language models.

### Key Features:

- 🦙 **Ollama Integration**: Seamless support for multiple language models via Ollama (see the request sketch after this list)
- 🎨 **Dynamic Artifact Generation**: Create and display content artifacts during chat interactions
- 🖥️ **Real-time HTML Preview**: Instantly visualize HTML artifacts
- 🔄 **Multi-Model Support**: Choose from multiple language models available through Ollama
- 📱 **Responsive Design**: Mobile-friendly interface built with Tailwind CSS
- 🌙 **Dark Mode**: Easy on the eyes with a default dark theme
- 🔒 **Local Inference**: Run models locally for privacy and customization
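As a minimal sketch of what a single request to a locally running Ollama server looks like (assuming Ollama's default port 11434 and a model such as `llama3` that has already been pulled; this is an illustration, not ArtyLLaMA's internal code):

```python
import requests

# Minimal sketch: send one chat turn to a local Ollama server.
# Assumes Ollama is listening on its default port and that the model
# named below has already been pulled with `ollama pull`.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

payload = {
    "model": "llama3",  # any model available in your local Ollama install
    "messages": [{"role": "user", "content": "Write a haiku about llamas."}],
    "stream": False,    # ask for a single JSON response instead of a stream
}

response = requests.post(OLLAMA_CHAT_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["message"]["content"])
```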
## Intended Use

ArtyLLaMA is designed for developers, researchers, and creative professionals who want to:

- Explore the capabilities of open-source language models
- Generate creative content, including code, designs, and written text
- Prototype AI-assisted applications and workflows

## Limitations

- Requires local installation and setup of Ollama
- Performance depends on the user's hardware capabilities
- Limited to models supported by Ollama
- Does not include built-in content moderation (users should implement their own safeguards)

## Ethical Considerations

Users of ArtyLLaMA should be aware of:

- Potential biases present in the underlying language models
- The need for responsible use and content generation
- Privacy implications of using AI-generated content

## Technical Specifications

- **Framework**: Python with Flask backend, HTML/CSS/JavaScript frontend (a rough sketch of this layout follows the list)
- **Required Libraries**: Flask, Tailwind CSS, Alpine.js
- **Supported Model Formats**: Those supported by Ollama
- **Hardware Requirements**: Varies based on the chosen model size
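To make the Flask-plus-Ollama layout above concrete, here is a hedged, self-contained sketch of a backend route that proxies a user message to a local Ollama server. The route name, default model, and ports are assumptions for illustration and are not necessarily what ArtyLLaMA actually uses:

```python
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # assumed default Ollama address

@app.route("/api/chat", methods=["POST"])  # hypothetical route name, for illustration only
def chat():
    """Forward the user's message to Ollama and return the model's reply."""
    body = request.get_json(force=True)
    payload = {
        "model": body.get("model", "llama3"),
        "messages": [{"role": "user", "content": body["message"]}],
        "stream": False,
    }
    upstream = requests.post(OLLAMA_CHAT_URL, json=payload, timeout=120)
    upstream.raise_for_status()
    return jsonify({"reply": upstream.json()["message"]["content"]})

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```

A browser front end (for example an Alpine.js component) would POST `{"message": "..."}` to such a route and render the returned reply.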
## Getting Started

1. Clone the repository: `git clone https://github.com/kroonen/ArtyLLaMA`
2. Set up a Python environment: `conda create -n ArtyLLaMa python=3.11 && conda activate ArtyLLaMa`
3. Install dependencies: `pip install -r requirements.txt`
4. Ensure Ollama is installed and running on your system (a quick connectivity check is sketched after these steps)
5. Run the application: `python app.py`
6. Access the interface at `http://localhost:5000`
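For step 4, one quick way to confirm that Ollama is reachable before launching the app is to query its model-listing endpoint (assuming the default address; this check is illustrative and not part of ArtyLLaMA itself):

```python
import requests

# Illustrative pre-flight check: list the models a local Ollama server has pulled.
# Assumes Ollama's default address; adjust if your instance runs elsewhere.
try:
    tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
    names = [model["name"] for model in tags.get("models", [])]
    print("Ollama is running. Available models:", names or "none pulled yet")
except requests.ConnectionError:
    print("Ollama does not appear to be running on localhost:11434.")
```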
For more detailed instructions and documentation, visit our [GitHub repository](https://github.com/kroonen/ArtyLLaMA).

If you use ArtyLLaMA in your research or projects, please cite it as follows:

```bibtex
@software{artyllama2024,
  author = {Robin Kroonen},
  title = {ArtyLLaMA: Empowering AI Creativity in the Open Source Community},
  year = {2024},
  url = {https://github.com/kroonen/ArtyLLaMA}
}
```

## Contact

For questions, feedback, or collaborations, please reach out to:

- GitHub: [https://github.com/kroonen/ArtyLLaMA](https://github.com/kroonen/ArtyLLaMA)
- Email: rob@artyllama.com
- Twitter: [@rob_x_ai](https://x.com/rob_x_ai)

We welcome contributions and feedback from the community!