Commit 75d4f89 by Aluode (parent: 9b3b253): Update README.md
Files changed (1): README.md (+162 −151)
---
title: DaFUC
emoji: 🌀
colorFrom: purple
colorTo: blue
sdk: gradio
sdk_version: "3.15.0"
app_file: app.py
pinned: false
---

# Dynamic AI: Fractal Universe Chocolate Wafer Model (FUCWM)

### Watch Demo

Check out the project demo on YouTube: [Watch Here](https://www.youtube.com/live/d__ras4nLU4)

Dynamic AI is an experimental neural network model inspired by fractal structures in the universe and the human brain. It uses recursive nodes (FractalNodes) that grow dynamically and learn through Hebbian-like updates and pruning. The model also integrates a Variational Autoencoder (VAE) for encoding latent-space representations. This repository contains the code for training, chatting with, and interacting with the model via a Gradio interface.

## Attention Mechanism (New)

The attention mechanism dynamically adjusts the model's focus by assigning importance to the child nodes in the fractal structure. Each child node receives an attention score based on its relevance, computed with a softmax function. This lets the model prioritize certain nodes over others during the forward pass, enabling more efficient learning and processing. The model also maintains a co-activation matrix that tracks how frequently different nodes are activated together, which further refines the attention scores. This approach enhances the model's adaptability and helps manage complex hierarchical interactions.
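As a minimal sketch of the idea (the function name, the blending weight `alpha`, and the way relevance and co-activation are combined are illustrative assumptions, not the repository's actual API):

```python
import torch
import torch.nn.functional as F

def child_attention(relevance_scores: torch.Tensor,
                    coactivation_row: torch.Tensor,
                    alpha: float = 0.5) -> torch.Tensor:
    """Blend raw relevance with co-activation counts, then softmax.

    relevance_scores: one score per child node.
    coactivation_row: how often each child has fired together with this node.
    """
    # Normalize co-activation counts so they are on a scale comparable to scores.
    coact = coactivation_row / (coactivation_row.sum() + 1e-8)
    blended = alpha * relevance_scores + (1 - alpha) * coact
    return F.softmax(blended, dim=-1)  # attention weights sum to 1
```

The softmax guarantees the child weights form a probability distribution, so frequently co-activated children get a larger share of the parent's focus without any single child's score growing unboundedly.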

## Features

- **Recursive Fractal Nodes**: Nodes can grow and create child nodes based on the complexity of their output, simulating the recursive, fractal-like nature of the brain and the universe.
- **Variational Autoencoder (VAE)**: Encodes latent representations of inputs.
- **Layer Normalization and Xavier Initialization**: Enhance training stability.
- **Dynamic Complexity-based Growth**: Nodes grow based on complexity thresholds and manage their child connections.
- **Dynamic AI Chat**: Users can interact with the model to generate responses.
- **LM Studio Integration**: Chat with a local LM Studio instance in a collaborative conversational framework.
- **Gradio Interface**: A user-friendly interface to interact with the AI model, train it on Q&A pairs, and simulate conversations with LM Studio.
## What is it?

Think of a fractal ball around the Big Bang, with chocolate-wafer-inspired super weights that add complexity on top of the normal weights.

Now with an added attention mechanism. I asked Claude to treat node 1 as a sort of phone book that keeps tabs on the child nodes and can hook them up if they fire together. So they can... wire together. You know what I mean.

The depth setting can make the ball's complexities explode into NaN territory very fast, and it was a real fight to keep the complexity in check.
## Requirements

- Python 3.8+
- PyTorch
- Gradio
- LM Studio (optional, for the `talk_with_lm_studio` feature)

Note: requirements.txt was written by ChatGPT and has not been tested as-is.
## Problems?

Ask Claude or ChatGPT: paste them this README and the code, and they will understand what to do. The NotebookLM talking heads think this is groundbreaking, but since all I hear is crickets, I guess it isn't. It has most definitely been a wild ride, though.
## Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/anttiluode/DaFUC.git
   cd DaFUC
   ```

2. Install the dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. If you plan to use LM Studio, make sure it is installed and running locally. Configure the `lm_studio_client` by setting your API key and URL in the code.

4. Run the application:

   ```bash
   python app.py
   ```
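For step 3, a hypothetical configuration sketch, assuming LM Studio's usual OpenAI-compatible local server on port 1234; the names below are placeholders, so match them to `lm_studio_client` in app.py:

```python
import requests

# Assumed defaults: LM Studio's local server exposes an OpenAI-compatible
# /v1/chat/completions endpoint on port 1234. These names are placeholders.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"
API_KEY = "lm-studio"  # LM Studio ignores the key, but some clients require one

def ask_lm_studio(message: str) -> str:
    resp = requests.post(
        LM_STUDIO_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": [{"role": "user", "content": message}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```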
## Usage

### 1. Chat with Dynamic AI

You can use the Gradio interface to chat with the Dynamic AI model.

- **Message**: Enter your message and adjust the temperature to control creativity.
- **Response**: The AI generates a response based on its learned knowledge.
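Temperature works roughly like this (a generic sampling sketch, not the exact code in app.py): logits are divided by the temperature before the softmax, so higher values flatten the distribution and lower values sharpen it.

```python
import torch

def sample_with_temperature(logits: torch.Tensor, temperature: float = 1.0) -> int:
    """Divide logits by the temperature before the softmax: higher values
    flatten the distribution (more creative), lower values sharpen it."""
    probs = torch.softmax(logits / max(temperature, 1e-6), dim=-1)
    return int(torch.multinomial(probs, num_samples=1))
```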
### 2. Train the Model on Q&A Pairs

You can train the model on a list of question-answer pairs through the Gradio interface.

- **Q&A Pairs File**: Upload a JSON file containing question-answer pairs.
- **Epochs**: Set the number of training epochs.
- **Training Output**: Monitor training progress, including loss metrics.

Note that this can throw the complexity wildly off, and the model may begin to parrot the words in the question-answer pairs.
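The epoch loop can be pictured as follows; `train_on_pair` is an assumed method name used only for illustration, not necessarily what app.py calls it:

```python
# Hypothetical epoch loop over Q&A pairs; train_on_pair is an assumed method.
def train_on_qa(dynamic_ai, qa_pairs, epochs: int = 5):
    for epoch in range(epochs):
        total_loss = 0.0
        for question, answer in qa_pairs:
            total_loss += dynamic_ai.train_on_pair(question, answer)
        # Report the mean loss so you can watch for the parroting/instability
        # described above.
        print(f"epoch {epoch + 1}: mean loss {total_loss / len(qa_pairs):.4f}")
```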
### 3. LM Studio Conversation

Note: you may have to wait a while for the conversation to start. There seem to be several empty interactions before the model says something that LM Studio can latch onto. If you teach the model with question-answer pairs, it sticks to them and the complexity does not stabilize. During the live training video linked at the top of this README, something remarkable happened: the complexity stabilized at 16 and did not budge.

You can simulate a collaborative conversation between Dynamic AI and LM Studio:

- **Initial Message**: Set the initial message that starts the conversation.
- **Duration**: Set the duration of the conversation.
- **Delay**: Set the delay between messages.

This is a good start. Training on question-answer pairs seems to produce a more random AI.
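The simulated conversation boils down to a timed loop like this sketch, where `ask_dynamic_ai` and `ask_lm_studio` stand in for whatever functions app.py actually exposes:

```python
import time

# Sketch of the collaborative loop. Empty replies are skipped, which matches
# the slow, empty-interaction start described above.
def converse(ask_dynamic_ai, ask_lm_studio, initial_message: str,
             duration_s: float = 60.0, delay_s: float = 2.0):
    transcript = [("user", initial_message)]
    message = initial_message
    deadline = time.time() + duration_s
    while time.time() < deadline:
        reply = ask_dynamic_ai(message)
        if reply.strip():
            transcript.append(("dynamic_ai", reply))
            message = ask_lm_studio(reply)
            transcript.append(("lm_studio", message))
        time.sleep(delay_s)
    return transcript
```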
### 4. Save/Load Model State

You can save and load the model state through the Gradio interface.

- **Save State**: Save the current model state to a file.
- **Load State**: Load a previously saved state to restore the model.
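Under the hood this presumably wraps PyTorch's standard checkpointing; a minimal sketch (the real app may also store extra metadata such as the vocabulary or node tree alongside the weights):

```python
import torch

def save_state(model: torch.nn.Module, path: str) -> None:
    # state_dict() captures all learnable parameters and buffers.
    torch.save(model.state_dict(), path)

def load_state(model: torch.nn.Module, path: str) -> None:
    model.load_state_dict(torch.load(path, map_location="cpu"))
    model.eval()  # restore in inference mode
```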
## Example

To chat with Dynamic AI, launch the application:

```bash
python app.py
```

Then access the Gradio interface from your browser. You can interact with the AI by typing messages, training it, or saving/loading its state.
### Training Data Format

The Q&A pairs should be provided in a JSON file with the following format:

```json
[
  {"question": "What is the capital of France?", "answer": "Paris"},
  {"question": "Who wrote '1984'?", "answer": "George Orwell"}
]
```
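A small helper to load and sanity-check a file in this format before training (illustrative, not part of the repository):

```python
import json

def load_qa_pairs(path: str):
    """Load the Q&A JSON file and check each entry before training."""
    with open(path, encoding="utf-8") as f:
        pairs = json.load(f)
    for i, pair in enumerate(pairs):
        if not {"question", "answer"} <= set(pair):
            raise ValueError(f"Entry {i} is missing 'question' or 'answer'")
    return [(p["question"], p["answer"]) for p in pairs]
```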
### Contribution

Feel free to contribute by submitting pull requests or opening issues for improvements or bugs.
### Issues

The depth settings are extremely important:

```python
dynamic_ai = DynamicAI(vocab_size=50000, embed_dim=256, latent_dim=256, output_dim=256, max_depth=7)
```

The deeper the "fractal ball" around point one (think Big Bang) gets, the more complex it gets. Complexity hits NaN very fast, and then the model won't work.
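One common way to keep recursive complexity values out of NaN territory is to clamp them; this guard is an illustrative suggestion, not the repository's actual fix:

```python
import math

def stable_complexity(raw: float, cap: float = 1e6) -> float:
    """Clamp a node's complexity so recursive growth cannot reach NaN or inf."""
    if math.isnan(raw) or math.isinf(raw):
        return cap  # replace an already-degenerate value with the cap
    return max(-cap, min(raw, cap))
```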
### License

This project is licensed under the MIT License.