This repository contains a pruned and orthogonalized version of the Llama 3 8B model. It was created by applying the pruning method described in the [PruneGPT](https://github.com/nyunAI/PruneGPT) repository to remove unimportant layers from the original Llama 3 8B model. The model components were also subjected to Orthogonal Activation Steering (OAS), also known as "abliteration", to mitigate refusals and improve versatility across a range of scenarios.

## Model Description

The pruned and orthogonalized Llama 3 8B model uses [grimjim/Llama-3-Oasis-v1-OAS-8B](https://huggingface.co/grimjim/Llama-3-Oasis-v1-OAS-8B) as its base. That base model is itself a merge of pre-trained language models that had already been subjected to OAS. The following models were merged to create Llama-3-Oasis-v1-OAS-8B:

- [mlabonne/NeuralDaredevil-8B-abliterated](https://huggingface.co/mlabonne/NeuralDaredevil-8B-abliterated)
- [NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS)
- [Hastagaras/Halu-OAS-8B-Llama3](https://huggingface.co/Hastagaras/Halu-OAS-8B-Llama3)

The merge was performed with the task arithmetic merge method, with mlabonne/NeuralDaredevil-8B-abliterated serving as the base model. The pruning method from the PruneGPT repository was then applied to this merged base to remove unimportant layers, yielding a more efficient and compact model.

The final model is versatile and suitable for both positive and negative roleplay scenarios as well as storytelling; however, please exercise caution when using it. The model is built on the Meta Llama 3 architecture.

## Usage

To use this model, load it with the HuggingFace Transformers library in Python.
Here's an example:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "GazTrab/Pruned-Llama-3-Oasis"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Generate a short continuation with the loaded model
inputs = tokenizer("Once upon a time", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Please refer to the HuggingFace Transformers documentation for more details on using the model for other tasks.

## Acknowledgements

We would like to acknowledge the following resources and repositories used in the creation of this model:

- [PruneGPT](https://github.com/nyunAI/PruneGPT) for the pruning method.
- [Llama-3-Oasis-v1-OAS-8B](https://huggingface.co/grimjim/Llama-3-Oasis-v1-OAS-8B) for the OAS/abliteration technique and merge components.
- [Meta Llama 3](https://huggingface.co/meta-llama/Meta-Llama-3-8B) for the base architecture.

## License

This model is released under the [Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0).
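
---

For context, a task arithmetic merge like the one behind the base model is typically expressed as a [mergekit](https://github.com/arcee-ai/mergekit) configuration. The sketch below is hypothetical: the `weight` values are illustrative placeholders, not the actual recipe used to build Llama-3-Oasis-v1-OAS-8B.

```yaml
# Hypothetical mergekit config for a task arithmetic merge of the three
# source models; weights are illustrative, not the published recipe.
models:
  - model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
    parameters:
      weight: 0.3
  - model: Hastagaras/Halu-OAS-8B-Llama3
    parameters:
      weight: 0.3
merge_method: task_arithmetic
base_model: mlabonne/NeuralDaredevil-8B-abliterated
dtype: bfloat16
```

In task arithmetic, each non-base model contributes its weight-scaled difference from the base model, so the base checkpoint anchors the merge while the others steer it.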