---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# Untitled Model (1)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the passthrough merge method, which stacks the listed layer ranges into a single deeper model without averaging or interpolating weights. Here, seven overlapping 20-layer slices of the 80-layer source model are stacked, each offset by 10 layers from the previous one, yielding a 140-layer self-merge.

### Models Merged

The following models were included in the merge:
* ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 20]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [10, 30]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [20, 40]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [30, 50]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [40, 60]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [50, 70]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
- sources:
  - layer_range: [60, 80]
    model: ../../Storage/NousResearch_Meta-Llama-3-70B-Instruct
```
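
To reproduce the merge with the mergekit CLI, a minimal sketch follows, assuming mergekit is installed and the source checkpoint is available at the relative path used in the configuration; `config.yml` and `./merged` are placeholder names, not files shipped with this card:

```sh
# Save the YAML configuration above as config.yml, then run the merge.
# --cuda uses the GPU for tensor computation; --lazy-unpickle lowers peak
# memory, which helps when slicing a 70B checkpoint.
mergekit-yaml config.yml ./merged --cuda --lazy-unpickle
```

Because the seven 20-layer slices stack to 140 layers versus 80 in the source, expect the merged model to be roughly 1.75x the size of Meta-Llama-3-70B-Instruct in parameter count and load-time memory.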