---
base_model:
- 01-ai/Yi-9B
library_name: transformers
tags:
- mergekit
- merge
license: other
license_name: yi-license
license_link: LICENSE
---
# bigyi-15b

I recently made [bigstral-12b](https://huggingface.co/abacusai/bigstral-12b-32k), and then I saw this awesome new model, [yi-9b](https://huggingface.co/01-ai/Yi-9B), and decided to make an embiggened version of it.

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

Bigyi-15b is a base / completion model, so there is no chat template. It has a 4k context.

## Example

Here is a recipe for Mai Tai:

> 1. 3 parts rum
> 2. 3 parts pineapple juice
> 3. half a cup of lime juice
> 4. 6 to 8 fresh or frozen pineapple chunks
> 5. crushed ice
>
> Mix all ingredients except ice and pour into glasses with ice. Garnish with a pineapple slice.

Here is an implementation of 2-sum in golang:

> ```go
> func twoSum(nums []int, target int) []int {
> 	if len(nums) <= 1 {
> 		return nil
> 	}
> 	m := map[int]int{} // value -> index
> 	for i, n := range nums {
> 		// find the complement of the current number in the map
> 		comp := target - n
> 		if j, ok := m[comp]; ok {
> 			return []int{j, i}
> 		}
> 		m[n] = i
> 	}
> 	return nil
> }
> ```

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* [01-ai/Yi-9B](https://huggingface.co/01-ai/Yi-9B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 12]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [6, 18]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [12, 24]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [18, 30]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [24, 36]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [30, 42]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [36, 48]
    model: 01-ai/Yi-9B
```
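The passthrough config stacks seven overlapping 12-layer windows of Yi-9B back to back, so the merged model ends up deeper than the original. A minimal sketch of that arithmetic (the layer ranges are copied from the YAML above; each range is half-open):

```go
package main

import "fmt"

func main() {
	// Layer ranges from the mergekit slice configuration above,
	// each half-open: [start, end).
	slices := [][2]int{{0, 12}, {6, 18}, {12, 24}, {18, 30}, {24, 36}, {30, 42}, {36, 48}}

	total := 0
	for _, s := range slices {
		total += s[1] - s[0]
	}
	// Seven slices of 12 layers each: 84 layers in the merge,
	// versus 48 layers in the original Yi-9B.
	fmt.Println(total) // prints 84
}
```

Scaling 9B parameters by roughly 84/48 is where the "15b" in the name comes from (the embedding layers are not duplicated, so the true count is a bit lower).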