---
base_model:
  - 01-ai/Yi-9B
library_name: transformers
tags:
  - mergekit
  - merge
license: other
license_name: yi-license
license_link: LICENSE
---

# bigyi-15b

I recently made bigstral-12b, and when I saw the awesome new Yi-9B model I decided to make an embiggened version of it too.

This is a merge of pre-trained language models created using mergekit.

Bigyi-15b is a base / completion model, so there is no chat template.

It has a 4k context.

## Example

```
Here is a recipe for Mai Tai:

1: 3 parts rum
2: 3 parts pineapple juice
3: half a cup of lime juice
4: 6 to 8 fresh or frozen pineapple chunks
5: crushed ice

Mix all ingredients except ice and pour into glasses with ice. Garnish with a pineapple slice.
```

Here is an implementation of two-sum in Go:

```go
func twoSum(nums []int, target int) []int {
	if len(nums) <= 1 {
		return nil
	}
	// map from value to the index where it was seen
	seen := map[int]int{}
	for i, n := range nums {
		// look for the complement of the current number in the map
		if j, ok := seen[target-n]; ok {
			return []int{j, i}
		}
		seen[n] = i
	}
	return nil
}
```

Output should be [2, 3], because those are the positions in the list where the pair summing to 8 (4 + 4 = 8) occurs.
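As a sanity check on the algorithm, here is a minimal Python version of the same value-to-index two-sum, run on a hypothetical input list (the list behind this completion isn't shown above) in which the two 4s sit at indices 2 and 3:

```python
def two_sum(nums, target):
    seen = {}  # value -> index of its first occurrence
    for i, n in enumerate(nums):
        # if the complement was seen earlier, we have our pair
        if target - n in seen:
            return [seen[target - n], i]
        seen[n] = i
    return None

print(two_sum([1, 3, 4, 4], 8))  # [2, 3]
```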

## Merge Details

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:

- 01-ai/Yi-9B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 12]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [6, 18]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [12, 24]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [18, 30]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [24, 36]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [30, 42]
    model: 01-ai/Yi-9B
- sources:
  - layer_range: [36, 48]
    model: 01-ai/Yi-9B
```
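Passthrough simply stacks the listed slices in order, so the depth of the merged model is the sum of the slice widths. A quick check of the arithmetic (plain Python, with the layer ranges copied from the config above):

```python
# Layer ranges copied from the mergekit config above; passthrough
# concatenates the slices, so total depth is the sum of slice widths.
slices = [(0, 12), (6, 18), (12, 24), (18, 30), (24, 36), (30, 42), (36, 48)]

total = sum(end - start for start, end in slices)
print(total)  # 84 layers in the merged model, versus 48 in Yi-9B
```

Overlapping ranges (each slice repeats the last 6 layers of the previous one) are what grow the 48-layer Yi-9B into an 84-layer, roughly 15B-parameter model.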