---
license: apache-2.0
---

Model Name: Granite-7b-base

License: Apache-2.0

Languages: Primarily English

Architecture: The model architecture is a replica of Meta’s Llama2-7B base variant with multi-head attention (MHA), trained with a 1M-token batch size on 2T tokens.

Context Length: 4k tokens

Tokenizer: Llama2

Model Developers: IBM Research

Representing IBM’s commitment to open-source innovation, IBM has released granite-7b-base, a base pre-trained LLM from IBM’s Granite model series, under the Apache-2.0 license for community and commercial use. Granite-7b-base was pre-trained from scratch on IBM-curated data as an open reference implementation of Meta’s Llama-2-7B. In a commitment to data transparency and to fostering open innovation, the data sources, sampling proportions, and access URLs are provided below.
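As a standard decoder-only base model with the Llama2 tokenizer, granite-7b-base can be used with the Hugging Face transformers library. A minimal sketch is below; the repo id `ibm-granite/granite-7b-base` and the `generate` helper are assumptions for illustration — check the model page for the exact id, and note the full-precision checkpoint needs substantial memory (~14 GB in fp16).

```python
# Hedged sketch: loading granite-7b-base for text completion with the
# Hugging Face transformers library. The repo id below is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ibm-granite/granite-7b-base"  # assumed Hugging Face repo id


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Download the model and tokenizer, then complete the given prompt.

    This is a base (non-instruct) model, so it continues text rather than
    following chat-style instructions.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because this is a base model, prompts should be phrased as text to be continued (e.g. `generate("The three primary colors are")`) rather than as questions or instructions.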

Pre-Training Data

The model was trained on 2T tokens, with sampling proportions designed to match the sampling distributions released in the Llama1 paper as closely as possible.