---
title: README
emoji: 🚀
colorFrom: red
colorTo: indigo
sdk: static
pinned: false
---
Do you believe in a better tomorrow? We do. Our team of expert researchers lives the dream and works to build it every day.
## News

- 💥 TII has open-sourced Falcon LLM for research and commercial use! Access the 7B/40B models, and explore our high-quality web dataset, RefinedWeb.
- ✨ Falcon-40B/7B are now available under the Apache 2.0 license; TII has waived all royalties and commercial usage restrictions.
- 🤗 TII is calling for proposals from the global research community and SME entrepreneurs to submit use cases for Falcon LLM; learn more on the Falcon LLM website.
# Falcon LLM
Falcon LLM is TII's flagship series of large language models, built from scratch using a custom data pipeline and distributed training library. Papers coming soon 😊.
To promote collaborations and drive innovation, we have open-sourced a number of artefacts:
- The Falcon-7B/40B pretrained and instruct models, under the Apache 2.0 software license. Falcon-7B/40B models are state-of-the-art for their size, outperforming most other models on NLP benchmarks.
- The RefinedWeb dataset, a massive web dataset with stringent filtering and large-scale deduplication, enabling models trained on web data alone to match or outperform models trained on curated corpora. RefinedWeb is licensed under Apache 2.0.
See below for a detailed list of artefacts in the Falcon LLM family:
| Artefact | Link | Type | Details |
|---|---|---|---|
| 🥇 Falcon-40B | Here | pretrained model | 40B parameters trained on 1,000 billion tokens. |
| Falcon-40B-Instruct | Here | instruction/chat model | Falcon-40B finetuned on the Baize dataset. |
| 🥈 Falcon-7B | Here | pretrained model | 6.7B parameters trained on 1,500 billion tokens. |
| Falcon-7B-Instruct | Here | instruction/chat model | Falcon-7B finetuned on the Baize, GPT4All, and GPTeacher datasets. |
| 📀 RefinedWeb | Here | pretraining web dataset | ~600 billion "high-quality" tokens. |
| Falcon-RW-1B | Here | pretrained model | 1.3B parameters trained on 350 billion tokens. |
| Falcon-RW-7B | Here | pretrained model | 7.5B parameters trained on 350 billion tokens. |
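The models above are hosted on the Hugging Face Hub. As a minimal sketch (assuming the `tiiuae/` Hub namespace and the standard 🤗 `transformers` text-generation pipeline; adjust dtype and device settings to your hardware), loading one of them might look like:

```python
# Sketch of loading a Falcon model from the Hugging Face Hub.
# Model ids below assume the tiiuae/ organisation namespace.

# Hub ids for the pretrained and instruct checkpoints listed above.
FALCON_MODELS = {
    "falcon-7b": "tiiuae/falcon-7b",
    "falcon-7b-instruct": "tiiuae/falcon-7b-instruct",
    "falcon-40b": "tiiuae/falcon-40b",
    "falcon-40b-instruct": "tiiuae/falcon-40b-instruct",
}


def build_falcon_pipeline(name: str = "falcon-7b-instruct"):
    """Create a text-generation pipeline for the chosen Falcon model.

    Imports are deferred so this module loads even without
    `transformers`/`torch` installed. Older `transformers` releases
    need trust_remote_code=True for Falcon's custom modelling code.
    """
    import torch
    from transformers import AutoTokenizer, pipeline

    model_id = FALCON_MODELS[name]
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    return pipeline(
        "text-generation",
        model=model_id,
        tokenizer=tokenizer,
        torch_dtype=torch.bfloat16,
        device_map="auto",
        trust_remote_code=True,
    )
```

A call such as `build_falcon_pipeline()("Why is the sky blue?", max_new_tokens=64)` would then generate text; note that the 40B checkpoints require substantial GPU memory (tens of GB even in bfloat16).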
## TII Falcon LLM License
We have made our models available under the Apache 2.0 software license.
## About us
The Technology Innovation Institute (TII) is a leading global research center dedicated to pushing the frontiers of knowledge. Our teams of scientists, researchers and engineers work in an open, flexible and agile environment to deliver discovery science and transformative technologies. Our work means we will not only prepare for the future; we will create it. Working together, we are committed to inspiring innovation for a better tomorrow.
We are part of Abu Dhabi Government’s Advanced Technology Research Council, which oversees technology research in the emirate. As a disruptor in science, we are setting new standards and serve as a catalyst for change.
Faced with a future of limitless possibilities and supported by strategically funded investments, we are encouraging a culture of discovery. Our work reinforces Abu Dhabi and the UAE’s status as an R&D hub and a global leader in breakthrough technologies.