JingweiZuo committed · Commit 544aa61 · 1 Parent(s): 59d5ecd
Update README.md

README.md CHANGED
@@ -14,7 +14,7 @@ pinned: false
 
 # News
 
-* **[FalconMamba-7B](https://huggingface.co/tiiuae/falcon-mamba-7b) is now available.** The first pure SSM model of the Falcon series released under the same permissive license. You can interact with it [here](https://huggingface.co/spaces/tiiuae/falcon-mamba-playground).
+* **[FalconMamba-7B](https://huggingface.co/tiiuae/falcon-mamba-7b) is now available.** The first pure SSM model of the Falcon series released under the same permissive license. You can interact with it [here](https://huggingface.co/spaces/tiiuae/falcon-mamba-playground), and find the FalconMamba blogpost **[here](https://huggingface.co/blog/falconmamba)**.
 * 📸 **[Falcon2-11B-vlm](https://huggingface.co/tiiuae/falcon-11B-vlm) is now available.** Built on top of the Falcon2-11B model, and released under the same permissive license, this open source model allows users to interact with image content via text.
 * **TII has just released a new generation of models, starting with [Falcon2-11B](https://huggingface.co/tiiuae/falcon-11B)**, a 11B parameters causal decoder-only model and trained over 5,000B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. The model is made available under the [TII Falcon License 2.0](https://falconllm-staging.tii.ae/falcon-2-terms-and-conditions.html), the permissive Apache 2.0-based software license which includes an [acceptable use policy](https://falconllm-staging.tii.ae/falcon-2-acceptable-use-policy.html) that promotes the responsible use of AI.
 * 🔥 **TII has open-sourced Falcon-180B for research and commercial utilization!** Access the [180B](https://huggingface.co/tiiuae/falcon-180b), as well as [7B](https://huggingface.co/tiiuae/falcon-7b)/[40B](https://huggingface.co/tiiuae/falcon-40b) models, and explore our high-quality web dataset, [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb).
@@ -24,6 +24,8 @@ pinned: false
 
 We are excited to announce the release of our groundbreaking LLM model with a pure SSM architecture, setting a new benchmark by outperforming all previous SSM models and achieving performance on par with leading transformer-based models.
 
+More details on the new models and their performance can also be found in our [FalconMamba blogpost](https://huggingface.co/blog/falconmamba).
+
 | **Artefact** | **Link** | **Type** | **Details** |
 |---------------------|------------------------------------------------------------------|-------------------------|-------------------------------------------------------------------|
 | **FalconMamba-7B** | [Here](https://huggingface.co/tiiuae/falcon-mamba-7b) | *pretrained model* | 7B parameters pure SSM trained on ~6,000 billion tokens. |
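
For context (not part of this commit), here is a minimal sketch of loading the FalconMamba-7B checkpoint referenced above with Hugging Face `transformers`. It assumes a `transformers` release that includes FalconMamba support; the prompt text and generation settings are illustrative only. Full usage instructions are on the linked model card and in the FalconMamba blogpost.

```python
# Minimal sketch (assumption: a transformers version with FalconMamba support is installed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-mamba-7b"  # checkpoint listed in the README table

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # pure SSM decoder, no attention layers

# Illustrative prompt and generation length, not taken from the README.
inputs = tokenizer("The Falcon series of models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```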