**Overview** Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
**Data Freshness** The pretraining data has a cutoff of March 2023 for the 8B model and December 2023 for the 70B model, respectively.
## Benchmarks