---
inference: false
---

<br>
<br>
|
|
# LWM-Text-1M-Chat Model Card
|
|
## Model details
|
|
**Model type:**
LWM-Text-1M-Chat is an open-source chat model trained from LLaMA-2 on a filtered subset of the Books3 dataset. It is an auto-regressive language model based on the transformer architecture.
|
|
**Model date:**
LWM-Text-1M-Chat was trained in December 2023.
|
|
**Paper or resources for more information:**
https://largeworldmodel.github.io/
|
|
## License
Llama 2 is licensed under the LLAMA 2 Community License,
Copyright (c) Meta Platforms, Inc. All Rights Reserved.
|
|
**Where to send questions or comments about the model:**
https://github.com/LargeWorldModel/lwm/issues
|
|
## Training dataset
- A subset of 800 Books3 documents, each containing 1M+ tokens