tinyllama:1.1b
2.2M Downloads Updated 1 year ago
The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.
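Assuming Ollama is installed locally, the model on this page can be pulled and run with the standard CLI (a sketch; the `tinyllama:1.1b` tag matches the page header):

```shell
# Downloads the weights on first use, then opens an interactive chat session.
ollama run tinyllama:1.1b
```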
2644915ede35 · 638MB
Readme
TinyLlama is a compact model with only 1.1B parameters, making it suitable for a wide range of applications with tight computation and memory budgets.
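To make the footprint concrete, a back-of-the-envelope estimate (parameter count × bits per weight, ignoring file metadata and runtime overhead) shows why a 1.1B-parameter model can fit in the ~638MB download above when 4-bit quantized, versus several gigabytes at full half precision. The helper below is illustrative, not part of TinyLlama or Ollama:

```python
def model_size_bytes(n_params: float, bits_per_param: float) -> float:
    """Rough weight-storage estimate: parameters x bits, with no overhead."""
    return n_params * bits_per_param / 8

N = 1.1e9                              # TinyLlama parameter count
print(model_size_bytes(N, 16) / 1e9)   # fp16: ~2.2 GB
print(model_size_bytes(N, 4) / 1e9)    # 4-bit quantized: ~0.55 GB
```

The gap between the 0.55 GB estimate and the 638MB file is expected: quantized formats keep some tensors at higher precision and store scaling metadata alongside the weights.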