https://arxiv.org/abs/2302.13971

LLaMA: Open and Efficient Foundation Language Models

We introduce LLaMA, a collection of foundation language models ranging from 7B to 65B parameters. We train our models on trillions of tokens, and show that it is possible to train state-of-the-art models using publicly available datasets exclusively, without resorting to proprietary and inaccessible datasets.
