LLaMA

A range of model sizes was trained, from 7 billion to 65 billion parameters. LLaMA's developers reported that the 13-billion-parameter model outperformed the much larger GPT-3 (175 billion parameters) on most NLP benchmarks, and that the largest model was competitive with state-of-the-art models such as PaLM and Chinchilla.

Site: https://llama.ai


Price: Free
