Meta has released Llama 3.1. It seems to be a significant improvement over an already quite good model: it is now multilingual, has a 128k context window, supports tool calling (rough example below), and overall performs better on benchmarks than its predecessor.
With this release, they also introduced a new 405B parameter model, alongside updated 70B and 8B versions.
I’ve been using the 3.0 version and was already satisfied, so I’m excited to try this.
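For anyone wondering what the tool calling support looks like in practice, here's a minimal sketch using Ollama's Python client (assuming a recent Ollama build with tool support and the `ollama` package installed); the weather function and its schema are made-up placeholders, not anything shipped with the model:

```python
import ollama

# Made-up example tool; the schema follows the OpenAI-style function format Ollama accepts.
weather_tool = {
    'type': 'function',
    'function': {
        'name': 'get_current_weather',
        'description': 'Get the current weather for a city',
        'parameters': {
            'type': 'object',
            'properties': {
                'city': {'type': 'string', 'description': 'Name of the city'},
            },
            'required': ['city'],
        },
    },
}

response = ollama.chat(
    model='llama3.1',
    messages=[{'role': 'user', 'content': 'What is the weather in Toronto right now?'}],
    tools=[weather_tool],
)

# If the model decides to use the tool, the returned message carries a tool_calls entry
# (function name plus arguments) instead of a plain text answer.
print(response['message'])
```

In a real loop you'd run the requested function yourself, append its result as a `tool` role message, and call `chat` again so the model can produce the final answer.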
I’ll just stick to Mistral
Are you using Mistral 7B?
I also really like that model and their fine-tunes. If licensing is a concern, it’s definitely a great choice.
Mistral also has a new model, Mistral NeMo. I haven’t tried it myself, but I’ve heard it’s quite good. It’s also licensed under Apache 2.0, as far as I know.
Is it part of Ollama?
Edit: https://ollama.com/library/mistral-nemo
Yes, you can find it here.
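If it helps, here's roughly how you'd grab it and give it a quick try with the Ollama Python client (assuming the `ollama` package is installed and the Ollama server is running locally); the prompt is just a placeholder:

```python
import ollama

# One-time download of the model from the Ollama library.
ollama.pull('mistral-nemo')

# Quick smoke test of the freshly pulled model.
response = ollama.chat(
    model='mistral-nemo',
    messages=[{'role': 'user', 'content': 'Give me a one-sentence summary of the Apache 2.0 license.'}],
)
print(response['message']['content'])
```

The CLI equivalent is just `ollama pull mistral-nemo` followed by `ollama run mistral-nemo`.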