
Mistral AI releases its first publicly available large-scale language model

While API access is the norm for the most popular language models, open models (in the broadest sense of the term) are increasingly in demand. French artificial intelligence startup Mistral AI, which raised a sizable seed round in June, has just released its first model, which it claims outperforms others of its size, and it is available for free with no strings attached.

A 13.4-gigabyte torrent (with a few hundred seeders already) of the Mistral 7B model is currently available for download. The company has also set up a GitHub repository and a Discord channel for collaboration and troubleshooting.

Importantly, the model is released under the highly permissive Apache 2.0 license, which makes it free for any purpose, including redistribution. That puts it within reach of anyone with a system capable of running it locally, or willing to pay for the necessary cloud resources, whether a hobbyist, a multi-billion-dollar corporation, or the Pentagon.
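For readers curious what "running it locally" looks like in practice, here is a minimal sketch of one way to do it with the Hugging Face transformers library. It assumes the weights are also mirrored on the Hugging Face Hub under the repository name mistralai/Mistral-7B-v0.1 (an assumption for illustration; the torrent above is the distribution channel Mistral describes) and that you have a GPU with enough memory for half-precision inference:

    # Minimal local-inference sketch (assumptions noted below; not Mistral's official instructions).
    # Requires: pip install transformers accelerate torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mistral-7B-v0.1"  # assumed Hub repository name
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype="auto",   # use the checkpoint's native precision (fp16/bf16)
        device_map="auto",    # place layers on available GPU(s) via accelerate
    )

    prompt = "The Apache 2.0 license allows"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In half precision, a 7-billion-parameter checkpoint needs on the order of 15 GB of GPU memory, which is a big part of what makes this class of model practical to self-host at all.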

Mistral 7B is a refinement of the "small" large language model approach exemplified by Llama 2, offering comparable performance (on some industry-standard benchmarks) at a much lower computational cost. Foundation models like GPT-4, by contrast, are so much more expensive and complicated to run that they are typically accessible only through application programming interfaces (APIs) or remote access.

In a blog post published alongside the release, Mistral's team stated its goal to "become the leading supporter of the open generative AI community, and bring open models to state-of-the-art performance." The team presents Mistral 7B's performance as proof of what a small group can accomplish quickly, saying it took three months to assemble the Mistral AI team, rebuild a high-performance MLops stack, and design what it describes as the most advanced data-processing pipeline possible.

The founders had a head start, having previously worked on similar models at Meta and Google DeepMind, but to some (maybe most) observers that list sounds like more than three months of work. Of course, the head start doesn't make the job simple, but at least they had a plan.

Of course, as we discussed last week at Disrupt, the fact that anyone can download and use the model doesn't make it "open source" or any variant of that term. Despite the permissive license, the model, along with the datasets and weights behind it, was developed privately and with private funds.

And that, it would appear, is Mistral's business plan: the base model is free, but anything beyond it is part of a paid product. "White-box solutions, including both weights and code sources, will be provided as part of [our commercial offering]," the blog post reads, adding that hosted solutions and dedicated deployments for enterprises are currently under development.

I've reached out to Mistral for more details about its openness and future release plans, and will update this post if I hear back.
