Mistral AI Launches Mistral Large LLM and ‘Le Chat’: Everything You Need to Know

In a world where new artificial intelligence (AI) chatbots and large language models (LLMs) hit the market almost every month, Mistral AI's new models and Le Chat immediately stand out from the crowd.

Le Chat and the Mistral Large LLM were developed by Mistral AI, a French AI startup making waves in the tech world, having already raised €385 million.

The Paris-based company, founded by former Google DeepMind and Meta researchers, released Le Chat in beta. The chatbot is available to anyone who signs up.

On the same day, Mistral AI released its most powerful model, Mistral Large, claiming it is second in power only to ChatGPT-4 Turbo. Mistral Large is available through the company’s “La Plateforme” and to Microsoft Azure customers.

Let’s dive right into Mistral AI’s new releases, look under the hood of their technology, see how they compare with the competition, and learn what they contribute to the AI market.

Key Takeaways

  • Mistral AI released a free public generative AI chatbot where users can run and test its three models.
  • Mistral AI claims its Mistral Large model is only slightly outperformed by ChatGPT-4.
  • Mistral AI offers Mistral Large, Mistral Small, and Mistral Embed to businesses at competitive prices.
  • The new models’ capabilities, user-friendly design, self-deployment options, and performance are impressive and open up a new avenue for AI users.

Mistral AI Large: What Is It & What Can It Do?

Mistral Large is Mistral AI’s most powerful LLM. It is designed with developers and businesses in mind, allowing them to deploy the model in the cloud or on-premises. According to the company, Mistral Large ranks second on the global LLM leaderboard, with its metrics and accuracy outperformed only by ChatGPT-4.

Comparison of GPT-4, Mistral Large (pre-trained), Claude 2, Gemini Pro 1.0, GPT-3.5, and LLaMA 2 70B on MMLU (Measuring Massive Multitask Language Understanding). (Mistral)

As a flagship model, Mistral Large has an impressive latency-to-performance ratio, making it a fast and agile AI. Mistral AI also designed Mistral Large and its other two more cost-effective AI models, Mistral Small and Mistral Embed, to be portable, extremely customizable, and unbiased.

Companies deploying Mistral AI models can benefit from complete modular control over moderation and self-deployment.

Mistral Large LLM Features

Mistral Large remains a non-multimodal AI. Still, its limitations on the types of content and data it can generate and understand are offset by its outstanding performance as a text-based model.

The company’s feature list claims Mistral Large:

  • Can execute high-complexity tasks.
  • Is fluent in English, French, Italian, German, and Spanish.
  • Has strong programming language skills.
  • Has a prompt context window of 32k tokens, with excellent recall for retrieval augmentation.
  • Has advanced modular moderation controls.
  • Can be deployed in the cloud or on-premises.
  • Is designed for self-deployment and at-scale usage.
  • Offers advanced levels of customization and control.
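Since the 32k-token context window is shared between the prompt and the model's reply, it can help to sanity-check prompt size before sending a request. The sketch below uses a rough 4-characters-per-token heuristic, which is a common rule of thumb and not Mistral's actual tokenizer; use the real tokenizer for exact counts.

```python
# Rough pre-flight check that a prompt fits Mistral Large's 32k-token
# context window. The 4-chars-per-token ratio is a heuristic assumption,
# not Mistral's tokenizer.

CONTEXT_WINDOW = 32_000

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_for_output: int = 1_000) -> bool:
    """Leave room for the model's reply inside the shared window."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_WINDOW

print(fits_in_context("Summarize this document."))  # → True
```

For retrieval-augmented workloads, the same check decides how many retrieved passages can be packed into the prompt before the window overflows.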

Developing with Mistral AI Models

Based on customer stories, we know that Mistral Large and Mistral models can:

  • Be integrated into websites and search engines.
  • Drive workers’ performance applications.
  • Be used to create real-time applications.
  • Give companies the technology to let their customers build their own AI models.
  • Drive gaming experiences.
  • Create AI agents and more.

Mistral Large vs. ChatGPT-4

It is standard in the AI industry for every new model to be compared to AI-leading models, especially against OpenAI’s most powerful versions. Let’s find out how Mistral Large competes against ChatGPT-4 Turbo.

Feature | ChatGPT-4 Turbo | Mistral Large
--- | --- | ---
Free | No. ChatGPT 3.5 is the only free version offered by OpenAI, though ChatGPT-4 can be accessed through Microsoft Copilot. | Yes. Free online via Mistral’s Le Chat.
Languages | Multi-language AI, but known to perform better in English. | Multi-language AI. Fluent in English, French, Italian, German, and Spanish.
Models | ChatGPT-4 Turbo. | Large, Next, and Small.
Business versions | Yes. OpenAI offers many AI models to businesses, including ChatGPT-4. | Yes. Mistral Large and other Mistral AI models are available to enterprises.
Connected to the internet | No. Only through customizations. | No. Only through customizations.
Customization | Yes, but technical skills are required. | Yes, with user-friendly and seamless deployment.
Moderation | Standard. | Advanced moderation features.
Self-deployment for businesses | No. | Yes.
MMLU benchmark (Massive Multitask Language Understanding) | 86.4% | 81.2%
Price | Starts at $10.00/1M tokens (input) and $60.00/1M tokens (output). | Starts at $8/1M tokens (input) and $24/1M tokens (output).
Type of content generated | Language, text, code, and images via DALL-E. | Language, text, and code.

Mistral AI offers several things that ChatGPT-4 does not: free access for users to test out the different models, plus the advanced moderation and customization that businesses and organizations need when deploying their own AI models.

Additionally, while Mistral AI’s models do not yet work with images or sound, and AI challenges and benchmarks place Mistral Large slightly below ChatGPT-4 in performance, the technology is significantly cheaper.

While it is true that OpenAI models have global recognition and Mistral has yet to carve out a real reputation in the industry, both companies are supported by Microsoft.

Mistral’s AI models also present themselves as very user-friendly, while OpenAI’s models are known to require technical skills to deploy.

Accessing Mistral AI Models

Those interested can test Mistral Large in Le Chat for free after signing up. Businesses can also access the model via Mistral AI’s platform, “La Plateforme”, after subscribing and entering some personal information.

Mistral AI Models: Pricing

The Mistral AI business model is based on a pay-as-you-go structure, and models are billed on a per-token basis. There are no monthly subscriptions, and users only pay for what they use. OpenAI has a similar pricing structure.

Prices of Mistral AI models vary depending on their version: 

  • Open-mistral-7b: $0.25/million tokens (input) and $0.25/million tokens (output).
  • Open-mixtral-8x7b: $0.70/million tokens (input) and $0.70/million tokens (output).
  • Mistral-small-2402: $2/million tokens (input) and $6/million tokens (output).
  • Mistral-medium: $2.70/million tokens (input) and $8.10/million tokens (output).
  • Mistral-large-2402: $8/million tokens (input) and $24/million tokens (output).
  • Mistral-embed: $0.10/million tokens (input).
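Per-token billing is easy to estimate in advance. The sketch below mirrors the prices listed above (USD per one million tokens); always check Mistral's current pricing page before budgeting, as rates and model names change.

```python
# Rough cost estimate for a single request under per-token billing.
# Prices are (input $/1M tokens, output $/1M tokens), as listed above.
PRICES = {
    "open-mistral-7b":    (0.25, 0.25),
    "open-mixtral-8x7b":  (0.70, 0.70),
    "mistral-small-2402": (2.00, 6.00),
    "mistral-medium":     (2.70, 8.10),
    "mistral-large-2402": (8.00, 24.00),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request for the given model."""
    in_price, out_price = PRICES[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# Example: a mistral-large-2402 call with a 10k-token prompt and a
# 2k-token reply costs 10_000*8/1e6 + 2_000*24/1e6 = $0.128.
print(f"${estimate_cost('mistral-large-2402', 10_000, 2_000):.3f}")  # → $0.128
```

Note how output tokens dominate the bill for the larger models: on Mistral Large, each output token costs three times as much as an input token.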

Once a user selects a model, accepts the Terms of Service and Privacy Policy, and adds a payment method, the model becomes available on the Mistral AI platform.
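After setup, models on La Plateforme are reachable over an HTTP API. Below is a minimal sketch of a single-turn request, assuming Mistral's chat-completions endpoint and payload shape at the time of writing; consult the official API reference for current model names and fields. The prompt text is a made-up example.

```python
import json
import os
import urllib.request

# Assumed endpoint for Mistral's chat completions API; verify in the docs.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(prompt: str, model: str = "mistral-large-2402") -> dict:
    """Assemble the JSON body for a single-turn chat request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Summarize the MMLU benchmark in one sentence.")

# The key is issued on La Plateforme after adding a payment method.
api_key = os.environ.get("MISTRAL_API_KEY")
if api_key:
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    # Dry run without a key: just show the request body.
    print(json.dumps(payload, indent=2))
```

The same payload shape works for the other models in the pricing list by swapping the `model` field.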

Mistral AI and Microsoft

As mentioned, Mistral AI models are also available through Microsoft Azure. Microsoft recently announced that Mistral AI’s premium models are available to customers through the Models as a Service (MaaS) in the Azure AI Studio and Azure Machine Learning model catalog — which also contains OpenAI models.

What is Le Chat?

Like most top AI companies, Mistral offers a free public chatbot: Le Chat. Anyone can take the tech out for a spin and see what it can do by simply signing up and accessing the model.

What Can Le Chat Do?

In essence, Le Chat is a multilingual conversational AI assistant based on Mistral models.

Le Chat is now in beta access, and the company considers it an entry point for users to interact with the various models from Mistral AI. Let’s look at some of its key features:

Access to Three Models

Interestingly, the Le Chat interface allows users to run Le Chat on three models: Mistral Large, the flagship LLM; Mistral Next, a prototype model aimed at conciseness; and Mistral Small, a faster and more cost-effective option.

Mode switching in Le Chat between three models. (Techopedia/Mistral)

Sensitive Content Warnings

Mistral moderation mechanisms are advanced and tunable. This allows Le Chat to warn users when they are moving the conversation in a direction that may lead to the generation of sensitive or controversial content.

State-of-the-art AI Performance

Mistral claims its LLM ranks second among all models and provides top-tier reasoning capabilities.

Le Chat: Security & Privacy

Mistral AI encrypts data at rest (AES-256) and in transit (TLS 1.2+). Additionally, the company does not use user data to train its public models. Through private self-deployment technologies, the company also ensures business data stays private.

Le Chat Content Support

Le Chat is not a multi-modal AI system like Microsoft Copilot or Google’s Gemini. This means it cannot interpret, analyze, and generate different types of content simultaneously.

However, Le Chat does have some excellent capabilities. It can:

  • Answer factual questions.
  • Chat with users, including small talk and chit-chat.
  • Give explanations of concepts and topics.
  • Summarize texts.
  • Translate languages.
  • Generate creative writing, such as stories and poetry.
  • Offer opinions and recommendations.
  • Generate technical writing, such as code and documentation.
  • Classify customer reviews.
  • Generate marketing campaigns and more.

As Mistral AI moves forward, it will likely design and release new AI models capable of understanding and working with audio, images, and perhaps video.

We asked Mistral AI to write a creative application using Python. (Techopedia/Mistral)

Mistral Business: Self-Deployment, Customization, and Tuning

Business leaders who see potential in the free version of Le Chat can sign up for Mistral AI Enterprise.

Mistral AI offers, as mentioned, three models: Mistral Large, Mistral Small, and Mistral Embed. Companies that already use Mistral AI models include Brave, Cloudflare, MongoDB, Hugging Face, BNP Paribas, Orange, and others.

One of the most impressive features of Mistral AI is self-deployment. This technology benefits businesses and organizations by accelerating deployment and increasing access to the technology for all types of users.

Additionally, the models are extremely customizable, and companies can integrate policies by fine-tuning moderation mechanisms offered by Mistral AI.

Early Reviews

The launch of Le Chat and Mistral AI’s Large LLM has been well received by the tech media.

Some users on sites like Reddit have complained that the software is not free and open-source. However, Mistral AI has never claimed to be a non-profit company.

The Bottom Line

Whether or not Mistral AI models are better than ChatGPT is not really the question. Mistral AI has its own business strategy and technology and is moving forward in an independent direction. While comparisons of AI metrics and performance are inevitable, they are not the only factors to consider when using an AI.

Mistral’s user-friendly features, like advanced moderation and self-deployment, are rare in the market and drive accessibility by lowering the technical-skills bar companies usually face when deploying and operating complex LLMs.

Furthermore, with Microsoft’s backing, Mistral’s founders have capitalized on powerful partners and established business relationships with leading companies.

Mistral AI is only getting started, and it opens up the AI “top players” conversation, revealing that a path exists beyond OpenAI, Microsoft, or Google, and creating a healthier AI market where competition drives quality and everyone benefits.



Ray Fernandez
Senior Technology Journalist

Ray is an independent journalist with 15 years of experience, focusing on the intersection of technology with various aspects of life and society. He joined Techopedia in 2023 after publishing in numerous media, including Microsoft, TechRepublic, Moonlock, Hackermoon, VentureBeat, Entrepreneur, and ServerWatch. He holds a degree in Journalism from Oxford Distance Learning, and two specializations from FUNIBER in Environmental Science and Oceanography. When Ray is not working, you can find him making music, playing sports, and traveling with his wife and three kids.