Mistral releases its AI assistant on iOS and Android
Mistral, the company sometimes considered Europe’s great hope for AI, is shipping several updates to its AI assistant, Le Chat. In addition to a major upgrade to the web interface, the company is releasing a mobile app on iOS and Android.
As a reminder, Mistral develops its own large language models. The company’s flagship models, such as Mistral Large and its multimodal model Pixtral Large, are available for commercial use through an API or through cloud partners such as Azure AI Studio, Amazon Bedrock and Google’s Vertex AI. It also releases a number of open-weight models under the Apache 2.0 license.
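For developers, access to the commercial models looks like a standard chat-completions request against Mistral’s hosted API. Here is a minimal sketch in Python, assuming you have an API key from Mistral’s developer platform and that the `mistral-large-latest` model alias and `https://api.mistral.ai/v1/chat/completions` endpoint documented by Mistral are available on your plan:

```python
# Minimal sketch of a chat request to Mistral's hosted API.
# Assumes an API key in the MISTRAL_API_KEY environment variable;
# endpoint and model name follow Mistral's published chat-completions docs.
import os
import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistral-large-latest",
        "messages": [
            {"role": "user", "content": "Summarize today's AI news in two sentences."}
        ],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```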
Mistral aims to position itself as a credible alternative to OpenAI and Anthropic. In addition to developing foundation models, it offers an AI assistant called Le Chat that competes directly with ChatGPT, Claude, Google Gemini and Microsoft Copilot.
That’s why Mistral is finally releasing a mobile app for Le Chat, so it can better compete for a coveted spot on your phone’s home screen. The app features the usual chatbot interface: you can query Mistral’s AI model and ask follow-up questions in a simple, conversation-like flow.
Over the last few months, Le Chat has evolved into a competent AI assistant. In November 2024, Mistral added support for web search with citations. Le Chat also lets you generate images or interact with a free-form canvas to edit text or code in a separate window.
More recently, the company signed a wide-ranging deal with Agence France-Presse (AFP) to ground results in a reliable source of information. However, Mistral’s mobile app has no voice mode, which may disappoint people who rely heavily on voice interactions with AI assistants.
![Screenshot 2025 02 06 at 4.01.02PM](https://techcrunch.com/wp-content/uploads/2025/02/Screenshot-2025-02-06-at-4.01.02PM.png?w=680)
With today’s update to Le Chat, Mistral is also introducing a Pro tier for $14.99 per month, or €14.99 per month in Europe. While Mistral no longer details the exact AI model running under the hood, it says the Pro plan comes with access to the “highest-performing model,” which suggests that free accounts don’t get the top-of-the-line model.
Other benefits include higher limits across the board and the ability to opt out of sharing your data with Mistral — the company is bringing the pay-or-consent model to AI assistants.
Up to 1,000 words per second
So what makes Mistral stand out? The company doesn’t claim to have better models than its rivals, but it asserts that, from a product standpoint, it can sometimes outperform them.
For instance, Mistral says Le Chat runs on “the fastest inference engines on the planet” and can answer at up to 1,000 words per second. In our usage, it did indeed feel faster than ChatGPT running the 4o model.
Mistral also claims that Le Chat generates much better images than ChatGPT or Grok. The reason it performs well on this front is that it relies on one of the leading image-generation models, Black Forest Labs’ Flux Ultra.
Those are nice-to-have features, but one of the main distinguishing factors is Le Chat’s enterprise offering: Mistral lets you deploy Le Chat in your own environment, with custom models and a custom user interface.
If you work in defense or banking, you may need to deploy an AI assistant on premises, and that’s currently not possible with ChatGPT Enterprise or Claude Enterprise. That capability should help Mistral grow its revenue.
But first, let’s see if Mistral’s mobile app can persuade some AI enthusiasts to try its chatbot. As of this writing, ChatGPT, DeepSeek, and Google Gemini hold the #2, #3, and #6 spots, respectively, among the most downloaded iPhone apps in the U.S.