French AI startup Mistral is releasing a new AI model, Mistral Medium 3, that’s focused on efficiency without compromising performance.
Available through Mistral’s API at $0.40 per million input tokens and $2 per million output tokens, Mistral Medium 3 performs “at or above” 90% of Anthropic’s costlier Claude 3.7 Sonnet model on “benchmarks across the board,” Mistral claims. It also surpasses recent open models, including Meta’s Llama 4 Maverick and Cohere’s Command A, on popular AI performance evaluations.
Tokens are the raw bits of data that models work with; a million tokens is equivalent to about 750,000 words (roughly 163,000 words longer than “War and Peace”).
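To put those rates in concrete terms, here’s a back-of-the-envelope sketch in Python; the request sizes are hypothetical examples, and the arithmetic simply applies the per-million-token prices quoted above.

```python
# Rough cost estimate for a single Mistral Medium 3 API call at the quoted rates.
# The token counts used in the example are hypothetical.

INPUT_PRICE_PER_M = 0.40   # dollars per million input tokens
OUTPUT_PRICE_PER_M = 2.00  # dollars per million output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated dollar cost of one request."""
    return (
        input_tokens / 1_000_000 * INPUT_PRICE_PER_M
        + output_tokens / 1_000_000 * OUTPUT_PRICE_PER_M
    )

# Example: a 2,000-token prompt that produces a 500-token answer
print(f"${estimate_cost(2_000, 500):.4f}")  # ≈ $0.0018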
“Mistral Medium 3 can […] be deployed on any cloud, including self-hosted environments of four GPUs and above,” explained Mistral in a blog post sent to TechCrunch. “On pricing, the model beats cost leaders such as DeepSeek v3, both in API and self-deployed systems.”

Mistral, founded in 2023, is a frontier model lab aiming to build a range of AI-powered services, including its chatbot platform, Le Chat, and mobile apps. Backed by VCs including General Catalyst, the company has raised over €1.1 billion (roughly $1.24 billion) to date. Its customers include BNP Paribas, AXA, and Mirakl.
According to Mistral, Mistral Medium 3 is best for coding and STEM tasks, and excels at multimodal understanding. The company says that clients in financial services, energy, and healthcare have been beta testing the model for use cases like customer service, workflow automation, and analyzing complex data sets.
In addition to Mistral’s API, through which enterprise customers can work with Mistral to fine-tune the model, Mistral Medium 3 is available on Amazon’s SageMaker platform starting Wednesday. It’ll soon come to other hosts, including Microsoft’s Azure AI Foundry and Google’s Vertex AI platforms, the company added.
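For developers wondering what that looks like in practice, here’s a minimal sketch of a call to Mistral’s hosted chat completions endpoint; the model identifier “mistral-medium-latest” and the example prompt are assumptions, and a real integration would need your own API key and error handling.

```python
# Minimal sketch of a call to Mistral's hosted API using its documented
# chat completions endpoint. The model name below is an assumption.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "mistral-medium-latest",  # assumed identifier for Mistral Medium 3
        "messages": [
            {"role": "user", "content": "Summarize this quarter's support tickets."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```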
The launch of Mistral Medium 3 follows on the heels of Mistral Small 3.1, released in March. In its blog post, the company teased the release of a much larger model in the coming weeks.
Mistral on Wednesday also launched Le Chat Enterprise, a corporate-focused chatbot service that offers tools like an AI “agent” builder and integrates Mistral’s models with third-party services like Gmail, Google Drive, and SharePoint. Le Chat Enterprise rolled out in private preview earlier this year, but today marks its general availability.
Le Chat Enterprise will soon support MCP, Anthropic’s Model Context Protocol, a standard for connecting AI assistants to the systems and software where data resides. Other major AI model providers, including Google and OpenAI, said earlier this year that they would adopt MCP.
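In practical terms, an MCP integration means standing up a server that exposes tools and data sources an assistant can call. The sketch below uses the official MCP Python SDK’s FastMCP helper; the document-lookup tool is a made-up placeholder, not anything Mistral or Anthropic has shipped.

```python
# Minimal sketch of an MCP server exposing one tool, using the official
# Python SDK's FastMCP helper. The stubbed document lookup is hypothetical.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("document-lookup")

@mcp.tool()
def find_document(query: str) -> str:
    """Return the title of the document best matching the query (stubbed)."""
    # A real server would search a system of record such as Drive or SharePoint.
    return f"Best match for '{query}': Q3-sales-report.pdf"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```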
Updated 7:48 a.m. Pacific: Fixed a pricing mistake; the version of Mistral’s blog post TechCrunch received had a typo. We regret the error.