News from the AI & ML world

DeeperML

@simonwillison.net //
Mistral has launched Codestral 25.01, a significantly upgraded code generation model. The new version has a 256k token context window, the largest of any Mistral model to date, and is reported to be twice as fast as its predecessor at generating and completing code. The model supports over 80 programming languages and is designed for low-latency, high-frequency use cases such as fill-in-the-middle completion, code correction, and test generation. Mistral reports strong benchmark results for the model, though independent evaluations paint a more mixed picture.
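
The snippet below sketches what a fill-in-the-middle request to Codestral might look like over Mistral's API. The endpoint path, payload fields, and the "codestral-latest" model identifier are assumptions based on Mistral's public API conventions, not details taken from the article.

```python
# Sketch of a fill-in-the-middle (FIM) request to Codestral via Mistral's API.
# Endpoint, field names, and model identifier are assumptions, not from the article.
import os
import requests

API_URL = "https://api.mistral.ai/v1/fim/completions"  # assumed FIM endpoint
API_KEY = os.environ["MISTRAL_API_KEY"]                 # assumed env var for the key

payload = {
    "model": "codestral-latest",                        # assumed model name
    "prompt": "def fibonacci(n: int) -> int:\n    ",    # code before the gap
    "suffix": "\n    return fibonacci(n - 1) + fibonacci(n - 2)",  # code after the gap
    "max_tokens": 64,
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
# Assumed response shape: chat-completion-style JSON with a choices list.
print(response.json()["choices"][0]["message"]["content"])
```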

While Codestral 25.01 is not available as open weights, it can be accessed via an API or through IDE partners. The model achieved a joint first-place score on the Copilot Arena leaderboard alongside Claude 3.5 Sonnet and Deepseek V2.5 (FIM). However, it scored only 11% on the aider polyglot benchmark. Developers can try Codestral for free via plugins for VS Code or JetBrains. It is also available through Simon Willison's llm-mistral plugin for the LLM tool, using the 'codestral' alias.
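
For readers who use the LLM library, here is a minimal sketch of prompting the model through the llm-mistral plugin. It assumes the plugin is installed (pip install llm llm-mistral) and a Mistral API key has been configured with the tool; the 'codestral' alias is the one mentioned above, while the setup steps are assumptions about the plugin's usual workflow.

```python
# Sketch: prompting Codestral through the llm Python library with the
# llm-mistral plugin installed and a Mistral API key already configured.
import llm

model = llm.get_model("codestral")  # alias registered by the llm-mistral plugin
response = model.prompt("Write a Python function that reverses a string.")
print(response.text())
```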



References:
  • Simon Willison's Weblog: Brand new code-focused model from Mistral. Unlike the original Codestral, this one isn't available as open weights.
  • Analytics Vidhya: Codestral 25.01 is here, and it’s a meaningful upgrade for developers.
  • mistral.ai: Mistral AI released a new version of its Codestral code generation model.
  • THE DECODER: French AI startup Mistral has released Codestral 25.01, an updated version of its code generation model.
  • Analytics Vidhya: Codestral 25.01: AI that Codes Faster than you can Say “Syntax Error”
  • THE DECODER: Mistral releases updated code model Codestral 25.01