News from the AI & ML world
MiniMaxAI, a Chinese AI company, has launched MiniMax-M1, a large-scale open-source reasoning model, marking a significant step in the open-source AI landscape. Released on the first day of the "MiniMaxWeek" event, MiniMax-M1 is designed to compete with leading models like OpenAI's o3, Claude 4, and DeepSeek-R1. Alongside the model, MiniMax has released a beta version of an agent capable of running code, building applications, and creating presentations. MiniMax-M1 presents a flexible option for organizations looking to experiment with or scale up advanced AI capabilities while managing costs.
MiniMax-M1 boasts a 1 million token context window and utilizes a new, highly efficient reinforcement learning technique. The model comes in two variants, MiniMax-M1-40k and MiniMax-M1-80k. Built on a Mixture-of-Experts (MoE) architecture, the model has 456 billion total parameters. MiniMax has also introduced Lightning Attention for M1, which dramatically reduces inference cost: at a generation length of 100,000 tokens, the model consumes only 25% of the floating point operations (FLOPs) required by DeepSeek R1.
Available on AI code-sharing platforms such as Hugging Face and GitHub, MiniMax-M1 is released under the Apache 2.0 license, allowing businesses to freely use, modify, and deploy it in commercial applications without restrictions or payment. MiniMax-M1 features web search functionality and can handle multimodal input such as text, images, and presentations. Its expansive context window lets the model process an amount of text in a single exchange roughly equivalent to a small book series, far exceeding OpenAI's GPT-4o, which has a context window of 128,000 tokens.
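Since the weights are published on Hugging Face under Apache 2.0, a minimal sketch of loading the model with the Hugging Face transformers library might look like the following. The repository ID MiniMaxAI/MiniMax-M1-80k, the use of a chat template, and the hardware assumptions are illustrative guesses based on the release described above, not verified details of MiniMax's published setup.

```python
# Minimal sketch: loading MiniMax-M1 from Hugging Face (repo ID assumed).
# The model is very large (456B total parameters), so this assumes multi-GPU
# hardware and uses device_map="auto" to shard the weights across devices.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MiniMaxAI/MiniMax-M1-80k"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",       # keep the checkpoint's native precision
    device_map="auto",        # shard across available GPUs
    trust_remote_code=True,   # custom architecture code (MoE + Lightning Attention)
)

messages = [{"role": "user", "content": "Summarize the MiniMax-M1 release."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same weights can also be served through inference engines that support long-context models; the snippet above only illustrates the standard transformers loading path.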
References:
- AI News | VentureBeat: MiniMax-M1 presents a flexible option for organizations looking to experiment with or scale up advanced AI capabilities while managing costs.
- Analytics Vidhya: The Chinese AI company, MiniMaxAI, has just launched a large-scale open-source reasoning model, named MiniMax-M1. The model, released on Day 1 of the 5-day MiniMaxWeek event, appears to give strong competition to OpenAI o3, Claude 4, DeepSeek-R1, and other contemporaries.
- The Rundown AI: PLUS: MiniMax’s new open-source reasoner with 1M token context
- www.marktechpost.com: MiniMax AI Releases MiniMax-M1: A 456B Parameter Hybrid Model for Long-Context and Reinforcement Learning (RL) Tasks
Classification:
- HashTags: #MiniMaxAI #OpenSourceAI #ReasoningModel
- Company: MiniMax
- Target: AI Community
- Product: MiniMax-M1
- Feature: Open Source AI Model
- Type: AI
- Severity: Informative