News from the AI & ML world
Jaime Hampton@AIwire
Cerebras Systems is expanding its role in AI inference with a new partnership with Hugging Face and the launch of six new AI datacenters across North America and Europe. The partnership with Hugging Face integrates Cerebras' inference capabilities into the Hugging Face Hub, granting access to the platform's five million developers. This integration allows developers to use Cerebras as their inference provider for models like Llama 3.3 70B, powered by the Cerebras CS-3 systems.
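In practice, the integration surfaces through Hugging Face's provider-routing API, where a request can be pinned to a specific inference provider. The sketch below is a minimal illustration only: the router URL, the `:cerebras` model suffix, and the `HF_TOKEN` environment variable are assumptions based on Hugging Face's OpenAI-compatible Inference Providers interface, so check the current documentation before relying on them.

```python
import json
import os
import urllib.request

# Assumed endpoint for Hugging Face's OpenAI-compatible provider router;
# verify against current Hugging Face Inference Providers docs.
API_URL = "https://router.huggingface.co/v1/chat/completions"


def build_request(prompt: str, provider: str = "cerebras") -> dict:
    """Build an OpenAI-style chat payload pinned to one inference provider."""
    return {
        # The ":provider" suffix (an assumption here) selects which
        # provider serves the model -- e.g. Cerebras for Llama 3.3 70B.
        "model": f"meta-llama/Llama-3.3-70B-Instruct:{provider}",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


def run(prompt: str) -> str:
    """Send the request; requires a valid HF_TOKEN in the environment."""
    payload = build_request(prompt)
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ['HF_TOKEN']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Inspect the payload without making a network call.
    print(build_request("Summarize wafer-scale inference in one sentence.")["model"])
```

The same selection can also be made through the `huggingface_hub` client library by passing a provider argument, which hides the routing details shown above.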
The six new inference datacenters will be located in Dallas, Minneapolis, Oklahoma City, Montreal, New York, and France, with 85% of the total capacity in the United States. Once fully operational, the facilities are expected to significantly increase Cerebras' capacity for high-speed inference workloads, supporting over 40 million Llama 70B tokens per second.
References:
- venturebeat.com: Cerebras just announced 6 new AI datacenters that process 40M tokens per second — and it could be bad news for Nvidia
- AIwire: Cerebras Scales AI Inference with Hugging Face Partnership and Datacenter Expansion
- THE DECODER: Nvidia rival Cerebras opens six data centers for rapid AI inference
Classification:
- HashTags: #Cerebras #HuggingFace #AIInference
- Company: Cerebras
- Target: AI Developers
- Product: CS-3
- Feature: AI Inference
- Type: AI
- Severity: Informative