News from the AI & ML world

DeeperML

Ronen Dar @ NVIDIA Technical Blog
NVIDIA has announced the open-source release of the KAI Scheduler, a Kubernetes-native GPU scheduling solution. Available under the Apache 2.0 license, the KAI Scheduler was originally developed within the Run:ai platform. This initiative aims to foster an active and collaborative community by encouraging contributions, feedback, and innovation in AI infrastructure. The KAI Scheduler will continue to be packaged and delivered as part of the NVIDIA Run:ai platform.

NVIDIA's move to open source the KAI Scheduler addresses challenges in managing AI workloads on GPUs and CPUs that traditional resource schedulers often struggle to handle. The scheduler dynamically adapts to fluctuating GPU demand, reduces wait times for compute access by combining gang scheduling, GPU sharing, and a hierarchical queuing system, and connects seamlessly with AI tools and frameworks. By maximizing compute utilization through bin-packing and consolidation, and by spreading workloads across nodes, the KAI Scheduler reduces resource fragmentation.



References:
  • NVIDIA Technical Blog: NVIDIA Open Sources Run:ai Scheduler to Foster Community Collaboration
  • thenewstack.io: NVIDIA Open Sources KAI Scheduler To Help AI Teams Optimize GPU Utilization
  • AI News | VentureBeat: Nvidia open sources Run:ai Scheduler to foster community collaboration
  • The Stack: NVIDIA opens up GPU utilization tool for underworked AI infrastructure
Classification:
  • HashTags: #NVIDIA #OpenSource #KubeCon
  • Company: Nvidia
  • Target: AI developers
  • Product: Run:ai
  • Feature: GPU Scheduling
  • Type: AI
  • Severity: Informative