News from the AI & ML world

DeeperML - #g-assist

Isha Salian@NVIDIA Blog // 37d
NVIDIA is pushing the boundaries of artificial intelligence with a focus on multimodal generative AI and tools that ease AI model integration. NVIDIA Research presented over 70 papers at the International Conference on Learning Representations (ICLR) in Singapore, spanning generative AI, robotics, autonomous driving, and healthcare and underscoring the company's commitment to innovation across the AI spectrum. Bryan Catanzaro, vice president of applied deep learning research at NVIDIA, emphasized the company's aim to accelerate every level of the computing stack to amplify the impact and utility of AI across industries.

Research efforts at NVIDIA are not limited to theoretical advances; the company is also building tools that streamline the integration of AI models into real-world applications. One notable example is NVIDIA NIM microservices, which researchers at the University College London (UCL) Deciding, Acting, and Reasoning with Knowledge (DARK) Lab are using to benchmark agentic LLM and VLM reasoning for gaming. These microservices simplify the deployment and scaling of AI models, letting researchers handle workloads of any size and customize models for specific needs.

NVIDIA's NIM microservices are designed to change how researchers and developers deploy and scale AI models, offering a streamlined path to GPU-accelerated inference. They simplify the running of AI inference workloads by shipping pre-optimized engines such as NVIDIA TensorRT and NVIDIA TensorRT-LLM, which deliver low-latency, high-throughput performance, and they offer easy, fast API integration with standard frontends such as the OpenAI API or LangChain in Python environments.
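As an illustration of that OpenAI-compatible integration, the sketch below sends a chat request to a NIM endpoint using the standard openai Python client. The base URL, model name, and API key are assumptions for the example and depend on which microservice is deployed or which hosted endpoint is used.

    # Minimal sketch: calling a NIM microservice through its OpenAI-compatible API.
    # The base_url, api_key, and model below are placeholders, not fixed values.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
        api_key="YOUR_API_KEY",               # or an NVIDIA API key for a hosted endpoint
    )

    response = client.chat.completions.create(
        model="meta/llama-3.1-8b-instruct",   # example model; depends on the deployed NIM
        messages=[{"role": "user", "content": "Summarize the rules of chess in two sentences."}],
        temperature=0.2,
    )
    print(response.choices[0].message.content)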

References:
  • developer.nvidia.com: Researchers from the University College London (UCL) Deciding, Acting, and Reasoning with Knowledge (DARK) Lab leverage NVIDIA NIM microservices in their new research on benchmarking agentic LLM and VLM reasoning for gaming.
  • BigDATAwire: Nvidia is actively involved in research related to multimodal generative AI, including efforts to improve the reasoning capabilities of LLM and VLM models for use in gaming.
Classification:
@developer.nvidia.com // 38d
NVIDIA is enhancing AI capabilities on RTX AI PCs with a new plug-in builder for G-Assist. The tool lets users customize and extend G-Assist's functionality by connecting it to various large language models (LLMs) and software applications, invoking AI-assisted features through text and voice commands. In effect, it turns G-Assist from a gaming-centric assistant into a versatile tool adaptable to applications both gaming-related and otherwise.

The G-Assist plug-in builder supports custom commands and connections to external tools through APIs, allowing different software and services to communicate with one another. Developers build plug-ins using JSON for the plug-in definitions and Python for the underlying logic. NVIDIA provides a GitHub repository with instructions and documentation for building and customizing these plug-ins, and users can submit their creations for potential inclusion in the repository to share new capabilities with others. Example plug-ins include asking AI assistants such as Gemini for advice on gaming strategies and using a Twitch plug-in to check a streamer's status via voice command.
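The exact plug-in schema lives in NVIDIA's GitHub repository; purely to illustrate the pattern of a command backed by a Python handler, the hypothetical sketch below answers an "is this streamer live?" query against the Twitch Helix API. The function name and wiring are invented for the example and are not G-Assist's actual plug-in interface.

    # Hypothetical handler sketch (not the actual G-Assist plug-in API): a text or
    # voice command is routed to a Python function that queries an external service.
    import requests

    TWITCH_STREAMS_API = "https://api.twitch.tv/helix/streams"

    def check_streamer_status(streamer: str, client_id: str, oauth_token: str) -> str:
        """Return a one-line answer saying whether a Twitch streamer is live."""
        resp = requests.get(
            TWITCH_STREAMS_API,
            params={"user_login": streamer},
            headers={"Client-ID": client_id, "Authorization": f"Bearer {oauth_token}"},
            timeout=10,
        )
        resp.raise_for_status()
        is_live = bool(resp.json().get("data"))  # Helix returns stream objects only while live
        return f"{streamer} is live right now." if is_live else f"{streamer} is offline."

In a real plug-in, a JSON definition would map the spoken or typed command to a handler like this; the repository's documentation specifies the actual fields and invocation protocol.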

Furthermore, NVIDIA is advancing AI research and applications across industries, as demonstrated by its participation in the International Conference on Learning Representations (ICLR), where NVIDIA Research presented over 70 papers covering areas such as autonomous vehicles, healthcare, and multimodal content creation. Notably, researchers from University College London (UCL) are leveraging NVIDIA NIM microservices to benchmark the agentic capabilities of AI models in gaming environments, highlighting NIM's role in simplifying and accelerating the evaluation of AI reasoning on complex tasks. NIM microservices enable efficient deployment and scaling of AI models across platforms and workflows, making them a versatile option for diverse research applications.
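On the workflow side, here is a brief sketch of driving a NIM endpoint from LangChain, one of the frontends mentioned above, assuming the langchain-nvidia-ai-endpoints integration package is installed and a microservice is reachable locally; the model name and URL are placeholders for the example.

    # Sketch: querying a NIM endpoint via LangChain's NVIDIA integration.
    from langchain_nvidia_ai_endpoints import ChatNVIDIA

    llm = ChatNVIDIA(
        model="meta/llama-3.1-8b-instruct",   # example model; depends on the deployed NIM
        base_url="http://localhost:8000/v1",  # assumed local NIM endpoint
    )

    reply = llm.invoke("Suggest an opening strategy for a turn-based strategy game.")
    print(reply.content)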

References:
  • blogs.nvidia.com: NVIDIA Research at ICLR — Pioneering the Next Wave of Multimodal Generative AI
  • www.tomshardware.com: Nvidia introduces G-Assist plug-in builder, allowing its AI to integrate with LLMs and software
Classification: