New Relic, the all-in-one observability platform for every engineer, announced that it is integrating its platform with NVIDIA NIM inference microservices to reduce the complexity and cost of developing, deploying, and monitoring generative AI (GenAI) applications. Now, customers can use New Relic AI monitoring to gain end-to-end visibility across the entire AI stack for applications built with NVIDIA NIM, with simplified configuration and while safeguarding data security. This complements the robust security features and ease of use of self-hosted NVIDIA NIM models, which accelerate the delivery of generative AI applications. Together, New Relic and NVIDIA NIM help customers adopt AI faster and achieve a faster ROI.
Observability is crucial for implementing cost-effective and efficient models
Organizations are rapidly adopting generative AI to improve digital experiences, increase productivity and grow revenue. Gartner predicts that by 2026, more than 80% of enterprises will use GenAI or deploy GenAI applications. Rapid implementation and faster ROI are crucial for organizations to gain market advantage, and observability is key. It offers a holistic, real-time view of the AI application stack – across services, infrastructure and the AI layer – to ensure efficient, reliable and cost-effective operation.
New Relic accelerates ROI for AI applications built with NVIDIA NIM
AI applications can complicate technology stacks, raise security concerns, and be costly. New Relic AI Monitoring provides a comprehensive view of your AI stack, along with key metrics for throughput, latency and cost, while ensuring data privacy. It also tracks request flows across services and models to reveal the inner workings of AI applications. New Relic extends its in-depth monitoring to NVIDIA NIM, supporting a wide range of AI models including Databricks DBRX, Google Gemma, Meta Llama 3, Microsoft Phi-3, Mistral Large, Mixtral 8x22B, and Snowflake Arctic. This helps organizations deploy AI applications built with NVIDIA NIM with confidence, accelerate time to market, and improve return on investment.
Key features and use cases for AI monitoring include:
- Full visibility of your AI stack: Detect issues faster with a holistic view of your applications, NVIDIA GPU-based infrastructure, the AI layer, response quality, token counts, and key APM signals.
- Deep insights into every response: Fix performance and quality issues such as bias, toxicity, and hallucinations by tracking the entire AI response lifecycle.
- Model inventory: Easily isolate model-related performance, error, and cost issues by tracking key metrics across all NVIDIA NIM inference microservices in one place.
- Model comparison: Compare the performance of NVIDIA NIM inference microservices running in production in a single view to optimize model selection based on your infrastructure and user needs.
- Deep GPU information: Analyze critical accelerated processing metrics such as GPU utilization, temperature, and performance states; understand context and resolve issues faster.
- Increased data security: In addition to the security benefits of NVIDIA's self-hosted models, New Relic allows you to exclude sensitive data (PII) in AI requests and responses from monitoring (a minimal configuration sketch follows this list).
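For illustration, the sketch below shows one way an application calling a self-hosted NIM endpoint might be instrumented for New Relic AI monitoring. The endpoint URL, model name, and configuration option names referenced in the comments are assumptions for this example, not confirmed by the announcement; the exact settings vary by agent version and are documented by New Relic.

```python
# Hypothetical sketch: monitoring an app that calls a self-hosted NVIDIA NIM
# endpoint (OpenAI-compatible API) with the New Relic Python agent.
# Endpoint URL, model name, and config option names are illustrative assumptions.

import newrelic.agent
from openai import OpenAI

# Initialize the New Relic agent from a config file with AI monitoring turned on
# (for example, an ai_monitoring.enabled setting; a content-recording toggle can
# keep prompt/response text out of telemetry for data-privacy needs).
newrelic.agent.initialize("newrelic.ini")

# NIM microservices expose an OpenAI-compatible endpoint; point the client at it.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

@newrelic.agent.background_task(name="nim-demo")
def ask(prompt: str) -> str:
    # With AI monitoring enabled, the agent records latency, token counts, and
    # response metadata for each LLM call made inside this transaction.
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # example NIM model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask("Summarize the benefits of observability for GenAI apps."))
```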
New Relic deepens its ecosystem of 60+ AI integrations with NVIDIA
This integration follows New Relic's recent addition to the NVIDIA AIOps partner ecosystem. Leveraging NVIDIA accelerated computing, New Relic combines observability and AI to streamline IT operations and accelerate innovation with machine learning and its generative AI assistant, New Relic AI. New Relic offers the most comprehensive observability solution, with over 60 AI integrations, including NVIDIA GPUs and NVIDIA Triton Inference Server software.