
The new integration gives organizations a clearer view of the costs and performance of their OpenAI usage.

This week Datadog announced a new integration to help monitor API usage patterns, costs and performance for various OpenAI models, including GPT-4 and other completion models. Datadog’s observability capabilities “simplify the process of data collection through tracing libraries so that customers can easily and quickly start monitoring their OpenAI usage”, the company claimed.
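In practice, instrumentation of this kind is typically enabled through Datadog’s Python tracing library, ddtrace. The snippet below is a minimal sketch of that pattern, assuming the ddtrace and openai packages are installed and a Datadog Agent is running locally; it is illustrative rather than a definitive setup guide, and the exact configuration is described in Datadog’s documentation.

```python
# Minimal sketch: enabling Datadog tracing around OpenAI calls with ddtrace.
# Assumes the ddtrace and openai packages are installed and a local Datadog
# Agent is running; exact configuration may differ per environment.
import openai
from ddtrace import patch

# Instrument the OpenAI library so requests, latencies and token usage
# can be reported to Datadog automatically.
patch(openai=True)

openai.api_key = "sk-..."  # placeholder key

response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Summarize our Q2 incident report."}],
)
print(response["choices"][0]["message"]["content"])
```

Applications are also commonly launched under the ddtrace-run wrapper rather than patching explicitly in code.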

Datadog’s Aaron Kaplan and Senior Product Manager Shri Subramanian detailed the new integration in a blog post. “Usage of OpenAI’s products is expanding fast”, they write. “As diverse teams and users experiment with and build upon the company’s models, it’s important for organizations to monitor and understand this usage”.

Usage costs, they explain, vary depending on the model queried and are based on the consumption of “tokens”: common sequences of characters that make up both prompts (the textual input) and completions (the corresponding output).
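To make the token model concrete, the short sketch below uses OpenAI’s open-source tiktoken library to count the tokens in a prompt and estimate a cost; the per-1K-token price shown is an assumed example rate, not an authoritative figure.

```python
# Sketch: counting tokens for a prompt with OpenAI's tiktoken library and
# estimating a cost. The price per 1K tokens below is illustrative only;
# actual rates vary by model and change over time.
import tiktoken

PROMPT = "Write a short status update for the on-call engineering team."
MODEL = "gpt-4"
PRICE_PER_1K_PROMPT_TOKENS = 0.03  # assumed example rate, not authoritative

encoding = tiktoken.encoding_for_model(MODEL)
tokens = encoding.encode(PROMPT)

print(f"Prompt tokens: {len(tokens)}")
print(f"Estimated prompt cost: ${len(tokens) / 1000 * PRICE_PER_1K_PROMPT_TOKENS:.5f}")
```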

Datadog’s OpenAI integration comes with an out-of-the-box dashboard that helps managers understand usage trends throughout their organization by breaking down API requests by OpenAI model, service, organization ID, and API key.

Unique organization IDs can be assigned to individual teams, enabling admins to track where, and to what extent, OpenAI’s models are used across their organization. Tracking by API key helps customers break usage down by individual user, so that spikes in usage and costs can be attributed to their source. It also enables admins to trace unauthorized API access back to specific keys and users, they add.
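One way such attribution can work in practice is to issue each team or user its own credentials and set them on the client, as in the hypothetical sketch below (shown with the openai Python SDK’s 0.x-style interface; the organization ID and API key are placeholders, not real values).

```python
# Sketch: attributing OpenAI usage to a team and user by setting the
# organization ID and API key on the client (openai Python SDK 0.x style).
# The ID and key shown are placeholders for illustration only.
import openai

openai.organization = "org-data-science-team"  # hypothetical per-team org ID
openai.api_key = "sk-analyst-jane-doe"         # hypothetical per-user key

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Draft a one-line summary of today's sales figures.",
    max_tokens=50,
)
print(response["choices"][0]["text"])
```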

Managing API costs and performance

“In addition to tracking usage patterns, it’s important to track the overall rate and volume of your OpenAI usage in order to ensure that you don’t breach the API’s rate limits, which apply to both requests per minute and token usage per minute”, Kaplan and Subramanian explain. Datadog’s integration offers recommended monitors to allow admins to proactively track these metrics and avoid encountering rate-limit errors or excessive latencies, they say.
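On the application side, a common complement to such monitors is retrying with exponential backoff when a rate-limit error does occur. The sketch below shows that pattern using the openai Python SDK’s 0.x-style error classes; the retry count and delays are illustrative, not prescriptive.

```python
# Sketch: retrying with exponential backoff when the OpenAI API returns a
# rate-limit error (openai Python SDK 0.x). Assumes the API key is supplied
# via the OPENAI_API_KEY environment variable. Retry counts and delays are
# illustrative values only.
import time

import openai


def complete_with_backoff(prompt, retries=5, base_delay=1.0):
    for attempt in range(retries):
        try:
            return openai.ChatCompletion.create(
                model="gpt-4",
                messages=[{"role": "user", "content": prompt}],
            )
        except openai.error.RateLimitError:
            # Wait progressively longer each time the rate limit is hit.
            time.sleep(base_delay * 2 ** attempt)
    raise RuntimeError("Exceeded retry budget for OpenAI request")
```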

In addition to tracking internal usage patterns and costs, Datadog’s integration helps monitor the performance of the OpenAI API, the post continues. “Various metrics track API error rates and response times, and the out-of-the-box dashboard supplies this data for each of the models in use and the services and organizations using them”, they add.
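Teams that want to supplement the out-of-the-box metrics with their own measurements can also emit custom metrics via DogStatsD. The sketch below records a hypothetical request-duration metric tagged by model and outcome; the metric and tag names are assumptions for illustration, not the integration’s built-in names.

```python
# Sketch: emitting a custom latency metric via DogStatsD (the `datadog`
# Python package). Assumes a local Datadog Agent listening on port 8125 and
# OPENAI_API_KEY set in the environment. Metric and tag names are hypothetical.
import time

import openai
from datadog import initialize, statsd

initialize(statsd_host="localhost", statsd_port=8125)

start = time.time()
try:
    openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "ping"}],
    )
    outcome = "success"
except openai.error.OpenAIError:
    outcome = "error"

statsd.histogram(
    "custom.openai.request_duration",
    time.time() - start,
    tags=["model:gpt-4", f"outcome:{outcome}"],
)
```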