{
"cells": [
{
"cell_type": "markdown",
"id": "2722b419",
"metadata": {},
"source": [
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/openlayer-ai/openlayer-python/blob/main/examples/tracing/openai/openai_responses_api_tracing.ipynb)\n",
"\n",
"\n",
"# <a id=\"top\">OpenAI Responses API monitoring with Openlayer</a>\n",
"\n",
"This notebook shows how to monitor both OpenAI's Chat Completions API and the new Responses API with Openlayer. The same `trace_openai()` function supports both APIs seamlessly."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "020c8f6a",
"metadata": {},
"outputs": [],
"source": [
"!pip install openlayer openai"
]
},
{
"cell_type": "markdown",
"id": "75c2a473",
"metadata": {},
"source": [
"## 1. Set the environment variables"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f3f4fa13",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"import openai\n",
"\n",
"# OpenAI API key\n",
"os.environ[\"OPENAI_API_KEY\"] = \"YOUR_OPENAI_API_KEY_HERE\"\n",
"\n",
"# Openlayer configuration\n",
"os.environ[\"OPENLAYER_API_KEY\"] = \"YOUR_OPENLAYER_API_KEY_HERE\"\n",
"os.environ[\"OPENLAYER_INFERENCE_PIPELINE_ID\"] = \"YOUR_OPENLAYER_INFERENCE_PIPELINE_ID_HERE\""
]
},
{
"cell_type": "markdown",
"id": "9758533f",
"metadata": {},
"source": [
"## 2. Create traced OpenAI client"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "c35d9860-dc41-4f7c-8d69-cc2ac7e5e485",
"metadata": {},
"outputs": [],
"source": [
"from openlayer.lib import trace_openai\n",
"\n",
"# Single function traces both Chat Completions AND Responses APIs\n",
"client = trace_openai(openai.OpenAI())"
]
},
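{
"cell_type": "markdown",
"id": "async-client-note",
"metadata": {},
"source": [
"If your application uses OpenAI's async client, the same wrapping pattern applies. The sketch below assumes your installed `openlayer` version also exposes `trace_async_openai`; otherwise the synchronous client above is all this notebook needs."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "async-client-example",
"metadata": {},
"outputs": [],
"source": [
"# Optional: trace the async client as well.\n",
"# Assumes the installed openlayer version exposes trace_async_openai.\n",
"from openlayer.lib import trace_async_openai\n",
"\n",
"async_client = trace_async_openai(openai.AsyncOpenAI())"
]
},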
{
"cell_type": "markdown",
"id": "72a6b954",
"metadata": {},
"source": [
"## 3. Use Chat Completions API (existing functionality)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e00c1c79",
"metadata": {},
"outputs": [],
"source": [
"# Chat Completions API - works exactly as before\n",
"response = client.chat.completions.create(\n",
" model=\"gpt-4o-mini\",\n",
" messages=[{\"role\": \"user\", \"content\": \"What is 2 + 2?\"}],\n",
" max_tokens=50\n",
")\n",
"\n",
"response.choices[0].message.content"
]
},
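{
"cell_type": "markdown",
"id": "multi-turn-note",
"metadata": {},
"source": [
"The traced client accepts the same arguments as a regular OpenAI client, so requests with system prompts and multi-turn message lists are traced in exactly the same way. The cell below is an illustrative example."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "multi-turn-example",
"metadata": {},
"outputs": [],
"source": [
"# Multi-turn request - traced like any other Chat Completions call\n",
"response = client.chat.completions.create(\n",
"    model=\"gpt-4o-mini\",\n",
"    messages=[\n",
"        {\"role\": \"system\", \"content\": \"You are a concise math tutor.\"},\n",
"        {\"role\": \"user\", \"content\": \"In one sentence, why does 2 + 2 equal 4?\"},\n",
"    ],\n",
"    max_tokens=50\n",
")\n",
"\n",
"response.choices[0].message.content"
]
},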
{
"cell_type": "markdown",
"id": "76a350b4",
"metadata": {},
"source": [
"## 4. Use Responses API (new unified interface)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "responses-api-example",
"metadata": {},
"outputs": [],
"source": [
"# Responses API - new unified interface with enhanced metadata\n",
"response = client.responses.create(\n",
" model=\"gpt-4o-mini\",\n",
" input=\"What is 3 + 3?\",\n",
" max_output_tokens=50\n",
")\n",
"\n",
"# Response is automatically traced\n",
"response"
]
},
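{
"cell_type": "markdown",
"id": "responses-output-text-note",
"metadata": {},
"source": [
"The Responses API returns a `Response` object. Recent versions of the `openai` Python SDK also expose the generated text through the `output_text` convenience property, which the cell below assumes is available."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "responses-output-text-example",
"metadata": {},
"outputs": [],
"source": [
"# Convenience property on recent openai-python versions:\n",
"# concatenates the text items from the response output.\n",
"response.output_text"
]
}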
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
}
},
"nbformat": 4,
"nbformat_minor": 5
}