{ "cells": [ { "metadata": {}, "cell_type": "markdown", "source": [ "# OpenTelemetry with Koog: Tracing your AI agent\n", "\n", "This notebook demonstrates how to add OpenTelemetry-based tracing to a Koog AI agent. We will:\n", "- Emit spans to the console for quick local debugging.\n", "- Export spans to an OpenTelemetry Collector and view them in Jaeger.\n", "\n", "Prerequisites:\n", "- Docker/Docker Compose installed\n", "- An OpenAI API key available in the environment variable `OPENAI_API_KEY`\n", "\n", "Start the local OpenTelemetry stack (Collector + Jaeger) before running the notebook:\n", "```bash\n", "docker-compose up -d\n", "```\n", "After the agent runs, open the Jaeger UI:\n", "- http://localhost:16686\n", "\n", "To stop the services later:\n", "```bash\n", "docker-compose down\n", "```\n", "\n", "---" ] }, { "metadata": {}, "cell_type": "code", "outputs": [], "execution_count": null, "source": [ "%useLatestDescriptors\n", "// %use koog" ] }, { "metadata": {}, "cell_type": "code", "outputs": [], "execution_count": null, "source": [ "import ai.koog.agents.core.agent.AIAgent\n", "import ai.koog.agents.features.opentelemetry.feature.OpenTelemetry\n", "import ai.koog.prompt.executor.clients.openai.OpenAIModels\n", "import ai.koog.prompt.executor.llms.all.simpleOpenAIExecutor\n", "import io.opentelemetry.exporter.logging.LoggingSpanExporter\n", "import io.opentelemetry.exporter.otlp.trace.OtlpGrpcSpanExporter\n" ] }, { "metadata": {}, "cell_type": "markdown", "source": [ "## Configure OpenTelemetry exporters\n", "\n", "In the next cell, we:\n", "- Create a Koog `AIAgent`\n", "- Install the OpenTelemetry feature\n", "- Add two span exporters:\n", "  - `LoggingSpanExporter`, which prints spans to the console\n", "  - An OTLP gRPC exporter that sends spans to the Collector at http://localhost:4317\n", "\n", "This gives you both: console span logs for quick local debugging, and OTLP export so traces can be viewed in Jaeger.\n" ] }, { "metadata": {}, "cell_type": "code", "outputs": [], "execution_count": null, "source": [
"val agent = AIAgent(\n", "    executor = simpleOpenAIExecutor(System.getenv(\"OPENAI_API_KEY\")),\n", "    llmModel = OpenAIModels.Reasoning.GPT4oMini,\n", "    systemPrompt = \"You are a code assistant. Provide concise code examples.\"\n", ") {\n", "    install(OpenTelemetry) {\n", "        // Print spans to the console for local debugging\n", "        addSpanExporter(LoggingSpanExporter.create())\n", "\n", "        // Export spans to the local OpenTelemetry Collector via OTLP/gRPC\n", "        addSpanExporter(\n", "            OtlpGrpcSpanExporter.builder()\n", "                .setEndpoint(\"http://localhost:4317\")\n", "                .build()\n", "        )\n", "    }\n", "}" ] }, { "metadata": {}, "cell_type": "markdown", "source": [ "## Run the agent and view traces in Jaeger\n", "\n", "Execute the next cell to trigger a simple prompt. You should see:\n", "- Console span logs from the `LoggingSpanExporter`\n", "- Traces exported to your local OpenTelemetry Collector and visible in Jaeger at http://localhost:16686\n", "\n", "Tip: Use the Jaeger search to find recent traces after you run the cell.\n" ] }, { "metadata": {}, "cell_type": "code", "outputs": [], "execution_count": null, "source": [ "import ai.koog.agents.utils.use\n", "import kotlinx.coroutines.runBlocking\n", "\n", "runBlocking {\n", "    agent.use { agent ->\n", "        println(\"Running agent with OpenTelemetry tracing...\")\n", "\n", "        val result = agent.run(\"Tell me a joke about programming\")\n", "\n", "        \"Agent run completed with result: '$result'.\\nCheck Jaeger UI at http://localhost:16686 to view traces\"\n", "    }\n", "}" ] }, { "metadata": {}, "cell_type": "markdown", "source": [ "## Cleanup and troubleshooting\n", "\n", "When you're done:\n", "\n", "- Stop services:\n", "  ```bash\n", "  docker-compose down\n", "  ```\n", "\n", "- If you don't see traces in Jaeger:\n", "  - Ensure the stack is running: `docker-compose up -d` and give it a few seconds to start.\n", "  - Verify ports:\n", "    - Collector (OTLP gRPC): http://localhost:4317\n", "    - Jaeger UI: http://localhost:16686\n", "  - Check container logs: 
`docker-compose logs --tail=200`\n", " - Confirm your `OPENAI_API_KEY` is set in the environment where the notebook runs.\n", " - Make sure the endpoint in the exporter matches the collector: `http://localhost:4317`.\n", "\n", "- What spans to expect:\n", " - Koog agent lifecycle\n", " - LLM request/response metadata\n", " - Any tool execution spans (if you add tools)\n", "\n", "You can now iterate on your agent and observe changes in your tracing pipeline.\n" ] } ], "metadata": { "kernelspec": { "display_name": "Kotlin", "language": "kotlin", "name": "kotlin" }, "language_info": { "name": "kotlin", "version": "2.2.20-Beta2", "mimetype": "text/x-kotlin", "file_extension": ".kt", "pygments_lexer": "kotlin", "codemirror_mode": "text/x-kotlin", "nbconvert_exporter": "" }, "ktnbPluginMetadata": { "projectDependencies": [ "koog-agents.examples.main" ], "projectLibraries": false } }, "nbformat": 4, "nbformat_minor": 0 }