{"cells": [{"cell_type": "markdown", "id": "24103c51", "metadata": {}, "source": ["\"在\n"]}, {"cell_type": "markdown", "id": "99cea58c-48bc-4af6-8358-df9695659983", "metadata": {}, "source": ["# 构建你自己的OpenAI代理\n"]}, {"cell_type": "markdown", "id": "673df1fe-eb6c-46ea-9a73-a96e7ae7942e", "metadata": {}, "source": ["使用支持函数调用的[新OpenAI API](https://openai.com/blog/function-calling-and-other-api-updates),构建自己的代理程序变得前所未有的简单!\n", "\n", "在这个笔记本教程中,我们展示了如何在**不到50行代码**的情况下编写自己的OpenAI代理程序!它非常简洁,但功能齐全(具有进行对话和使用工具的能力)。\n"]}, {"cell_type": "markdown", "id": "54b7bc2e-606f-411a-9490-fcfab9236dfc", "metadata": {}, "source": ["## 初始设置\n"]}, {"cell_type": "markdown", "id": "23e80e5b-aaee-4f23-b338-7ae62b08141f", "metadata": {}, "source": ["让我们从导入一些简单的基本模块开始。\n", "\n", "我们主要需要的是:\n", "1. OpenAI API(使用我们自己的 `llama_index` LLM 类)\n", "2. 保存对话历史的位置\n", "3. 我们的代理可以使用的工具的定义。\n"]}, {"cell_type": "markdown", "id": "41101795", "metadata": {}, "source": ["如果您在Colab上打开这个笔记本,您可能需要安装LlamaIndex 🦙。\n"]}, {"cell_type": "code", "execution_count": null, "id": "4985c578", "metadata": {}, "outputs": [], "source": ["%pip install llama-index-agent-openai\n", "%pip install llama-index-llms-openai"]}, {"cell_type": "code", "execution_count": null, "id": "c61c873d", "metadata": {}, "outputs": [], "source": ["!pip install llama-index"]}, {"cell_type": "code", "execution_count": null, "id": "9d47283b-025e-4874-88ed-76245b22f82e", "metadata": {}, "outputs": [], "source": ["import json\n", "from typing import Sequence, List\n", "\n", "from llama_index.llms.openai import OpenAI\n", "from llama_index.core.llms import ChatMessage\n", "from llama_index.core.tools import BaseTool, FunctionTool\n", "from openai.types.chat import ChatCompletionMessageToolCall\n", "\n", "import nest_asyncio\n", "\n", "nest_asyncio.apply()"]}, {"cell_type": "markdown", "id": "6fe08eb1-e638-4c00-9103-5c305bfacccf", "metadata": {}, "source": ["让我们为我们的代理人定义一些非常简单的计算器工具。\n"]}, {"cell_type": "code", "execution_count": null, "id": "3dd3c4a6-f3e0-46f9-ad3b-7ba57d1bc992", "metadata": {}, "outputs": [], "source": ["def multiply(a: int, b: int) -> int:\n", " \"\"\"将两个整数相乘,并返回结果整数\"\"\"\n", " return a * b\n", "\n", "\n", "multiply_tool = FunctionTool.from_defaults(fn=multiply)"]}, {"cell_type": "code", "execution_count": null, "id": "bfcfb78b-7d4f-48d9-8d4c-ffcded23e7ac", "metadata": {}, "outputs": [], "source": ["def add(a: int, b: int) -> int:\n", " \"\"\"对两个整数进行相加,并返回结果整数\"\"\"\n", " return a + b\n", "\n", "\n", "add_tool = FunctionTool.from_defaults(fn=add)"]}, {"cell_type": "markdown", "id": "fbcbd5ea-f377-44a0-a492-4568daa8b0b6", "metadata": {}, "source": ["## 代理(Agent)定义\n"]}, {"cell_type": "markdown", "id": "5b737e6c-64eb-4ae6-a8f7-350b1953e612", "metadata": {}, "source": ["现在,我们定义一个能够进行对话并在**不到50行代码**中调用工具的代理程序。\n", "\n", "代理程序逻辑的核心在于`chat`方法。在高层次上,有3个步骤:\n", "1. 调用OpenAI来决定是否调用工具以及使用什么参数。\n", "2. 使用参数调用工具以获得输出。\n", "3. 调用OpenAI根据对话上下文和工具输出来合成一个回复。\n", "\n", "`reset`方法只是简单地重置对话上下文,这样我们就可以开始另一个对话。\n"]}, {"cell_type": "code", "execution_count": null, "id": "a0e068f7-fd24-4f74-8243-5e6e4840f7a6", "metadata": {}, "outputs": [], "source": ["class YourOpenAIAgent:\n", " def __init__(\n", " self,\n", " tools: Sequence[BaseTool] = [],\n", " llm: OpenAI = OpenAI(temperature=0, model=\"gpt-3.5-turbo-0613\"),\n", " chat_history: List[ChatMessage] = [],\n", " ) -> None:\n", " self._llm = llm\n", " self._tools = {tool.metadata.name: tool for tool in tools}\n", " self._chat_history = chat_history\n", "\n", " def reset(self) -> None:\n", " self._chat_history = []\n", "\n", " def chat(self, message: str) -> str:\n", " chat_history = self._chat_history\n", " chat_history.append(ChatMessage(role=\"user\", content=message))\n", " tools = [\n", " tool.metadata.to_openai_tool() for _, tool in self._tools.items()\n", " ]\n", "\n", " ai_message = self._llm.chat(chat_history, tools=tools).message\n", " additional_kwargs = ai_message.additional_kwargs\n", " chat_history.append(ai_message)\n", "\n", " tool_calls = additional_kwargs.get(\"tool_calls\", None)\n", " # 现在支持并行函数调用\n", " if tool_calls is not None:\n", " for tool_call in tool_calls:\n", " function_message = self._call_function(tool_call)\n", " chat_history.append(function_message)\n", " ai_message = self._llm.chat(chat_history).message\n", " chat_history.append(ai_message)\n", "\n", " return ai_message.content\n", "\n", " def _call_function(\n", " self, tool_call: ChatCompletionMessageToolCall\n", " ) -> ChatMessage:\n", " id_ = tool_call.id\n", " function_call = tool_call.function\n", " tool = self._tools[function_call.name]\n", " output = tool(**json.loads(function_call.arguments))\n", " return ChatMessage(\n", " name=function_call.name,\n", " content=str(output),\n", " role=\"tool\",\n", " additional_kwargs={\n", " \"tool_call_id\": id_,\n", " \"name\": function_call.name,\n", " },\n", " )"]}, {"cell_type": "markdown", "id": "fbc2cec5-6cc0-4814-92a1-ca0bd237528f", "metadata": {}, "source": ["## 让我们试一试!\n"]}, {"cell_type": "code", "execution_count": null, "id": "08928f6e-610c-420b-8a7b-7a7042bbd6c6", "metadata": {}, "outputs": [], "source": ["agent = YourOpenAIAgent(tools=[multiply_tool, add_tool])"]}, {"cell_type": "code", "execution_count": null, "id": "e13f4f5d-0caf-42f0-ba3a-2c0890ed16af", "metadata": {}, "outputs": [{"data": {"text/plain": ["'Hello! How can I assist you today?'"]}, "execution_count": null, "metadata": {}, "output_type": "execute_result"}], "source": ["agent.chat(\"Hi\")"]}, {"cell_type": "code", "execution_count": null, "id": "b8f7650d-57b8-4ef4-b19d-651281ddb1be", "metadata": {}, "outputs": [{"data": {"text/plain": ["'The product of 2123 multiplied by 215123 is 456,706,129.'"]}, "execution_count": null, "metadata": {}, "output_type": "execute_result"}], "source": ["agent.chat(\"What is 2123 * 215123\")"]}, {"cell_type": "markdown", "id": "707d30b8-6405-4187-a9ed-6146dcc42167", "metadata": {}, "source": ["## 我们(稍微改进的)`OpenAIAgent` 实现\n"]}, {"cell_type": "markdown", "id": "798ca3fd-6711-4c0c-a853-d868dd14b484", "metadata": {}, "source": ["我们在LlamaIndex中提供了一个(稍微更好的)`OpenAIAgent`实现,您可以直接按照以下方式使用。\n", "\n", "与上面的简化版本相比:\n", "* 它实现了`BaseChatEngine`和`BaseQueryEngine`接口,因此您可以更无缝地在LlamaIndex框架中使用它。\n", "* 它支持每次对话轮多个函数调用\n", "* 它支持流式传输\n", "* 它支持异步端点\n", "* 它支持回调和追踪\n"]}, {"cell_type": "code", "execution_count": null, "id": "38ab3938-1138-43ea-b085-f430b42f5377", "metadata": {}, "outputs": [], "source": ["from llama_index.agent.openai import OpenAIAgent\n", "from llama_index.llms.openai import OpenAI"]}, {"cell_type": "code", "execution_count": null, "id": "d852ece7-e5a1-4368-9d59-c7014e0b5b4d", "metadata": {}, "outputs": [], "source": ["llm = OpenAI(model=\"gpt-3.5-turbo-0613\")\n", "agent = OpenAIAgent.from_tools(\n", " [multiply_tool, add_tool], llm=llm, verbose=True\n", ")"]}, {"cell_type": "markdown", "id": "500cbee4", "metadata": {}, "source": ["这是一个用Python编写的简单聊天程序。\n"]}, {"cell_type": "code", "execution_count": null, "id": "9fd1cad5", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: multiply with args: {\n", " \"a\": 121,\n", " \"b\": 3\n", "}\n", "Got output: 363\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: add with args: {\n", " \"a\": 363,\n", " \"b\": 42\n", "}\n", "Got output: 405\n", "========================\n", "\n", "STARTING TURN 3\n", "---------------\n", "\n", "(121 * 3) + 42 is equal to 405.\n"]}], "source": ["response = agent.chat(\"What is (121 * 3) + 42?\")\n", "print(str(response))"]}, {"cell_type": "code", "execution_count": null, "id": "538bf32f", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["[ToolOutput(content='363', tool_name='multiply', raw_input={'args': (), 'kwargs': {'a': 121, 'b': 3}}, raw_output=363), ToolOutput(content='405', tool_name='add', raw_input={'args': (), 'kwargs': {'a': 363, 'b': 42}}, raw_output=405)]\n"]}], "source": ["# 检查数据源\n", "print(response.sources)"]}, {"cell_type": "markdown", "id": "fb33983c", "metadata": {}, "source": ["### 异步聊天\n"]}, {"cell_type": "code", "execution_count": null, "id": "1d1fc974", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: multiply with args: {\n", " \"a\": 121,\n", " \"b\": 3\n", "}\n", "Got output: 363\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "121 multiplied by 3 is equal to 363.\n"]}], "source": ["response = await agent.achat(\"What is 121 * 3?\")\n", "print(str(response))"]}, {"cell_type": "markdown", "id": "aae035cb", "metadata": {}, "source": ["### 流式聊天\n", "在这里,每个LLM响应都作为生成器返回。您可以流式传输每个增量步骤,或者只获取最后的响应。\n"]}, {"cell_type": "code", "execution_count": null, "id": "14217fb2", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: multiply with args: {\n", " \"a\": 121,\n", " \"b\": 2\n", "}\n", "Got output: 242\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "121 multiplied by 2 is equal to 242.\n", "\n", "Once upon a time, in a small village, there was a group of mice who lived in a cozy little burrow. The mice were known for their intelligence and resourcefulness. They had built a tight-knit community and worked together to overcome any challenges they faced.\n", "\n", "One sunny day, as the mice were going about their daily activities, they stumbled upon a bountiful field of ripe corn. The field was filled with tall stalks of golden corn, swaying gently in the breeze. The mice couldn't believe their luck! They knew they had to gather as much corn as possible to sustain themselves through the upcoming winter.\n", "\n", "With their tiny paws and sharp teeth, the mice began to harvest the corn. They worked tirelessly, carrying one ear of corn at a time back to their burrow. The mice were determined to make the most of this opportunity and ensure they had enough food for everyone.\n", "\n", "As the days turned into weeks, the mice's hard work paid off. They had collected an impressive stash of corn, thanks to their diligent efforts and the abundance of the field. The mice celebrated their success, knowing that they had secured their survival for the winter.\n", "\n", "But the mice didn't stop there. They realized that they had more corn than they needed just for themselves. They decided to share their abundance with the other animals in the village who were struggling to find food. The mice knew the importance of community and believed in helping others in need.\n", "\n", "Word spread quickly about the generous mice and their corn. Animals from all around the village came to the mice's burrow, grateful for the assistance. The mice happily distributed the corn, ensuring that everyone had enough to eat.\n", "\n", "The mice's act of kindness and their ability to multiply their resources had a profound impact on the village. The animals learned the value of working together and supporting one another. The mice became a symbol of unity and compassion, inspiring others to follow their example.\n", "\n", "And so, the mice's story of multiplying their resources and spreading kindness became a legend in the village. The mice continued to thrive, not just because of their intelligence and resourcefulness, but also because of their big hearts and willingness to help others.\n", "\n", "The end."]}], "source": ["response = agent.stream_chat(\n", " \"What is 121 * 2? Once you have the answer, use that number to write a\"\n", " \" story about a group of mice.\"\n", ")\n", "\n", "response_gen = response.response_gen\n", "\n", "for token in response_gen:\n", " print(token, end=\"\")"]}, {"cell_type": "markdown", "id": "3fac119f", "metadata": {}, "source": ["### 异步流式聊天\n", "\n", "这是一个简单的异步流式聊天示例,使用Python的asyncio库和websockets库。该示例包括一个简单的服务器和一个基本的Web客户端,用于在浏览器中进行聊天。\n", "\n", "#### 服务器\n", "\n", "服务器使用asyncio和websockets库创建一个简单的WebSocket服务器,用于接收和广播消息。\n", "\n", "#### 客户端\n", "\n", "客户端是一个简单的HTML页面,使用JavaScript和WebSocket API来连接到服务器并发送/接收消息。\n", "\n", "#### 运行示例\n", "\n", "要运行此示例,请按照以下步骤操作:\n", "\n", "1. 安装所需的库:`pip install websockets`\n", "2. 运行服务器:`python async_chat_server.py`\n", "3. 在浏览器中打开`async_chat_client.html`,然后输入用户名并开始聊天。\n", "\n", "现在,您可以在浏览器中与自己进行简单的异步流式聊天了!\n"]}, {"cell_type": "code", "execution_count": null, "id": "33ea069f-819b-4ec1-a93c-fcbaacb362a1", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: add with args: {\n", " \"a\": 121,\n", " \"b\": 8\n", "}\n", "Got output: 129\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "121 plus 8 is equal to 129.\n", "\n", "Once upon a time, in a peaceful meadow, there lived a group of mice. These mice were known for their bravery and adventurous spirit. They loved exploring the meadow and discovering new places.\n", "\n", "One sunny day, as the mice were scurrying through the tall grass, they stumbled upon a hidden treasure. It was a small, sparkling gemstone that radiated with a mesmerizing glow. The mice were amazed by its beauty and knew that it was something special.\n", "\n", "Excitedly, the mice decided to take the gemstone back to their burrow. They carefully carried it, taking turns to ensure its safety. As they reached their cozy home, they marveled at the gemstone's brilliance. Little did they know, this gemstone held a magical power.\n", "\n", "As the mice gathered around the gemstone, a soft, enchanting light began to emanate from it. Suddenly, the mice felt a surge of energy and realized that they had been granted a special ability - the power to communicate with other animals.\n", "\n", "With their newfound power, the mice embarked on a mission to bring harmony and understanding among the creatures of the meadow. They started by reaching out to the birds, sharing their wisdom and learning about the secrets of the sky. The mice and birds formed a strong bond, exchanging stories and songs.\n", "\n", "Next, the mice approached the rabbits, teaching them about the importance of unity and cooperation. The rabbits, known for their agility, shared their knowledge of navigating the meadow and avoiding danger. Together, the mice and rabbits created a safe haven for all the animals.\n", "\n", "The mice's journey continued as they connected with the squirrels, teaching them the value of saving and planning for the future. The squirrels, in return, shared their knowledge of gathering food and surviving the harsh winters. The meadow became a place of abundance and security for all its inhabitants.\n", "\n", "As the seasons changed, the mice's influence spread throughout the meadow. Animals from all walks of life came together, forming a diverse and harmonious community. The mice's ability to bring different species together was a testament to their leadership and compassion.\n", "\n", "The gemstone, a symbol of unity and understanding, remained at the center of the mice's burrow. It served as a reminder of the power of collaboration and the importance of embracing diversity.\n", "\n", "And so, the mice's story of adding their strengths and bringing animals together became a legend in the meadow. The mice continued to explore, learn, and spread their message of unity, leaving a lasting impact on the meadow and its inhabitants.\n", "\n", "The end."]}], "source": ["response = await agent.astream_chat(\n", " \"What is 121 + 8? Once you have the answer, use that number to write a\"\n", " \" story about a group of mice.\"\n", ")\n", "\n", "response_gen = response.response_gen\n", "\n", "async for token in response.async_response_gen():\n", " print(token, end=\"\")"]}, {"cell_type": "markdown", "id": "2fe399c5-6d07-4926-b701-b612efd56b30", "metadata": {}, "source": ["### 具有个性的代理\n", "\n", "在这个示例中,我们将创建一个具有个性的代理,该代理将根据其个性特征做出不同的决策。我们将使用一个简单的示例来说明这个概念,该示例将根据代理的个性特征来选择不同的行为。\n", "\n", "首先,让我们定义代理的个性特征。代理的个性特征将包括以下几个方面:\n", "- 冒险程度(adventurousness):代理愿意尝试新的行为或策略的程度。\n", "- 谨慎程度(cautiousness):代理对于采取行动时的谨慎程度。\n", "- 礼貌程度(politeness):代理与其他代理互动时的礼貌程度。\n", "\n", "接下来,我们将根据代理的个性特征来定义其行为。例如,一个冒险程度高的代理可能更愿意尝试新的行为,而一个谨慎程度高的代理可能更倾向于选择较为保守的行为。\n", "\n", "让我们通过一个简单的示例来说明这个概念。\n"]}, {"cell_type": "markdown", "id": "8b47c034-f948-4604-a8d8-828b617ea245", "metadata": {}, "source": ["您可以指定一个系统提示,以便为代理人提供额外的指示或个性。\n"]}, {"cell_type": "code", "execution_count": null, "id": "bef36d1e-c26e-4b07-b3d0-3b7f314a45f5", "metadata": {}, "outputs": [], "source": ["from llama_index.agent.openai import OpenAIAgent\n", "from llama_index.llms.openai import OpenAI\n", "from llama_index.core.prompts.system import SHAKESPEARE_WRITING_ASSISTANT"]}, {"cell_type": "code", "execution_count": null, "id": "eba7fa46-1173-42f2-885c-0cc28df1cd2e", "metadata": {}, "outputs": [], "source": ["llm = OpenAI(model=\"gpt-3.5-turbo-0613\")\n", "\n", "agent = OpenAIAgent.from_tools(\n", " [multiply_tool, add_tool],\n", " llm=llm,\n", " verbose=True,\n", " system_prompt=SHAKESPEARE_WRITING_ASSISTANT,\n", ")"]}, {"cell_type": "code", "execution_count": null, "id": "c4841778-7008-4b61-afcc-995b6b64e91a", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "Greetings, fair traveler! How may I assist thee on this fine day?\n"]}], "source": ["response = agent.chat(\"Hi\")\n", "print(response)"]}, {"cell_type": "code", "execution_count": null, "id": "46a83768-1203-4485-a346-2fa78089afb1", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "Of course, dear friend! Allow me to weave a tale for thee in the style of Shakespeare. \n", "\n", "Once upon a time, in a land far away, there lived a noble knight named Sir William. He was known throughout the kingdom for his bravery and chivalry. One fateful day, as Sir William rode through the enchanted forest, he stumbled upon a hidden glade.\n", "\n", "In the glade, he discovered a beautiful maiden named Lady Rosalind. She was fair of face and gentle of heart, and Sir William was instantly captivated by her beauty. They spent hours conversing, sharing stories, and laughing together.\n", "\n", "As the days turned into weeks, Sir William and Lady Rosalind's bond grew stronger. They found solace in each other's company and realized that they had fallen deeply in love. However, their love was not without obstacles.\n", "\n", "Lady Rosalind's father, Lord Reginald, was a stern and overprotective man. He had already arranged a marriage for his daughter with a wealthy nobleman, Lord Percival. When Lady Rosalind confessed her love for Sir William, Lord Reginald was furious.\n", "\n", "Determined to be together, Sir William and Lady Rosalind devised a plan. They decided to elope under the cover of darkness, seeking refuge in a distant land where their love could flourish without hindrance. With heavy hearts, they bid farewell to their families and set off on their journey.\n", "\n", "Their path was treacherous, filled with perils and hardships. They faced raging storms, dangerous bandits, and even a fearsome dragon. But through it all, their love remained steadfast and unwavering.\n", "\n", "After many trials and tribulations, Sir William and Lady Rosalind finally reached their destination—a peaceful village nestled by the sea. They settled there, vowing to live a life of love and happiness.\n", "\n", "Years passed, and their love only grew stronger. They were blessed with children, who inherited their parents' noble qualities. Sir William and Lady Rosalind lived a long and fulfilling life, surrounded by the love of their family and the admiration of the villagers.\n", "\n", "And so, the tale of Sir William and Lady Rosalind serves as a reminder that true love can conquer all obstacles, even in the face of adversity. May their story inspire thee to follow thy heart and pursue love with unwavering determination.\n"]}], "source": ["response = agent.chat(\"Tell me a story\")\n", "print(response)"]}], "metadata": {"kernelspec": {"display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3"}, "language_info": {"codemirror_mode": {"name": "ipython", "version": 3}, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3"}}, "nbformat": 4, "nbformat_minor": 5}