{"cells": [{"cell_type": "markdown", "id": "23e5dc2d", "metadata": {}, "source": ["\"在\n"]}, {"cell_type": "markdown", "id": "4d3e1610-942d-458e-8379-ebb1fe88ac2c", "metadata": {}, "source": ["# 聊天引擎 - OpenAI 代理模式\n"]}, {"cell_type": "markdown", "id": "3a240df6", "metadata": {}, "source": ["如果您在Colab上打开这个笔记本,您可能需要安装LlamaIndex 🦙。\n"]}, {"cell_type": "code", "execution_count": null, "id": "db0f97f4", "metadata": {}, "outputs": [], "source": ["%pip install llama-index-llms-openai"]}, {"cell_type": "code", "execution_count": null, "id": "bacff25a", "metadata": {}, "outputs": [], "source": ["!pip install llama-index"]}, {"attachments": {}, "cell_type": "markdown", "id": "e8c1c82b", "metadata": {}, "source": ["## 下载数据\n"]}, {"cell_type": "code", "execution_count": null, "id": "f4ad2c64", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["--2023-11-20 14:52:58-- https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.110.133, 185.199.108.133, 185.199.109.133, ...\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.110.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 75042 (73K) [text/plain]\n", "Saving to: ‘data/paul_graham/paul_graham_essay.txt’\n", "\n", "data/paul_graham/pa 100%[===================>] 73.28K --.-KB/s in 0.02s \n", "\n", "2023-11-20 14:52:58 (2.86 MB/s) - ‘data/paul_graham/paul_graham_essay.txt’ saved [75042/75042]\n", "\n"]}], "source": ["!mkdir -p 'data/paul_graham/'\n", "!wget 'https://raw.githubusercontent.com/run-llama/llama_index/main/docs/docs/examples/data/paul_graham/paul_graham_essay.txt' -O 'data/paul_graham/paul_graham_essay.txt'"]}, {"cell_type": "markdown", "id": "7d3cb595-1bd4-446e-93ba-6e3cfc1d4a29", "metadata": {}, "source": ["### 五行代码即可开始\n"]}, {"cell_type": "markdown", "id": "941f4d3a-b84c-409d-b371-85c33ab7b68f", "metadata": {}, "source": ["加载数据并建立索引\n"]}, {"cell_type": "code", "execution_count": null, "id": "4a258574-a4d1-42cc-9f1a-bc6f5d4c6a37", "metadata": {}, "outputs": [], "source": ["from llama_index.core import VectorStoreIndex,SimpleDirectoryReader\n", "from llama_index.llms.openai import OpenAI\n", "\n", "# 必须使用支持函数调用API的最新OpenAI模型\n", "llm = OpenAI(model=\"gpt-3.5-turbo-0613\")\n", "data = SimpleDirectoryReader(input_dir=\"../data/paul_graham/\").load_data()\n", "index = VectorStoreIndex.from_documents(data)"]}, {"cell_type": "markdown", "id": "9e0dc626-c877-422f-913a-afd3d3c8cdc8", "metadata": {}, "source": ["配置聊天引擎\n"]}, {"cell_type": "code", "execution_count": null, "id": "37717d64-851d-46c2-b64c-2f5efaaa37f9", "metadata": {}, "outputs": [], "source": ["chat_engine = index.as_chat_engine(chat_mode=\"openai\", llm=llm, verbose=True)"]}, {"cell_type": "markdown", "id": "d5b86747-22ca-4626-9df7-ff123ac57883", "metadata": {}, "source": ["与您的数据交谈\n"]}, {"cell_type": "code", "execution_count": null, "id": "4f09367b-375a-47d3-b0b4-0753b52d834d", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "Hello! How can I assist you today?\n"]}], "source": ["response = chat_engine.chat(\"Hi\")\n", "print(response)"]}, {"cell_type": "code", "execution_count": null, "id": "5b4b6686-21cd-4974-b767-4ebad4cefb36", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: query_engine_tool with args: {\n", " \"input\": \"Who did Paul Graham hand over YC to?\"\n", "}\n", "Got output: Paul Graham handed over YC to Sam Altman.\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "Paul Graham handed over Y Combinator (YC) to Sam Altman.\n"]}], "source": ["response = chat_engine.chat(\n", " \"Use the tool to answer: Who did Paul Graham hand over YC to?\"\n", ")\n", "print(response)"]}, {"cell_type": "code", "execution_count": null, "id": "774ad72e-8d9d-48c4-b50b-9eac3b21839e", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: query_engine_tool with args: {\n", " \"input\": \"Who did Paul Graham hand over YC to?\"\n", "}\n", "Got output: Paul Graham handed over YC to Sam Altman.\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n", "\n"]}], "source": ["response = chat_engine.stream_chat(\n", " \"Use the tool to answer: Who did Paul Graham hand over YC to?\"\n", ")\n", "print(response)"]}, {"cell_type": "markdown", "id": "ff69f4d8-4c58-46ae-8229-d18d97d31c43", "metadata": {}, "source": ["### 强制聊天引擎查询索引\n"]}, {"cell_type": "markdown", "id": "d2e61f30-0e4b-40a7-90de-ca9d48310bb1", "metadata": {}, "source": ["注意:这是“openai”聊天模式特有的功能(它在内部使用`OpenAIAgent`)。\n"]}, {"cell_type": "code", "execution_count": null, "id": "f57c745b-8bda-40fe-875e-f560a9a2dc0b", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["STARTING TURN 1\n", "---------------\n", "\n", "=== Calling Function ===\n", "Calling function: query_engine_tool with args: {\n", " \"input\": \"What did Paul Graham do growing up?\"\n", "}\n", "Got output: Growing up, Paul Graham worked on writing and programming. He wrote short stories and also tried his hand at programming on the IBM 1401 computer that his school district had. He later got a microcomputer, a TRS-80, and started programming more extensively, writing simple games and even a word processor.\n", "========================\n", "\n", "STARTING TURN 2\n", "---------------\n", "\n"]}], "source": ["response = chat_engine.chat(\n", " \"What did Paul Graham do growing up?\", tool_choice=\"query_engine_tool\"\n", ")"]}, {"cell_type": "code", "execution_count": null, "id": "67dccf41-9d60-464e-b0e4-f3444e9fa2fe", "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["Growing up, Paul Graham worked on writing and programming. He wrote short stories and also tried his hand at programming on the IBM 1401 computer that his school district had. He later got a microcomputer, a TRS-80, and started programming more extensively, writing simple games and even a word processor.\n"]}], "source": ["print(response)"]}], "metadata": {"kernelspec": {"display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3"}, "language_info": {"codemirror_mode": {"name": "ipython", "version": 3}, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3"}}, "nbformat": 4, "nbformat_minor": 5}