{"cells": [{"attachments": {}, "cell_type": "markdown", "metadata": {}, "source": ["\"在\n"]}, {"cell_type": "markdown", "metadata": {}, "source": ["AI21\n", "\n", "AI21是一个人工智能平台,旨在为用户提供各种人工智能相关的工具和服务。它包括自然语言处理、机器学习、数据分析和其他人工智能技术。AI21的目标是帮助用户利用人工智能技术解决各种问题,并提供高质量的人工智能解决方案。\n"]}, {"cell_type": "markdown", "metadata": {}, "source": ["## 基本用法\n"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### 使用提示调用`complete`\n"]}, {"attachments": {}, "cell_type": "markdown", "metadata": {}, "source": ["如果您在colab上打开这个笔记本,您可能需要安装LlamaIndex 🦙。\n"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["%pip install llama-index-llms-ai21"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["!pip install llama-index"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["from llama_index.llms.ai21 import AI21\n", "\n", "api_key = \"Your api key\"\n", "resp = AI21(api_key=api_key).complete(\"Paul Graham is \")"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\n", "an American computer scientist, essayist, and venture capitalist. He is best known for his work on Lisp, programming language design, and entrepreneurship. Graham has written several books on these topics, including \" ANSI Common Lisp\" and \" Hackers and Painters.\" He is also the co-founder of Y Combinator, a venture capital firm that invests in early-stage technology companies.\n"]}], "source": ["print(resp)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["#### 使用消息列表调用`chat`\n"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["from llama_index.core.llms import ChatMessage\n", "from llama_index.llms.ai21 import AI21\n", "\n", "messages = [\n", " ChatMessage(role=\"user\", content=\"hello there\"),\n", " ChatMessage(\n", " role=\"assistant\", content=\"Arrrr, matey! How can I help ye today?\"\n", " ),\n", " ChatMessage(role=\"user\", content=\"What is your name\"),\n", "]\n", "\n", "resp = AI21(api_key=api_key).chat(\n", " messages, preamble_override=\"You are a pirate with a colorful personality\"\n", ")"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["assistant: yer talkin' to Captain Jack Sparrow\n"]}], "source": ["print(resp)"]}, {"cell_type": "markdown", "metadata": {}, "source": ["## 配置模型\n"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["from llama_index.llms.ai21 import AI21\n", "\n", "llm = AI21(model=\"j2-mid\", api_key=api_key)"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": ["resp = llm.complete(\"Paul Graham is \")"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\n", "an American computer scientist, essayist, and venture capitalist. He is best known for his work on Lisp, programming language design, and entrepreneurship. 
{"cell_type": "markdown", "metadata": {}, "source": ["## Set API Key at a per-instance level\n", "If desired, you can have separate LLM instances use separate API keys.\n"]}, {"cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\n", "an American computer scientist, essayist, and venture capitalist. He is best known for his work on Lisp, programming language design, and entrepreneurship. Graham has written several books on these topics, including \"Hackers and Painters\" and \"On Lisp.\" He is also the co-founder of Y Combinator, a venture capital firm that invests in early-stage technology companies.\n"]}, {"name": "stderr", "output_type": "stream", "text": ["Calling POST https://api.ai21.com/studio/v1/j2-mid/complete failed with a non-200 response code: 401\n"]}, {"ename": "Unauthorized", "evalue": "Failed with http status code: 401 (Unauthorized). Details: {\"detail\":\"Forbidden: Bad or missing API token.\"}", "output_type": "error", "traceback": ["\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[0;31mUnauthorized\u001b[0m Traceback (most recent call last)", "\u001b[1;32m/home/amit/Desktop/projects/lindex/llama_index/docs/examples/llm/ai21.ipynb Cell 14\u001b[0m line \u001b[0;36m9\n\u001b[1;32m 6\u001b[0m resp \u001b[39m=\u001b[39m llm_good\u001b[39m.\u001b[39mcomplete(\u001b[39m\"\u001b[39m\u001b[39mPaul Graham is \u001b[39m\u001b[39m\"\u001b[39m)\n\u001b[1;32m 7\u001b[0m \u001b[39mprint\u001b[39m(resp)\n\u001b[0;32m----> 9\u001b[0m resp \u001b[39m=\u001b[39m llm_bad\u001b[39m.\u001b[39;49mcomplete(\u001b[39m\"\u001b[39;49m\u001b[39mPaul Graham is \u001b[39;49m\u001b[39m\"\u001b[39;49m)\n\u001b[1;32m 10\u001b[0m \u001b[39mprint\u001b[39m(resp)\n", "File \u001b[0;32m~/Desktop/projects/lindex/llama_index/llama_index/llms/base.py:312\u001b[0m, in \u001b[0;36mllm_completion_callback..wrap..wrapped_llm_predict\u001b[0;34m(_self, *args, **kwargs)\u001b[0m\n\u001b[1;32m 302\u001b[0m \u001b[39mwith\u001b[39;00m wrapper_logic(_self) \u001b[39mas\u001b[39;00m callback_manager:\n\u001b[1;32m 303\u001b[0m event_id \u001b[39m=\u001b[39m callback_manager\u001b[39m.\u001b[39mon_event_start(\n\u001b[1;32m 304\u001b[0m CBEventType\u001b[39m.\u001b[39mLLM,\n\u001b[1;32m 305\u001b[0m payload\u001b[39m=\u001b[39m{\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 309\u001b[0m },\n\u001b[1;32m 310\u001b[0m )\n\u001b[0;32m--> 312\u001b[0m f_return_val \u001b[39m=\u001b[39m f(_self, \u001b[39m*\u001b[39;49margs, \u001b[39m*\u001b[39;49m\u001b[39m*\u001b[39;49mkwargs)\n\u001b[1;32m 313\u001b[0m \u001b[39mif\u001b[39;00m \u001b[39misinstance\u001b[39m(f_return_val, Generator):\n\u001b[1;32m 314\u001b[0m \u001b[39m# intercept the generator and add a callback to the end\u001b[39;00m\n\u001b[1;32m 315\u001b[0m \u001b[39mdef\u001b[39;00m \u001b[39mwrapped_gen\u001b[39m() \u001b[39m-\u001b[39m\u001b[39m>\u001b[39m CompletionResponseGen:\n", "File \u001b[0;32m~/Desktop/projects/lindex/llama_index/llama_index/llms/ai21.py:104\u001b[0m, in \u001b[0;36mAI21.complete\u001b[0;34m(self, prompt, **kwargs)\u001b[0m\n\u001b[1;32m 100\u001b[0m \u001b[39mimport\u001b[39;00m \u001b[39mai21\u001b[39;00m\n\u001b[1;32m 102\u001b[0m ai21\u001b[39m.\u001b[39mapi_key \u001b[39m=\u001b[39m 
\u001b[39mself\u001b[39m\u001b[39m.\u001b[39m_api_key\n\u001b[0;32m--> 104\u001b[0m response \u001b[39m=\u001b[39m ai21\u001b[39m.\u001b[39;49mCompletion\u001b[39m.\u001b[39;49mexecute(\u001b[39m*\u001b[39;49m\u001b[39m*\u001b[39;49mall_kwargs, prompt\u001b[39m=\u001b[39;49mprompt)\n\u001b[1;32m 106\u001b[0m \u001b[39mreturn\u001b[39;00m CompletionResponse(\n\u001b[1;32m 107\u001b[0m text\u001b[39m=\u001b[39mresponse[\u001b[39m\"\u001b[39m\u001b[39mcompletions\u001b[39m\u001b[39m\"\u001b[39m][\u001b[39m0\u001b[39m][\u001b[39m\"\u001b[39m\u001b[39mdata\u001b[39m\u001b[39m\"\u001b[39m][\u001b[39m\"\u001b[39m\u001b[39mtext\u001b[39m\u001b[39m\"\u001b[39m], raw\u001b[39m=\u001b[39mresponse\u001b[39m.\u001b[39m\u001b[39m__dict__\u001b[39m\n\u001b[1;32m 108\u001b[0m )\n", "File \u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/modules/resources/nlp_task.py:22\u001b[0m, in \u001b[0;36mNLPTask.execute\u001b[0;34m(cls, **params)\u001b[0m\n\u001b[1;32m 20\u001b[0m \u001b[39mreturn\u001b[39;00m \u001b[39mcls\u001b[39m\u001b[39m.\u001b[39m_execute_sm(destination\u001b[39m=\u001b[39mdestination, params\u001b[39m=\u001b[39mparams)\n\u001b[1;32m 21\u001b[0m \u001b[39mif\u001b[39;00m \u001b[39misinstance\u001b[39m(destination, AI21Destination):\n\u001b[0;32m---> 22\u001b[0m \u001b[39mreturn\u001b[39;00m \u001b[39mcls\u001b[39;49m\u001b[39m.\u001b[39;49m_execute_studio_api(params)\n\u001b[1;32m 24\u001b[0m \u001b[39mraise\u001b[39;00m WrongInputTypeException(key\u001b[39m=\u001b[39mDESTINATION_KEY, expected_type\u001b[39m=\u001b[39mDestination, given_type\u001b[39m=\u001b[39m\u001b[39mtype\u001b[39m(destination))\n", "File \u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/modules/completion.py:69\u001b[0m, in \u001b[0;36mCompletion._execute_studio_api\u001b[0;34m(cls, params)\u001b[0m\n\u001b[1;32m 65\u001b[0m url \u001b[39m=\u001b[39m \u001b[39mf\u001b[39m\u001b[39m'\u001b[39m\u001b[39m{\u001b[39;00murl\u001b[39m}\u001b[39;00m\u001b[39m/\u001b[39m\u001b[39m{\u001b[39;00mcustom_model\u001b[39m}\u001b[39;00m\u001b[39m'\u001b[39m\n\u001b[1;32m 67\u001b[0m url \u001b[39m=\u001b[39m \u001b[39mf\u001b[39m\u001b[39m'\u001b[39m\u001b[39m{\u001b[39;00murl\u001b[39m}\u001b[39;00m\u001b[39m/\u001b[39m\u001b[39m{\u001b[39;00m\u001b[39mcls\u001b[39m\u001b[39m.\u001b[39mMODULE_NAME\u001b[39m}\u001b[39;00m\u001b[39m'\u001b[39m\n\u001b[0;32m---> 69\u001b[0m \u001b[39mreturn\u001b[39;00m execute_studio_request(task_url\u001b[39m=\u001b[39;49murl, params\u001b[39m=\u001b[39;49mparams)\n", "File \u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/modules/resources/execution_utils.py:11\u001b[0m, in \u001b[0;36mexecute_studio_request\u001b[0;34m(task_url, params, method)\u001b[0m\n\u001b[1;32m 9\u001b[0m \u001b[39mdef\u001b[39;00m \u001b[39mexecute_studio_request\u001b[39m(task_url: \u001b[39mstr\u001b[39m, params, method: \u001b[39mstr\u001b[39m \u001b[39m=\u001b[39m \u001b[39m'\u001b[39m\u001b[39mPOST\u001b[39m\u001b[39m'\u001b[39m):\n\u001b[1;32m 10\u001b[0m client \u001b[39m=\u001b[39m AI21StudioClient(\u001b[39m*\u001b[39m\u001b[39m*\u001b[39mparams)\n\u001b[0;32m---> 11\u001b[0m \u001b[39mreturn\u001b[39;00m client\u001b[39m.\u001b[39;49mexecute_http_request(method\u001b[39m=\u001b[39;49mmethod, url\u001b[39m=\u001b[39;49mtask_url, params\u001b[39m=\u001b[39;49mparams)\n", "File 
\u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/ai21_studio_client.py:52\u001b[0m, in \u001b[0;36mAI21StudioClient.execute_http_request\u001b[0;34m(self, method, url, params, files)\u001b[0m\n\u001b[1;32m 51\u001b[0m \u001b[39mdef\u001b[39;00m \u001b[39mexecute_http_request\u001b[39m(\u001b[39mself\u001b[39m, method: \u001b[39mstr\u001b[39m, url: \u001b[39mstr\u001b[39m, params: Optional[Dict] \u001b[39m=\u001b[39m \u001b[39mNone\u001b[39;00m, files\u001b[39m=\u001b[39m\u001b[39mNone\u001b[39;00m):\n\u001b[0;32m---> 52\u001b[0m response \u001b[39m=\u001b[39m \u001b[39mself\u001b[39;49m\u001b[39m.\u001b[39;49mhttp_client\u001b[39m.\u001b[39;49mexecute_http_request(method\u001b[39m=\u001b[39;49mmethod, url\u001b[39m=\u001b[39;49murl, params\u001b[39m=\u001b[39;49mparams, files\u001b[39m=\u001b[39;49mfiles)\n\u001b[1;32m 53\u001b[0m \u001b[39mreturn\u001b[39;00m convert_to_ai21_object(response)\n", "File \u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/http_client.py:84\u001b[0m, in \u001b[0;36mHttpClient.execute_http_request\u001b[0;34m(self, method, url, params, files, auth)\u001b[0m\n\u001b[1;32m 82\u001b[0m \u001b[39mif\u001b[39;00m response\u001b[39m.\u001b[39mstatus_code \u001b[39m!=\u001b[39m \u001b[39m200\u001b[39m:\n\u001b[1;32m 83\u001b[0m log_error(\u001b[39mf\u001b[39m\u001b[39m'\u001b[39m\u001b[39mCalling \u001b[39m\u001b[39m{\u001b[39;00mmethod\u001b[39m}\u001b[39;00m\u001b[39m \u001b[39m\u001b[39m{\u001b[39;00murl\u001b[39m}\u001b[39;00m\u001b[39m failed with a non-200 response code: \u001b[39m\u001b[39m{\u001b[39;00mresponse\u001b[39m.\u001b[39mstatus_code\u001b[39m}\u001b[39;00m\u001b[39m'\u001b[39m)\n\u001b[0;32m---> 84\u001b[0m handle_non_success_response(response\u001b[39m.\u001b[39;49mstatus_code, response\u001b[39m.\u001b[39;49mtext)\n\u001b[1;32m 86\u001b[0m \u001b[39mreturn\u001b[39;00m response\u001b[39m.\u001b[39mjson()\n", "File \u001b[0;32m~/.cache/pypoetry/virtualenvs/llama-index-2x1vjWb5-py3.10/lib/python3.10/site-packages/ai21/http_client.py:23\u001b[0m, in \u001b[0;36mhandle_non_success_response\u001b[0;34m(status_code, response_text)\u001b[0m\n\u001b[1;32m 21\u001b[0m \u001b[39mraise\u001b[39;00m BadRequest(details\u001b[39m=\u001b[39mresponse_text)\n\u001b[1;32m 22\u001b[0m \u001b[39mif\u001b[39;00m status_code \u001b[39m==\u001b[39m \u001b[39m401\u001b[39m:\n\u001b[0;32m---> 23\u001b[0m \u001b[39mraise\u001b[39;00m Unauthorized(details\u001b[39m=\u001b[39mresponse_text)\n\u001b[1;32m 24\u001b[0m \u001b[39mif\u001b[39;00m status_code \u001b[39m==\u001b[39m \u001b[39m422\u001b[39m:\n\u001b[1;32m 25\u001b[0m \u001b[39mraise\u001b[39;00m UnprocessableEntity(details\u001b[39m=\u001b[39mresponse_text)\n", "\u001b[0;31mUnauthorized\u001b[0m: Failed with http status code: 401 (Unauthorized). 
Details: {\"detail\":\"Forbidden: Bad or missing API token.\"}"]}], "source": ["from llama_index.llms.ai21 import AI21\n", "\n", "llm_good = AI21(api_key=api_key)\n", "llm_bad = AI21(model=\"j2-mid\", api_key=\"BAD_KEY\")\n", "\n", "resp = llm_good.complete(\"Paul Graham is \")\n", "print(resp)\n", "\n", "resp = llm_bad.complete(\"Paul Graham is \")\n", "print(resp)"]}], "metadata": {"kernelspec": {"display_name": "llama-index-2x1vjWb5-py3.10", "language": "python", "name": "python3"}, "language_info": {"codemirror_mode": {"name": "ipython", "version": 3}, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3"}}, "nbformat": 4, "nbformat_minor": 2}