{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "import panel as pn\n", "\n", "pn.extension()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `ChatFeed` is a mid-level layout that lets you manage a list of [`ChatMessage`](ChatMessage.ipynb) items.\n", "\n", "This layout provides backend methods to:\n", "- Send (append) messages to the chat log.\n", "- Stream tokens to the latest `ChatMessage` in the chat log.\n", "- Execute callbacks when a user sends a message.\n", "- Undo a number of sent `ChatMessage` objects.\n", "- Clear the chat log of all `ChatMessage` objects.\n", "\n", "See [`ChatInterface`](ChatInterface.ipynb) for a high-level, *easy-to-use*, *ChatGPT-like* interface.\n", "\n", "Check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/) docs to see applicable examples related to [LangChain](https://python.langchain.com/docs/get_started/introduction), [OpenAI](https://openai.com/blog/chatgpt), [Mistral](https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&ved=2ahUKEwjZtP35yvSBAxU00wIHHerUDZAQFnoECBEQAQ&url=https%3A%2F%2Fdocs.mistral.ai%2F&usg=AOvVaw2qpx09O_zOzSksgjBKiJY_&opi=89978449), [Llama](https://ai.meta.com/llama/), etc. If you have an example to demo, we'd love to add it to the panel-chat-examples gallery!\n", "\n", "#### Parameters:\n", "\n", "##### Core\n", "\n", "* **`objects`** (`List[ChatMessage]`): The messages added to the chat feed.\n", "* **`renderers`** (`List[Callable]`): A callable or list of callables that accept the value and return a Panel object to render the value. If a list is provided, will attempt to use the first renderer that does not raise an exception. If None, will attempt to infer the renderer from the value.\n", "* **`callback`** (callable): Callback to execute when a user sends a message or when `respond` is called. The signature must include the latest available message value `contents`, the latest available `user` name, and the chat `instance`.\n", "\n", "##### Styling\n", "\n", "* **`card_params`** (Dict): Parameters to pass to Card, such as `header`, `header_background`, `header_color`, etc.\n", "* **`message_params`** (Dict): Parameters to pass to each ChatMessage, such as `reaction_icons`, `timestamp_format`, `show_avatar`, `show_user`, and `show_timestamp`. Params passed that are not ChatFeed params will be forwarded into `message_params`.\n", "\n", "##### Other\n", "\n", "* **`header`** (Any): The header of the chat feed; commonly used for the title. Can be a string, pane, or widget.\n", "* **`callback_user`** (str): The default user name to use for the message provided by the callback.\n", "* **`callback_avatar`** (str, BytesIO, bytes, ImageBase): The default avatar to use for the message provided by the callback. Takes precedence over `ChatMessage.default_avatars` if set; else, if None, defaults to the avatar set in `ChatMessage.default_avatars` if a matching key exists. Otherwise defaults to the first character of the `callback_user`.\n", "* **`callback_exception`** (str, Callable): How to handle exceptions raised by the callback. If \"raise\", the exception will be raised. If \"summary\", a summary will be sent to the chat feed. If \"verbose\" or \"traceback\", the full traceback will be sent to the chat feed. If \"ignore\", the exception will be ignored. If a callable is provided, the signature must contain the `exception` and `instance` arguments and it will be called with the exception.\n", "* **`edit_callback`** (callable): Callback to execute when a user edits a message. The signature must include the edited message value `contents`, the `index` of the edited message, and the component `instance`.\n", "* **`help_text`** (str): If provided, initializes a chat message in the chat log using the provided help text as the message object and `help` as the user. This is useful for providing instructions, and will not be included in the `serialize` method by default.\n", "* **`placeholder_text`** (str): The text to display next to the placeholder icon.\n", "* **`placeholder_params`** (dict): Params to pass to the placeholder `ChatMessage`, like `reaction_icons`, `timestamp_format`, `show_avatar`, `show_user`, `show_timestamp`. Defaults to `{\"user\": \" \", \"reaction_icons\": {}, \"show_copy_icon\": False, \"show_timestamp\": False}`.\n", "* **`placeholder_threshold`** (float): Min duration in seconds of buffering before displaying the placeholder. If 0, the placeholder will be disabled. Defaults to 0.2.\n", "* **`post_hook`** (callable): A hook to execute after a new message is *completely* added, i.e. the generator is exhausted. The `stream` method will trigger this callback on every call. The signature must include the `message` and `instance` arguments.\n", "* **`auto_scroll_limit`** (int): Max pixel distance from the latest object in the Column to activate automatic scrolling upon update. Setting to 0 disables auto-scrolling.\n", "* **`scroll_button_threshold`** (int): Min pixel distance from the latest object in the Column to display the scroll button. Setting to 0 disables the scroll button.\n", "* **`load_buffer`** (int): The number of objects loaded on each side of the visible objects. When scrolled halfway into the buffer, the feed will automatically load additional objects while unloading objects on the opposite side.\n", "* **`show_activity_dot`** (bool): Whether to show an activity dot on the ChatMessage while streaming the callback response.\n", "* **`view_latest`** (bool): Whether to scroll to the latest object on init. If not enabled, the view will be on the first object. Defaults to True.\n", "\n", "#### Methods\n", "\n", "##### Core\n", "\n", "* **`send`**: Sends a value and creates a new message in the chat log. If `respond` is `True`, additionally executes the callback, if provided.\n", "* **`serialize`**: Exports the chat log as a list of dicts; primarily for use with `transformers`.\n", "* **`stream`**: Streams a token and updates the provided message, if provided. Otherwise creates a new message in the chat log, so be sure the returned message is passed back into the method, e.g. `message = chat.stream(token, message=message)`. This method is primarily for outputs that are not generators--notably LangChain. For most cases, use the `send` method instead.\n", "\n", "##### Other\n", "\n", "* **`clear`**: Clears the chat log and returns the messages that were cleared.\n", "* **`respond`**: Executes the callback with the latest message in the chat log. Typically called after streaming is completed, i.e. after a for loop where `stream` is called multiple times. If not streaming, use the `respond` keyword argument inside the `send` method instead.\n", "* **`trigger_post_hook`**: Triggers the post hook with the latest message in the chat log. Typically called after streaming is completed, i.e. after a for loop where `stream` is called multiple times. If not streaming, use the `trigger_post_hook` keyword argument inside the `send` method instead.\n", "* **`stop`**: Cancels the current callback task if possible.\n", "* **`scroll_to(index: int)`**: Scrolls the Column to the object at the specified index.\n", "* **`undo`**: Removes the last `count` messages from the chat log and returns them. Default `count` is 1.\n", "\n", "___" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Basics" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`ChatFeed` can be initialized without any arguments." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed()\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can send chat messages with the `send` method."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\n", "    \"Hello world!\",\n", "    user=\"Bot\",\n", "    avatar=\"B\",\n", "    footer_objects=[pn.widgets.Button(name=\"Footer Object\")],\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `send` method returns a [`ChatMessage`](ChatMessage.ipynb), which can display any object that Panel can render. You can **interact with chat messages** like any other Panel component. You can find examples in the [`ChatMessage` Reference Notebook](ChatMessage.ipynb)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Besides messages of `str` type, the `send` method can also accept `dict`s containing the key `object`, as well as `ChatMessage` objects." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send({\"object\": \"Welcome!\", \"user\": \"Bot\", \"avatar\": \"B\", \"footer_objects\": [pn.widgets.Button(name=\"Footer Object\")]})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`avatar` can also accept emojis, paths/URLs to images, and/or file-like objects." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "pn.chat.ChatFeed(\n", "    pn.chat.ChatMessage(\"I'm an emoji!\", avatar=\"🤖\"),\n", "    pn.chat.ChatMessage(\"I'm an image!\", avatar=\"https://upload.wikimedia.org/wikipedia/commons/6/63/Yumi_UBports.png\"),\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note that if you provide the user/avatar in both the `dict` and the keyword arguments, the keyword arguments take precedence."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "contents = \"\"\"\n", "```python\n", "import panel as pn\n", "\n", "pn.chat.ChatMessage(\"I'm a code block!\", avatar=\"🤖\")\n", "```\n", "\"\"\"\n", "message = chat_feed.send({\"object\": contents, \"user\": \"Bot\"}, user=\"MegaBot\")\n", "message" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ ":::{note} Code Highlighting\n", "To enable code highlighting in code blocks, `pip install pygments`\n", ":::" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Callbacks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A `callback` can be attached for a much more interesting `ChatFeed`.\n", "\n", "The signature must include the latest available message value `contents`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def echo_message(contents):\n", " return f\"Echoing... {contents}\"\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=echo_message)\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Hello!\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In addition to `contents`, the signature can also include the latest available `user` name and the chat `instance`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def echo_message(contents, user, instance):\n", " return f\"Echoing {user!r}... 
{contents}\\n\\n{instance!r}\"\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=echo_message)\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Hello!\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "However, not all three arguments need to be in the signature.\n", "\n", "- If there's only one, it will be `contents`.\n", "- If there are two, they will be `contents` and `user`.\n", "- If there are three, they will be `contents`, `user`, and `instance`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def echo_message(contents, user):\n", "    return f\"Echoing {user!r}... {contents}\"\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=echo_message)\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Update `callback_user` to change the default name and `callback_avatar` to update the default avatar of the responder." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed(callback=echo_message, callback_user=\"Echo Bot\", callback_avatar=\"🛸\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Hey!\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The specified `callback` can also return a `dict` or a `ChatMessage` object. A returned `dict` **must contain** a `value` key, and can optionally include `user` and `avatar` keys, which override the default `callback_user` and `callback_avatar`."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def parrot_message(contents):\n", " return {\"value\": f\"No, {contents.lower()}\", \"user\": \"Parrot\", \"avatar\": \"🦜\"}\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=parrot_message, callback_user=\"Echo Bot\", callback_avatar=\"🛸\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Are you a parrot?\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you do not want the callback to be triggered alongside `send`, set `respond=False`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Don't parrot this.\", respond=False)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can surface exceptions by setting `callback_exception` to `\"summary\"`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def bad_callback(contents):\n", " return 1 / 0\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=bad_callback, callback_exception=\"summary\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.send(\"This will fail...\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To see the entire traceback, you can set it to `\"verbose\"`." 
] }, { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "def bad_callback(contents):\n", "    return 1 / 0\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=bad_callback, callback_exception=\"verbose\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "chat_feed.send(\"This will fail...\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Alternatively, you can provide a callable that accepts the exception and the instance as arguments to handle different exception scenarios." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import random\n", "\n", "\n", "def callback(contents):\n", "    if random.random() < 0.5:\n", "        raise RuntimeError(\"This is an unhandled error\")\n", "    raise ValueError(\"This is a handled error\")\n", "\n", "\n", "def callback_exception_handler(exception, instance):\n", "    if isinstance(exception, ValueError):\n", "        instance.stream(\"I can handle this\", user=\"System\")\n", "        return\n", "    instance.stream(\"Fatal error occurred\", user=\"System\")\n", "\n", "    # you can raise a new exception here if desired\n", "    # raise RuntimeError(\"Fatal error occurred\") from exception\n", "\n", "\n", "chat_feed = pn.chat.ChatFeed(\n", "    callback=callback, callback_exception=callback_exception_handler\n", ")" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.send(\"This will sometimes fail...\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Async Callbacks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `ChatFeed` also supports *async* `callback`s.\n", "\n", "In fact, we recommend using *async* `callback`s whenever possible to keep your app fast and responsive, *as long as there's nothing blocking the event loop in the function*."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import panel as pn\n", "import asyncio\n", "pn.extension()\n", "\n", "async def parrot_message(contents):\n", "    await asyncio.sleep(2.8)\n", "    return {\"value\": f\"No, {contents.lower()}\", \"user\": \"Parrot\", \"avatar\": \"🦜\"}\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=parrot_message, callback_user=\"Echo Bot\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Are you a parrot?\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Do not mark the function as async if there's something blocking your event loop--if you do, the placeholder will **not** appear." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import panel as pn\n", "import time\n", "pn.extension()\n", "\n", "async def parrot_message(contents):\n", "    time.sleep(2.8)\n", "    return {\"value\": f\"No, {contents.lower()}\", \"user\": \"Parrot\", \"avatar\": \"🦜\"}\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=parrot_message, callback_user=\"Echo Bot\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Are you a parrot?\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The easiest and most efficient way to stream output is through *async generators*.\n", "\n", "If you're unfamiliar with this term, don't fret!\n", "\n", "It's simply prefixing your function with `async` and replacing `return` with `yield`."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "async def stream_message(contents):\n", " message = \"\"\n", " for character in contents:\n", " message += character\n", " yield message\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=stream_message)\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Streaming...\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also continuously replace the original message if you do not concatenate the characters." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "async def replace_message(contents):\n", " for character in contents:\n", " await asyncio.sleep(0.1)\n", " yield character\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=replace_message)\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"ABCDEFGHIJKLMNOPQRSTUVWXYZ\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This works extremely well with OpenAI's `create` and `acreate` functions--just be sure that `stream` is set to `True`!\n", "\n", "```python\n", "import openai\n", "import panel as pn\n", "\n", "pn.extension()\n", "\n", "async def openai_callback(contents):\n", " response = await openai.ChatCompletion.acreate(\n", " model=\"gpt-3.5-turbo\",\n", " messages=[{\"role\": \"user\", \"content\": contents}],\n", " stream=True,\n", " )\n", " message = \"\"\n", " async for chunk in response:\n", " message += chunk[\"choices\"][0][\"delta\"].get(\"content\", \"\")\n", " yield message\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=openai_callback)\n", "chat_feed.send(\"Have you heard of HoloViz Panel?\")\n", "```\n", "\n", "![OpenAI ACreate App](https://user-images.githubusercontent.com/42288570/260281672-09da9517-9336-42df-a502-b61530bd89b3.gif)" ] }, { 
"cell_type": "markdown", "metadata": {}, "source": [ "It's also possible to manually trigger the callback with `respond`. This could be useful to achieve a chain of responses from the initial message!" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "async def chain_message(contents, user, instance):\n", " await asyncio.sleep(1.8)\n", " if user == \"User\":\n", " yield {\"user\": \"Bot 1\", \"value\": \"Hi User! I'm Bot 1--here to greet you.\"}\n", " instance.respond()\n", " elif user == \"Bot 1\":\n", " yield {\n", " \"user\": \"Bot 2\",\n", " \"value\": \"Hi User; I see that Bot 1 already greeted you; I'm Bot 2.\",\n", " }\n", " instance.respond()\n", " elif user == \"Bot 2\":\n", " yield {\n", " \"user\": \"Bot 3\",\n", " \"value\": \"I'm Bot 3; the last bot that will respond. See ya!\",\n", " }\n", "\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=chain_message)\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message = chat_feed.send(\"Hello bots!\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Edit Callbacks\n", "\n", "An `edit_callback` can be attached to the `ChatFeed` to handle message edits.\n", "\n", "The signature must include the latest available message value `contents`, the index of the edited message, and the chat `instance`.\n", "\n", "Here, when the user edits the first message, the downstream message is updated to match the edited message." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def echo_callback(contents, user, instance):\n", "    return contents\n", "\n", "\n", "def edit_callback(contents, index, instance):\n", "    instance.objects[index + 1].object = contents\n", "\n", "\n", "chat_feed = pn.chat.ChatFeed(\n", "    edit_callback=edit_callback, callback=echo_callback, callback_user=\"Echo Guy\"\n", ")\n", "chat_feed.send(\"Edit this\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Steps\n", "\n", "Intermediate steps, like chains of thought, can be provided through a series of [`ChatStep`s](ChatStep.ipynb)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed()\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "These can easily be added using the `.add_step` method:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "with chat_feed.add_step(\"To answer the user's query, I need to first create a plan.\", title=\"Create a plan\", user='Agent') as step:\n", "    step.stream(\"\\n\\n...Okay the plan is to demo this!\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default, this method will attempt to append the step to an existing message as long as the last message is a step and the user matches."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "with chat_feed.add_step(title=\"Execute the plan\", status=\"running\") as step:\n", "    step.stream(\"\\n\\n...Executing plan...\")\n", "    time.sleep(1)\n", "    step.stream(\"\\n\\n...Handing over to SQL Agent\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If the user does not match, a new message will be created:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "with chat_feed.add_step(title=\"Running SQL query\", user='SQL Agent') as step:\n", "    step.stream('Querying...')\n", "    time.sleep(1)\n", "    step.stream('\\nSELECT * FROM TABLE')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "See [`ChatStep`](ChatStep.ipynb) for more details on how to use these components." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Prompt User\n", "\n", "It is possible to temporarily pause the execution of code and prompt the user to answer a question or fill out a form using `prompt_user`, which accepts any Panel `component` and a follow-up `callback` (with `component` and `instance` as args) to execute upon submission."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def repeat_answer(component, instance):\n", " contents = component.value\n", " instance.send(f\"Wow, {contents}, that's my favorite flavor too!\", respond=False, user=\"Ice Cream Bot\")\n", "\n", "\n", "def show_interest(contents, user, instance):\n", " if \"ice\" in contents or \"cream\" in contents:\n", " answer_input = pn.widgets.TextInput(\n", " placeholder=\"Enter your favorite ice cream flavor\"\n", " )\n", " instance.prompt_user(answer_input, callback=repeat_answer)\n", " else:\n", " return \"I'm not interested in that topic.\"\n", "\n", "\n", "chat_feed = pn.chat.ChatFeed(\n", " callback=show_interest,\n", " callback_user=\"Ice Cream Bot\",\n", ")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.send(\"food\");" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also set a `predicate` to evaluate the component's state, e.g. widget has value. If provided, the submit button will be enabled when the predicate returns `True`. The `predicate` should accept the component as an argument." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def is_chocolate(component):\n", " return \"chocolate\" in component.value.lower()\n", "\n", "\n", "def repeat_answer(component, instance):\n", " contents = component.value\n", " instance.send(f\"Wow, {contents}, that's my favorite flavor too!\", respond=False, user=\"Ice Cream Bot\")\n", "\n", "\n", "def show_interest(contents, user, instance):\n", " if \"ice\" in contents or \"cream\" in contents:\n", " answer_input = pn.widgets.TextInput(\n", " placeholder=\"Enter your favorite ice cream flavor\"\n", " )\n", " instance.prompt_user(answer_input, callback=repeat_answer, predicate=is_chocolate)\n", " else:\n", " return \"I'm not interested in that topic.\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also set a `timeout` in seconds and `timeout_message` to prevent submission after a certain time." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def is_chocolate(component):\n", " return \"chocolate\" in component.value.lower()\n", "\n", "\n", "def repeat_answer(component, instance):\n", " contents = component.value\n", " instance.send(f\"Wow, {contents}, that's my favorite flavor too!\", respond=False, user=\"Ice Cream Bot\")\n", "\n", "\n", "def show_interest(contents, user, instance):\n", " if \"ice\" in contents or \"cream\" in contents:\n", " answer_input = pn.widgets.TextInput(\n", " placeholder=\"Enter your favorite ice cream flavor\"\n", " )\n", " instance.prompt_user(answer_input, callback=repeat_answer, predicate=is_chocolate, timeout=10, timeout_message=\"You're too slow!\")\n", " else:\n", " return \"I'm not interested in that topic.\"" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Lastly, use `button_params` and `timeout_button_params` to customize the appearance of the buttons and timeout button, respectively." 
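, "\n", "\n", "For instance, a minimal sketch building on the example above; the `name` and `button_type` keys shown here are assumptions (standard `Button` parameters), not a documented contract:\n", "\n", "```python\n", "import panel as pn\n", "\n", "pn.extension()\n", "\n", "\n", "def repeat_answer(component, instance):\n", "    instance.send(f\"Wow, {component.value}, that's my favorite flavor too!\", respond=False, user=\"Ice Cream Bot\")\n", "\n", "\n", "def show_interest(contents, user, instance):\n", "    if \"ice\" in contents or \"cream\" in contents:\n", "        answer_input = pn.widgets.TextInput(placeholder=\"Enter your favorite ice cream flavor\")\n", "        instance.prompt_user(\n", "            answer_input,\n", "            callback=repeat_answer,\n", "            timeout=10,\n", "            timeout_message=\"You're too slow!\",\n", "            # assumed to be forwarded to the submit and timeout buttons\n", "            button_params={\"name\": \"Submit flavor\", \"button_type\": \"primary\"},\n", "            timeout_button_params={\"name\": \"Too late!\", \"button_type\": \"danger\"},\n", "        )\n", "    else:\n", "        return \"I'm not interested in that topic.\"\n", "```"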
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Serialize" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The chat history can be serialized for use with the `transformers` or `openai` packages through `serialize` with `format=\"transformers\"`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.serialize(format=\"transformers\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`role_names` can be set to explicitly map the roles to the ChatMessage user names." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.serialize(\n", "    format=\"transformers\", role_names={\"assistant\": [\"Bot 1\", \"Bot 2\", \"Bot 3\"]}\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A `default_role` can also be set to use if the user name is not found in `role_names`.\n", "\n", "If this is set to None, a `ValueError` will be raised if the user name is not found." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.serialize(\n", "    format=\"transformers\",\n", "    default_role=\"assistant\"\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The messages can be filtered by using a custom `filter_by` function." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def filter_by_reactions(messages):\n", "    return [message for message in messages if \"favorite\" in message.reactions]\n", "\n", "\n", "chat_feed.send(\n", "    pn.chat.ChatMessage(\"I'm a message with a reaction!\", reactions=[\"favorite\"])\n", ")\n", "\n", "chat_feed.serialize(filter_by=filter_by_reactions)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`help_text` is an easy way to provide instructions to users about what the feed does."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def say_hi(contents, user):\n", "    return f\"Hi {user}!\"\n", "\n", "chat_feed = pn.chat.ChatFeed(help_text=\"This chat feed will respond by saying hi!\", callback=say_hi)\n", "chat_feed.send(\"Hello there!\")\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default, the `serialize` method will exclude the user `help` from its output. This can be changed by updating `exclude_users`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.serialize()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If the output is complex, you can pass a `custom_serializer` to keep only the text part." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "complex_output = pn.Tabs((\"Code\", \"`print('Hello World')`\"), (\"Output\", \"Hello World\"))\n", "chat_feed = pn.chat.ChatFeed(pn.chat.ChatMessage(complex_output))\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here's the output without a `custom_serializer`:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed.serialize()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here's the output with a `custom_serializer`:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def custom_serializer(obj):\n", "    if isinstance(obj, pn.Tabs):\n", "        # only keep the first tab's content\n", "        return obj[0].object\n", "    # fall back to the default serialization\n", "    return obj.serialize()\n", "\n", "chat_feed.serialize(custom_serializer=custom_serializer)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "It can be fun to watch bots talking to each other. 
Beware of the token usage!\n", "\n", "```python\n", "import openai\n", "import panel as pn\n", "\n", "pn.extension()\n", "\n", "\n", "async def openai_self_chat(contents, user, instance):\n", " if user == \"User\" or user == \"ChatBot B\":\n", " user = \"ChatBot A\"\n", " avatar = \"https://upload.wikimedia.org/wikipedia/commons/6/63/Yumi_UBports.png\"\n", " elif user == \"ChatBot A\":\n", " user = \"ChatBot B\"\n", " avatar = \"https://upload.wikimedia.org/wikipedia/commons/thumb/3/36/Outreachy-bot-avatar.svg/193px-Outreachy-bot-avatar.svg.png\"\n", "\n", " response = await openai.ChatCompletion.acreate(\n", " model=\"gpt-3.5-turbo\",\n", " messages=[{\"role\": \"user\", \"content\": contents}],\n", " temperature=0,\n", " max_tokens=500,\n", " stream=True,\n", " )\n", " message = \"\"\n", " async for chunk in response:\n", " message += chunk[\"choices\"][0][\"delta\"].get(\"content\", \"\")\n", " yield {\"user\": user, \"value\": message, \"avatar\": avatar}\n", " instance.respond()\n", "\n", "\n", "chat_feed = pn.chat.ChatFeed(callback=openai_self_chat, sizing_mode=\"stretch_width\", height=1000).servable()\n", "chat_feed.send(\"What is HoloViz Panel?\")\n", "```\n", "\n", "![OpenAI Bot Conversation](https://user-images.githubusercontent.com/42288570/260283078-de8f56c2-becc-4566-9813-618bfc81f3c2.gif)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Stream" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If a returned object is not a generator (notably LangChain output), it's still possible to stream the output with the `stream` method.\n", "\n", "Note, if you're working with `generator`s, use `yield` in your callback instead." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed()\n", "\n", "# creates a new message\n", "message = chat_feed.stream(\"Hello\", user=\"Aspiring User\", avatar=\"🤓\")\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# streams (appends) to the previous message\n", "message = chat_feed.stream(\n", " \" World!\",\n", " user=\"Aspiring User\",\n", " avatar=\"🤓\",\n", " message=message,\n", " footer_objects=[pn.widgets.Button(name=\"Footer Object\")],\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Be sure to check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/) docs for more examples related to [LangChain](https://python.langchain.com/docs/get_started/introduction), [OpenAI](https://openai.com/blog/chatgpt), [Mistral](https://docs.mistral.ai/), [Llama](https://ai.meta.com/llama/), etc." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `stream` method is commonly used within for loops; here we use `time.sleep`, but if you're working with `async` code, prefer `asyncio.sleep`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed()\n", "chat_feed" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import time\n", "\n", "message = None\n", "for n in \"123456789 abcdefghijklmnopqrstuvwxyz\":\n", " time.sleep(0.1)\n", " message = chat_feed.stream(n, message=message)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### Customization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can pass `ChatMessage` params through `message_params`."
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "message_params = dict(\n", " default_avatars={\"System\": \"S\", \"User\": \"👤\"}, reaction_icons={\"like\": \"thumb-up\"}\n", ")\n", "chat_feed = pn.chat.ChatFeed(message_params=message_params)\n", "chat_feed.send(user=\"System\", value=\"This is the System speaking.\")\n", "chat_feed.send(user=\"User\", value=\"This is the User speaking.\")\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Alternatively, pass those params directly to the ChatFeed constructor and they'll be forwarded into `message_params` automatically." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed(default_avatars={\"System\": \"S\", \"User\": \"👤\"}, reaction_icons={\"like\": \"thumb-up\"})\n", "chat_feed.send(user=\"System\", value=\"This is the System speaking.\")\n", "chat_feed.send(user=\"User\", value=\"This is the User speaking.\")\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "It's possible to customize the appearance of the chat feed by setting the `message_params` parameter.\n", "\n", "Please visit [`ChatMessage`](ChatMessage.ipynb) for a full list of targetable CSS classes (e.g. `.avatar`, `.name`, etc.)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chat_feed = pn.chat.ChatFeed(\n", " show_activity_dot=True,\n", " message_params={\n", " \"stylesheets\": [\n", " \"\"\"\n", " .message {\n", " background-color: tan;\n", " font-family: \"Courier New\";\n", " font-size: 24px;\n", " }\n", " \"\"\"\n", " ]\n", " },\n", ")\n", "chat_feed.send(\"I am so stylish!\")\n", "chat_feed" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also build your own custom chat interface on top of `ChatFeed`, but remember there's a pre-built [`ChatInterface`](ChatInterface.ipynb)!"
] }, { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "import asyncio\n", "import panel as pn\n", "from panel.chat import ChatMessage, ChatFeed\n", "\n", "pn.extension()\n", "\n", "\n", "async def get_response(contents, user):\n", " await asyncio.sleep(0.88)\n", " return {\n", " \"Marc\": \"It is 2\",\n", " \"Andrew\": \"It is 4\",\n", " }.get(user, \"I don't know\")\n", "\n", "\n", "ASSISTANT_AVATAR = (\n", " \"https://upload.wikimedia.org/wikipedia/commons/6/63/Yumi_UBports.png\"\n", ")\n", "\n", "chat_feed = ChatFeed(\n", " ChatMessage(\"Hi There!\", user=\"Assistant\", avatar=ASSISTANT_AVATAR),\n", " callback=get_response,\n", " height=500,\n", " message_params=dict(\n", " default_avatars={\"Assistant\": ASSISTANT_AVATAR},\n", " ),\n", ")\n", "\n", "marc_button = pn.widgets.Button(\n", " name=\"Marc\",\n", " on_click=lambda event: chat_feed.send(\n", " \"What is the square root of 4?\", user=\"Marc\", avatar=\"🚴\"\n", " ),\n", " align=\"center\",\n", " disabled=chat_feed.param.disabled,\n", ")\n", "andrew_button = pn.widgets.Button(\n", " name=\"Andrew\",\n", " on_click=lambda event: chat_feed.send(\n", " \"What is the square root of 4 squared?\", user=\"Andrew\", avatar=\"🏊\"\n", " ),\n", " align=\"center\",\n", " disabled=chat_feed.param.disabled,\n", ")\n", "undo_button = pn.widgets.Button(\n", " name=\"Undo\",\n", " on_click=lambda event: chat_feed.undo(2),\n", " align=\"center\",\n", " disabled=chat_feed.param.disabled,\n", ")\n", "clear_button = pn.widgets.Button(\n", " name=\"Clear\",\n", " on_click=lambda event: chat_feed.clear(),\n", " align=\"center\",\n", " disabled=chat_feed.param.disabled,\n", ")\n", "\n", "\n", "pn.Column(\n", " chat_feed,\n", " pn.layout.Divider(),\n", " pn.Row(\n", " \"Click a button\",\n", " andrew_button,\n", " marc_button,\n", " undo_button,\n", " clear_button,\n", " ),\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For an example on `renderers`, see 
[ChatInterface](ChatInterface.ipynb).\n", "\n", "Also, if you haven't already, check out the [panel-chat-examples](https://holoviz-topics.github.io/panel-chat-examples/) docs for more examples related to [LangChain](https://python.langchain.com/docs/get_started/introduction), [OpenAI](https://openai.com/blog/chatgpt), [Mistral](https://docs.mistral.ai/), [Llama](https://ai.meta.com/llama/), etc." ] } ], "metadata": { "language_info": { "name": "python", "pygments_lexer": "ipython3" } }, "nbformat": 4, "nbformat_minor": 4 }