{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "Handle Evolving Workflows\n", "=========================\n", "\n", "For some workflows we don't know the extent of the computation at the outset. We need to do some computation in order to figure out the rest of the computation that we need to do. The computation grows and evolves as we do more work.\n", "\n", "As an example, consider a situation where you need to read many files and then based on the contents of those files, fire off additional work. You would like to read the files in parallel, and then within each file expose more parallelism.\n", "\n", "This example goes through three ways to handle this situation using [Dask Futures](https://docs.dask.org/en/latest/futures.html)\n", "\n", "1. Using `as_completed`\n", "2. Using `async/await`\n", "3. Launching tasks from tasks\n", "\n", "But first, lets run our code sequentially." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "0: Sequential code\n", "------------------" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "filenames = [\"file.{}.txt\".format(i) for i in range(10)]\n", "\n", "filenames[:3]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import random, time\n", "\n", "\n", "def parse_file(fn: str) -> list:\n", " \"\"\" Returns a list work items of unknown length \"\"\"\n", " time.sleep(random.random())\n", " return [random.random() for _ in range(random.randint(1, 10))]\n", "\n", "def process_item(x: float):\n", " \"\"\" Process each work item \"\"\"\n", " time.sleep(random.random() / 4)\n", " return x + 1" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%%time\n", "\n", "# This takes around 10-20s\n", "\n", "results = []\n", "\n", "for fn in filenames:\n", " L = parse_file(fn)\n", " for x in L:\n", " out = process_item(x)\n", " results.append(out)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Start Dask Client\n", "-----------------\n", "\n", "We'll need a Dask client in order to manage dynamic workloads" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from dask.distributed import Client\n", "\n", "client = Client(processes=False, n_workers=1, threads_per_worker=6)\n", "client" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "1: Use as_completed\n", "-------------------\n", "\n", "The [as_completed](https://docs.dask.org/en/latest/futures.html#distributed.as_completed) iterator lets us handle futures as they complete. We can then submit more data on the fly.\n", "\n", "- We submit a task for each of our filenames\n", "- We also compute the length of each of the returned lists\n", "- As those lengths return, we submit off a new task to get each item of that list. We do this at higher priority, so that we process existing data before we collect new data.\n", "- We wait on all of the returned results" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%%time\n", "\n", "from dask.distributed import as_completed\n", "import operator\n", "\n", "lists = client.map(parse_file, filenames, pure=False)\n", "lengths = client.map(len, lists)\n", "\n", "mapping = dict(zip(lengths, lists))\n", "\n", "futures = []\n", "\n", "for future in as_completed(lengths):\n", " n = future.result()\n", " L = mapping[future]\n", " for i in range(n):\n", " new = client.submit(operator.getitem, L, i, priority=1)\n", " new = client.submit(process_item, new, priority=1)\n", " futures.append(new)\n", " \n", "client.gather(futures)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "2: Use async/await to handle single file processing locally\n", "-----------------------------------------------------------\n", "\n", "We can also handle the concurrency here within our local process. This requires you to understand async/await syntax, but is generally powerful and arguably simpler than the `as_completed` approach above." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import asyncio\n", "\n", "async def f(fn):\n", " \"\"\" Handle the lifecycle of a single file \"\"\"\n", " future = client.submit(parse_file, fn, pure=False)\n", " length_future = client.submit(len, future)\n", " length = await length_future\n", " \n", " futures = [client.submit(operator.getitem, future, i, priority=10) \n", " for i in range(length)]\n", " futures = client.map(process_item, futures, priority=10)\n", " return futures\n", "\n", "async def run_all(filenames):\n", " list_of_list_of_futures = await asyncio.gather(*[f(fn) for fn in filenames])\n", " futures = sum(list_of_list_of_futures, [])\n", " return await client.gather(futures)\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We now need to run this function in the same event loop as our client is running. If we had started our client asynchronously, then we could have done this:\n", "\n", "```python\n", "client = await Client(asynchronous=True)\n", "\n", "await run_all(filenames)\n", "```\n", "\n", "However, because we started our client without the `asynchronous=True` flag the event loop is actually running in a separate thread, so we'll have to ask the client to run this for us." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "client.sync(run_all, filenames)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "3: Submit tasks from tasks\n", "--------------------------\n", "\n", "We can also submit tasks that themselves submit more tasks. See [documentation here](https://docs.dask.org/en/latest/futures.html#submit-tasks-from-tasks)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "%%time\n", "\n", "from dask.distributed import get_client, secede, rejoin\n", "\n", "def f(fn):\n", " L = parse_file(fn)\n", " client = get_client()\n", " \n", " futures = client.map(process_item, L, priority=10)\n", " secede()\n", " results = client.gather(futures)\n", " rejoin()\n", " return results\n", "\n", "futures = client.map(f, filenames, pure=False)\n", "results = client.gather(futures)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.12" } }, "nbformat": 4, "nbformat_minor": 4 }