{ "cells": [ { "cell_type": "markdown", "id": "92b777f7", "metadata": {}, "source": [ "# Lessons (Not) Learned: Chicago Health Inequities during the 1918 Flu and COVID-19.\n", "\n", "## Historic 1918 Influenza and 2020 COVID-19 Pandemic in Chicago: Health Diaparity during pandemic for slaves and other black living in Chicago" ] }, { "cell_type": "markdown", "id": "86d47cc8", "metadata": {}, "source": [ "### Objectives\n", "The purpose of this project is to illustrate the historical racial trauma in African Americans via tracing the linkage between 1918 flu pandemic and COVID-19 in Illinois. The project consists of three part of works: 1) digitizing and geoprocessing for historical analog data into digital resources, 2) identifying the neighborhood for each race, and 3) comparison between neighborhood and its mortality rate. As a part of result, several maps will be produced to achieve. This notebook would ptovide the intuition which racial group and factors were associated with the excess death rate in Chicago during pandemic." ] }, { "cell_type": "markdown", "id": "0680f2a5", "metadata": {}, "source": [ "## 1. 1918 Influenza and pneumonia and COVID-19 data\n", " All historic maps that given geospatial/geolocation information were digitized and analyzed using ArcGIS software. To assess whether descendants of slaves and other Blacks living in Chicago were vulnerable to excess death during the 1918 Influenza Pandemic, data from the book, A Report of an Epidemic of Influenza in Chicago Occurring during the Fall of 1918, was digitized and analyzed. In the book are seven consecutive weekly maps of Chicago from the week ending on October 5th to the week ending on November 16th that illustrate the locations of influenza deaths and pneumonia deaths. Each point refers only to the location of death and not any description of the person who died." ] }, { "cell_type": "markdown", "id": "7a5effe5", "metadata": {}, "source": [ "### Methods\n", "\n", "#### 1. 
Preparation: Geo-referencing and Digitizing\n", "\n", "##### (1) 1918 Historic Influenza map\n", "\n", "|DATE|10/5/1918|10/12/1918|10/19/1918|10/26/1918|11/2/1918|11/9/1918|11/16/1918|\n", "|---|---|---|---|---|---|---|---|\n", "|ORIGINAL MAP||||||||\n" ] }, { "cell_type": "markdown", "id": "064912dd", "metadata": {}, "source": [ "These original historic maps were used to perform the geo-referencing [Link](https://github.com/beckvalle/georef_demo/blob/main/Georeferencing_Instructions.ipynb) and digitizing processes.\n", "\n", "#### Digitizing\n", "|||\n", "|:---:|:---:|\n", "|Digitizing protocol from Original Map|Connecting folder|\n", "\n", "|||\n", "|:---:|:---:|\n", "|Loading files (1) - All image files + street map|Loading files (2) + Empty point shapefile|\n", "\n", "|||\n", "|:---:|:---:|\n", "|Editing (1)|Editing (2)|" ] }, { "cell_type": "markdown", "id": "e93ff7c0", "metadata": {}, "source": [ "These digitized point maps for each individual death were aggregated into census tract maps for better visualization.\n", "\n", "|DATE|
10/5/1918|10/12/1918|10/19/1918|10/26/1918|11/2/1918|11/9/1918|11/16/1918|\n", "|---|---|---|---|---|---|---|---|\n", "|INFLUENZA DEATH MAP||||||||\n", "|PNEUMONIA DEATH MAP||||||||" ] }, { "cell_type": "markdown", "id": "4d49d81f", "metadata": {}, "source": [ " For the analysis, all seven influenza and pneumonia death maps were merged into a single map to examine total deaths over the seven-week period.\n", " \n", " " ] }, { "cell_type": "markdown", "id": "a670d9e2", "metadata": {}, "source": [ "##### 2. Identifying neighborhoods\n", " Identifying spatial boundaries is important for assessing race and neighborhood. \n", " \n", "###### 1) Community settlement map\n", " First, we used a community settlement map from the supplement of the book Historic City – The Settlement of Chicago, published by the Department of Development and Planning of the City of Chicago. This map not only shows the settlement patterns of diverse racial groups in Chicago but also represents the cultural and cognitive boundaries of each group. Moreover, the identified industrial areas indicate environmental vulnerability based on proximity. \n", " \n", "||||\n", "|---|---|---|\n", "|
Original scanned map|Digitized map with influenza deaths|Digitized map with pneumonia deaths
|\n" ] }, { "cell_type": "markdown", "id": "b2be4949", "metadata": {}, "source": [ "###### 2) Census tract\n", " To be released after publishing.\n", "\n" ] }, { "cell_type": "markdown", "id": "a65dfff2", "metadata": {}, "source": [ "###### 3) Census ward\n", " To be released after publishing.\n", "\n", "\n" ] }, { "cell_type": "markdown", "id": "5569a6d2", "metadata": {}, "source": [ "#### Interactive map for 1918 Influenza pandemic (Incomplete. Functioning version to be released upon publishing)" ] }, { "cell_type": "code", "execution_count": 10, "id": "79c197c7", "metadata": {}, "outputs": [], "source": [ "\"\"\"# import libraries\n", "import numpy as np\n", "from datetime import date, datetime\n", "import pandas as pd\n", "import geopandas as gpd\n", "\n", "from bokeh.plotting import save, figure\n", "from bokeh.io import show, output_notebook, push_notebook\n", "from bokeh.models import GeoJSONDataSource, ColumnDataSource, CustomJS\n", "from bokeh.models import LinearColorMapper, ColorBar\n", "from bokeh.models import DataTable, DateFormatter, TableColumn\n", "from bokeh.models import HoverTool, TapTool, Div\n", "from bokeh.models import DateRangeSlider, Dropdown\n", "from bokeh.palettes import brewer\n", "from bokeh.events import Tap\n", "from bokeh.tile_providers import Vendors, get_provider\n", "from bokeh.layouts import gridplot, layout, column, row\n", "\n", "import networkx as nx\n", "import osmnx as ox\n", "import matplotlib.pyplot as plt\n", "import multiprocessing as mp\n", "import folium, itertools, os, time, warnings\n", "\n", "from shapely.geometry import Point, LineString, Polygon\n", "from tqdm import tqdm\n", "from IPython.display import display, clear_output\n", "\n", "warnings.filterwarnings(\"ignore\")\"\"\"" ] }, { "cell_type": "code", "execution_count": 13, "id": "b47f840b", "metadata": {}, "outputs": [], "source": [ "\"\"\"FLU_1005 = gpd.read_file(\"./Influenza_CHI/P_181005_B.shp\")\n", "FLU_1012 = 
gpd.read_file(\"./Influenza_CHI/P_181012_B.shp\")\n", "FLU_1019 = gpd.read_file(\"./Influenza_CHI/P_181019_B.shp\")\n", "FLU_1026 = gpd.read_file(\"./Influenza_CHI/P_181026_B.shp\")\n", "FLU_1102 = gpd.read_file(\"./Influenza_CHI/P_181102_B.shp\")\n", "FLU_1109 = gpd.read_file(\"./Influenza_CHI/P_181109_B.shp\")\n", "FLU_1116 = gpd.read_file(\"./Influenza_CHI/P_181116_B.shp\")\n", "\n", "PNA_1005 = gpd.read_file(\"./Influenza_CHI/P_181005_W.shp\")\n", "PNA_1012 = gpd.read_file(\"./Influenza_CHI/P_181012_W.shp\")\n", "PNA_1019 = gpd.read_file(\"./Influenza_CHI/P_181019_W.shp\")\n", "PNA_1026 = gpd.read_file(\"./Influenza_CHI/P_181026_W.shp\")\n", "PNA_1102 = gpd.read_file(\"./Influenza_CHI/P_181102_W.shp\")\n", "PNA_1109 = gpd.read_file(\"./Influenza_CHI/P_181109_W.shp\")\n", "PNA_1116 = gpd.read_file(\"./Influenza_CHI/P_181116_W.shp\")\n", "\n", "FLU_1005_C=FLU_1005.to_crs(epsg=4326)\n", "FLU_1012_C=FLU_1012.to_crs(epsg=4326)\n", "FLU_1019_C=FLU_1019.to_crs(epsg=4326)\n", "FLU_1026_C=FLU_1026.to_crs(epsg=4326)\n", "FLU_1102_C=FLU_1102.to_crs(epsg=4326)\n", "FLU_1109_C=FLU_1109.to_crs(epsg=4326)\n", "FLU_1116_C=FLU_1116.to_crs(epsg=4326)\n", "\n", "PNA_1005_C=PNA_1005.to_crs(epsg=4326)\n", "PNA_1012_C=PNA_1012.to_crs(epsg=4326)\n", "PNA_1019_C=PNA_1019.to_crs(epsg=4326)\n", "PNA_1026_C=PNA_1026.to_crs(epsg=4326)\n", "PNA_1102_C=PNA_1102.to_crs(epsg=4326)\n", "PNA_1109_C=PNA_1109.to_crs(epsg=4326)\n", "PNA_1116_C=PNA_1116.to_crs(epsg=4326)\n", "\n", "\n", "FLU_1005_C['date']='1918/10/05'\n", "FLU_1012_C['date']='1918/10/12'\n", "FLU_1019_C['date']='1918/10/19'\n", "FLU_1026_C['date']='1918/10/26'\n", "FLU_1102_C['date']='1918/11/02'\n", "FLU_1109_C['date']='1918/11/09'\n", "FLU_1116_C['date']='1918/11/16'\n", "\n", "PNA_1005_C['date']='1918/10/05'\n", "PNA_1012_C['date']='1918/10/12'\n", "PNA_1019_C['date']='1918/10/19'\n", "PNA_1026_C['date']='1918/10/26'\n", "PNA_1102_C['date']='1918/11/02'\n", "PNA_1109_C['date']='1918/11/09'\n", 
"PNA_1116_C['date']='1918/11/16'\n", "\n", "FLU_1005_C['type']='Flu'\n", "FLU_1012_C['type']='Flu'\n", "FLU_1019_C['type']='Flu'\n", "FLU_1026_C['type']='Flu'\n", "FLU_1102_C['type']='Flu'\n", "FLU_1109_C['type']='Flu'\n", "FLU_1116_C['type']='Flu'\n", "\n", "PNA_1005_C['type']='Pna'\n", "PNA_1012_C['type']='Pna'\n", "PNA_1019_C['type']='Pna'\n", "PNA_1026_C['type']='Pna'\n", "PNA_1102_C['type']='Pna'\n", "PNA_1109_C['type']='Pna'\n", "PNA_1116_C['type']='Pna'\n", "\n", "FLU_1005_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1005_C['date']]\n", "FLU_1012_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1012_C['date']]\n", "FLU_1019_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1019_C['date']]\n", "FLU_1026_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1026_C['date']]\n", "FLU_1102_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1102_C['date']]\n", "FLU_1109_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1109_C['date']]\n", "FLU_1116_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in FLU_1116_C['date']]\n", "\n", "PNA_1005_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1005_C['date']]\n", "PNA_1012_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1012_C['date']]\n", "PNA_1019_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1019_C['date']]\n", "PNA_1026_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1026_C['date']]\n", "PNA_1102_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1102_C['date']]\n", "PNA_1109_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, 
'%Y/%m/%d')) for date in PNA_1109_C['date']]\n", "PNA_1116_C['timestamp'] = [datetime.timestamp(datetime.strptime(date, '%Y/%m/%d')) for date in PNA_1116_C['date']]\n", "\n", "FLU_1005_C ['rat_color'] = 'red'\n", "FLU_1012_C ['rat_color'] = 'red'\n", "FLU_1019_C ['rat_color'] = 'red'\n", "FLU_1026_C ['rat_color'] = 'red'\n", "FLU_1102_C ['rat_color'] = 'red'\n", "FLU_1109_C ['rat_color'] = 'red'\n", "FLU_1116_C ['rat_color'] = 'red'\n", " \n", "PNA_1005_C ['rat_color'] = 'blue'\n", "PNA_1012_C ['rat_color'] = 'blue'\n", "PNA_1019_C ['rat_color'] = 'blue'\n", "PNA_1026_C ['rat_color'] = 'blue'\n", "PNA_1102_C ['rat_color'] = 'blue'\n", "PNA_1109_C ['rat_color'] = 'blue'\n", "PNA_1116_C ['rat_color'] = 'blue'\n", "\n", "FLU_Point = FLU_1005_C\n", "FLU_Point = FLU_Point.append(FLU_1012_C)\n", "FLU_Point = FLU_Point.append(FLU_1019_C)\n", "FLU_Point = FLU_Point.append(FLU_1026_C)\n", "FLU_Point = FLU_Point.append(FLU_1102_C)\n", "FLU_Point = FLU_Point.append(FLU_1109_C)\n", "FLU_Point = FLU_Point.append(FLU_1116_C)\n", "\n", "PNA_Point = PNA_1005_C\n", "PNA_Point = PNA_Point.append(PNA_1012_C)\n", "PNA_Point = PNA_Point.append(PNA_1019_C)\n", "PNA_Point = PNA_Point.append(PNA_1026_C)\n", "PNA_Point = PNA_Point.append(PNA_1102_C)\n", "PNA_Point = PNA_Point.append(PNA_1109_C)\n", "PNA_Point = PNA_Point.append(PNA_1116_C)\n", "\n", "Final_Point = FLU_Point.append(PNA_Point)\n", "\n", "# record data sources for bokeh\n", "tl_data= [['10/05/1918',\"Week 1\"],\n", " ['10/26/1918',\"Week 4\"],\n", " ['11/16/1918',\"Week 7\"]]\n", "tl_df= pd.DataFrame(tl_data, columns = ['tl_date', 'tl_note']) \n", "tl_df['tl_ts'] = [datetime.timestamp(datetime.strptime(date, '%m/%d/%Y'))*1000 \n", " for date in tl_df['tl_date']]\"\"\"" ] }, { "cell_type": "code", "execution_count": 14, "id": "20617a7e", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "GlyphRenderer(id='1859', ...)\n" ] } ], "source": [ "\"\"\"Final_source = 
GeoJSONDataSource(geojson=Final_Point.to_json())\n", "Final_disp_source = GeoJSONDataSource(geojson=Final_Point.to_json())\n", "timeline_source = ColumnDataSource(tl_df)\n", "\n", "tile_provider2 = get_provider(Vendors.CARTODBPOSITRON)\n", "\n", "s4 = figure(title=\"Influenza Map\", plot_width=500, x_range=(-9800000, -9740000), y_range=(5100000, 5170000),\n", " tooltips=[(\"\", \"@type\")], tools=\"tap\")\n", "s4.toolbar.logo = None\n", "s4.toolbar_location = None\n", "s4.add_tile(tile_provider2)\n", "\n", "# plot points for death locations\n", "Final_pts = s4.circle(x='x', y='y', source=Final_disp_source, fill_color='rat_color', legend_label='Deaths', size=5)\n", "print(Final_pts)\n", "\n", "# add a dropdown menu to change the point color \n", "menu = [(\"Default\", \"default\"), (\"Type\", \"type\")]\n", "callback_p = CustomJS(args = dict(source = Final_source, \n", " fill_source = Final_disp_source, \n", " s4 = s4, Final_pts = Final_pts), \n\"\"\"", " code = \"\"\"\n", "var data = source.data;\n", "var fill_data = fill_source.data;\n", "var choice = this.item;\n", "\n", "if (choice == \"type\") {\n", " fill_data['rat_color'] = [];\n", " \n", " for (var i = 0; i < data.x.length; i++) { \n", " \n", " if (fill_data['type'][i] == \"Flu\") {\n", " fill_data['rat_color'].push('red');\n", " } else {\n", " fill_data['rat_color'].push('blue');\n", " }\n", " }\n", " Final_pts.glyph.fill_color.field = 'rat_color';\n", "} else {\n", " Final_pts.glyph.fill_color = 'blue';\n", "}\n", "\n", "fill_source.change.emit();\n", " \"\"\")\n", "\n", "\"\"\"dropdown = Dropdown(label=\"Change Point Color\", button_type=\"warning\", menu=menu)\n", "dropdown.js_on_event(\"menu_item_click\", callback_p)\n", "\n", "# add a date range slider\n", "\n", "callback_t = CustomJS(args = dict(source = Final_source, fill_source = Final_disp_source), \n\"\"\"", " code = \"\"\"\n", "var data = 
source.data;\n", "var fill_data = fill_source.data;\n", "var s_val = cb_obj.value;\n", "fill_data['x']=[];\n", "fill_data['y']=[];\n", "for (var i = 0; i < data.x.length; i++) { \n", " if ((data['timestamp'][i] >= (s_val[0]/1000)) \n", " && (data['timestamp'][i] <= (s_val[1]/1000))) {\n", " fill_data['y'].push(source.data['y'][i]);\n", " fill_data['x'].push(source.data['x'][i]);\n", " } else {\n", " fill_data['y'].push(NaN);\n", " fill_data['x'].push(NaN);\n", " }\n", " }\n", "\n", "fill_source.change.emit();\n", "\"\"\")\n", "\n", "\"\"\"date_range_slider = DateRangeSlider(value=(date(1918, 10, 1), date(1918, 11, 15)),\n", " start=date(1918, 6, 1), end=date(1918, 12, 31))\n", "date_range_slider.js_on_change(\"value\", callback_t)\n", "div = Div(text=\"\"\"\"\"\", width=200, height=100)\n", "\n", "# add tap tool to display text when point is selected\n", "taptool = s4.select(type=TapTool)\n", "taptool.callback = CustomJS(args = dict(source = Final_disp_source, div = div), \n\"\"\"", " code = \"\"\"\n", " var data = source.data;\n", " const inds = source.selected.indices;\n", " var base_str = \"\";\n", " for (const indx in inds) {\n", " base_str = base_str.concat('id: ', inds[indx].toString(), '
<br>');\n", " base_str = base_str.concat('date: ', data['date'][inds[indx]], '
<br>');\n", " base_str = base_str.concat('type: ', data['type'][inds[indx]], '
<br>');\n", " }\n", " div.text = base_str;\n", " \n", "\"\"\")\n", "\n", "\n", "\"\"\"tl_hover = HoverTool(\n", " tooltips=\"\"\"\n", " <div>\n", " <div>@tl_date</div>\n", " <div>@tl_note</div>\n", " </div>\n", " \"\"\")\n", "\n", "\"\"\"s5 = figure(title=\"Timeline\", x_axis_type='datetime', tools=[tl_hover],\n", " plot_height=100,\n", " # multiply x range by 1000 to convert seconds to milliseconds\n", " x_range=(datetime.timestamp(datetime.strptime('09/01/1918', '%m/%d/%Y'))*1000, \n", " datetime.timestamp(datetime.strptime('12/31/1918', '%m/%d/%Y'))*1000),\n", " y_range=(0.5, 1.5))\n", "s5.yaxis.major_label_text_font_size = '0pt'\n", "s5.toolbar.logo = None\n", "s5.toolbar_location = None\n", "tl_pts = s5.triangle(x='tl_ts', y=1, source=timeline_source, fill_color='red', size=20)\n", "\n", "date_range_slider.js_link('value', s5.x_range, 'start')\n", "date_range_slider.js_link('value', s5.x_range, 'end')\n", "\n", "t = show((column(row(s4, div), dropdown, date_range_slider, s5)), notebook_handle=True)\"\"\"" ] }, { "cell_type": "markdown", "id": "2b7e5c52", "metadata": {}, "source": [ "###### 4) Race and neighborhood\n", " To be released after publishing.\n", "\n", "\n", "##### Final maps for influenza and pneumonia deaths\n", " To be released after publishing.\n", "\n" ] }, { "cell_type": "markdown", "id": "b901cb32", "metadata": {}, "source": [ "## 2. COVID-19\n", " To be released after publishing.\n", "\n", "\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "id": "64b3367f", "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.8" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "state": {}, "version_major": 2, "version_minor": 0 } } }, "nbformat": 4, "nbformat_minor": 5 }