{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# A football addiction confession and English club fan catchments" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Thursday 13th June\n", "\n", "Update 14:46: Attempted a fix for Google Chrome browser but it does not display properly in Kyso. Open in Firefox or Safari to view map.\n", "\n", "### A football addiction confession\n", "\n", "Not even two weeks since the Champions League final, today's release of the Premier League 2019/20 fixtures marks the countdown to the new English domestic football season. For fans of newly promoted clubs it's a particularly exciting day, and the more studious will begin to chart potential pathways to league survival. All fans will eagerly look out for their team's opening day, Boxing Day and final day fixtures, as well as those against rivals. \n", "\n", "I can remember in years gone by it being my small dose of the potent drug of club football, which, in Europe at least, is otherwise unavailable at this time of year. Lower strength substitutes such as tennis, cricket, the Olympics and even international football are available, but none of these can compete with club football.\n", "\n", "Having been brought up in the rural Wiltshire football wilderness, and with a father who did not count the beautiful game among his interests, I ended up devoting my loyalty as a kid to Manchester United - the most successful club at that time. And it would no exaggeration to say I got myself addicted to football. In what were sometimes difficult teen years, following club football provided a hobby and escape for me. I would love to have been good at playing it too, but my slightly poor co-ordination, self consciousness and lack of self confidence got in the way.\n", "\n", "This addiction manifested itself in many different ways. 
Some examples include a few trips to Old Trafford (I even caught this quite [remarkable Rooney goal](https://www.youtube.com/watch?v=pPHqRpJ7tLY)), obsessively checking the BBC Sport transfer gossip column, regularly watching Sky Sports News for hours on end (when it was on Freeview) and a successful lobbying effort to get our house subscribed to Sky Sports.\n", "\n", "A \"true fan\" is probably someone whose emotions on match days are strongly correlated with their team's showing on the pitch; they feel intensely the joy of the highs and the pain of the lows. As United's domestic and European success between 2006 and 2009 came to an end when I was about 16, it got to the point where I realised I would be happier if I took the conscious decision to \"mentally detach\" myself from supporting them. Although that may sound like a joke, I'm actually being dead serious!\n", "\n", "I didn't abandon football entirely - far from it. Football has been a continued interest of mine and a fruitful source of conversation, particularly as I wouldn't count myself a big film or TV buff. And as I have moved around since school, it has been fun to watch my local teams: York City, West Ham United, Curicó Unido (in Chile), Bristol City and, most recently, Tottenham Hotspur.\n", "\n", "I might not be a \"fan\", but I do feel some degree of belonging and affinity towards all of these clubs. And having worked as a steward at Tottenham since their return to N17 and told my colleagues I'm a Spurs supporter, maybe it is now appropriate to use the \"f word\" once more. 
I do appreciate most people will be reading all of this in horror!\n", "\n", "### Football club fan catchments\n", "\n", "As some light relief from working on my MSc thesis, I have applied my newly developed Python programming skills to the topic of football club fan catchments: the result is a map that shows the geographic catchment of almost every team in the Premier League and EFL(1).\n", "\n", "I created a [Voronoi diagram](https://en.wikipedia.org/wiki/Voronoi_diagram) dividing England up into 89 Thiessen polygons, where every possible point inside a given polygon is closer to one particular club's stadium than to any other(2). I must acknowledge I used a spreadsheet of stadiums from a blog post written by someone who undertook the same exercise a year ago, but I still claim some novelty for my version by virtue of the fact it is interactive(3)!\n", "\n", "I then went a step further by using Census Output Area data to estimate the population in each of these polygons. I have also calculated the average age of each catchment and the YouGov \"fame ranking\" of those for which it was available(4).\n", "\n", "The interactive map shows a Thiessen polygon for each club. The colours correspond by default to the population size of the catchment; the intervals for each colour can be seen in the legend. You can click the layer icon on the right-hand side to toggle to an average-age-based choropleth map (note you must also deselect population). Additional information is also displayed in the tooltip - this can be brought up by clicking on the desired club's map marker.\n", "\n", "I hope you find it interesting and would welcome any feedback in the comments section.\n", "\n", "\n", "(1) The Welsh clubs Swansea City, Cardiff City and Newport County (all of whom play in the English system) are excluded, leaving 89 clubs in total.\n", "\n", "(2) Based on the teams due to participate in the EFL/EPL in the 2019/20 season. 
The Coventry City catchment was calculated using the coordinates of the Ricoh Arena (their former stadium). Otherwise they would have a catchment identical to Birmingham City's, thanks to their current groundshare arrangement.\n", "\n", "(3) I also could not get the co-ordinates in his data to work, so I re-extracted these using the Wikipedia API.\n", "\n", "(4) There are no good statistics on actual fan base sizes other than for the top clubs. Although I think the survey is of UK residents, these YouGov rankings actually contain some European teams, which I have stripped out. I could also have scraped the number of Twitter followers for each club but decided life was too short!" ] }, { "cell_type": "code", "execution_count": 53, "metadata": { "scrolled": false }, "outputs": [ { "data": { "text/html": [ "
" ], "text/plain": [ "" ] }, "execution_count": 53, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Sources\n", "\n", "- Grounds for the 18/19 season from Safe.com(4): https://www.safe.com/blog/2018/08/fme-and-data-integration-evangelist177/\n", "- Coordinates using Wikipedia API to extract from each club page\n", "- Capacities scrapped from this Wikipedia page: https://en.wikipedia.org/wiki/List_of_football_stadiums_in_England\n", "- Fame rank from YouGov: https://yougov.co.uk/ratings/sport/popularity/football-clubs/all\n", "- Boundaries from Open Geography Portal: http://geoportal.statistics.gov.uk/\n", "- Population data from Nomis: https://www.nomisweb.co.uk/ " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Import packages & configuration" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [], "source": [ "import pandas as pd\n", "import numpy as np\n", "import geopandas as gpd\n", "import os\n", "import requests\n", "import pysal as ps\n", "from pysal.contrib.viz import mapping as maps\n", "import palettable as pltt\n", "from seaborn import palplot\n", "import zipfile\n", "import folium\n", "from shapely import geometry\n", "from shapely.geometry import Point\n", "import folium.plugins\n", "import re\n", "import wikipedia\n", "import matplotlib.pyplot as plt\n", "from libpysal.cg.voronoi import Voronoi, voronoi_frames\n", "from shapely.geometry import Polygon\n", "import requests\n", "from bs4 import BeautifulSoup\n", "\n", "pd.options.display.max_rows = 200\n", "\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Main club & stadium data" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [], "source": [ "# Read in data sets\n", "stadiums_data = pd.read_csv(\n", " \"EnglishFootballGrounds/PremiershipFootballGrounds.csv\")\n", "stadiums_data = 
stadiums_data.append(pd.read_csv(\n", "    \"EnglishFootballGrounds/ChampionshipFootballGrounds.csv\"))\n", "stadiums_data = stadiums_data.append(pd.read_csv(\n", "    \"EnglishFootballGrounds/LeagueOneFootballGrounds.csv\"))\n", "stadiums_data = stadiums_data.append(pd.read_csv(\n", "    \"EnglishFootballGrounds/LeagueTwoFootballGrounds.csv\"))\n", "\n", "# Only using team and stadium\n", "stadiums_data = stadiums_data[[\"Team\", \"Stadium\"]]" ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [], "source": [ "# Import English regions data to get the England boundary\n", "regions = gpd.read_file(\"regions.shp\", crs={'init': 'epsg:27700'})\n", "regions = regions.to_crs(epsg=4326)\n", "\n", "# Constant column so that dissolve merges everything into one polygon\n", "regions[\"dis\"] = 0\n", "\n", "# Dissolve boundaries\n", "regionz = regions.dissolve(by='dis')" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [], "source": [ "# Remove teams relegated from League Two at the end of the 2018/19 season\n", "stadiums_data = stadiums_data[stadiums_data.Team != \"Notts County\"]\n", "stadiums_data = stadiums_data[stadiums_data.Team != \"Yeovil Town\"]\n", "\n", "# Add teams promoted to League Two at the end of the 2018/19 season\n", "stadiums_data = stadiums_data.append(\n", "    {\"Team\": \"Salford City\", \"Stadium\": \"Moor Lane\"}, ignore_index=True)\n", "stadiums_data = stadiums_data.append(\n", "    {\"Team\": \"Leyton Orient\", \"Stadium\": \"Brisbane Road\"}, ignore_index=True)\n", "stadiums_data = stadiums_data.replace(np.nan, '', regex=True)\n", "\n", "# Remove Welsh teams\n", "stadiums_data = stadiums_data[stadiums_data.Team != \"Swansea City\"]\n", "stadiums_data = stadiums_data[stadiums_data.Team != \"Cardiff City\"]\n", "stadiums_data = stadiums_data[stadiums_data.Team != \"Newport County\"]\n", "\n", "# Correct spelling errors in the source data\n", "stadiums_data.Team = stadiums_data.Team.apply(lambda x: x.replace('and Hove', '& Hove'))\n", "stadiums_data.Team = stadiums_data.Team.apply(lambda x: x.replace('Middlesborough', 
'Middlesbrough'))\n", "stadiums_data.Team = stadiums_data.Team.apply(lambda x: x.replace('West Bromich Albion', 'West Bromwich Albion'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Get stadium coordinates" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [], "source": [ "# Function to remove extra characters in stadium name\n", "def rem_brack(x):\n", "    return re.sub(\"[\\(\\[].*?[\\)\\]]\", \"\", x)\n", "\n", "\n", "stadiums_data[\"Stadium_name\"] = stadiums_data.Stadium.apply(rem_brack)" ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [], "source": [ "# Exceptions where an incorrect stadium name prevented a Wikipedia match\n", "def resolve_names(stad_name):\n", "    if stad_name == \"Stamford Bridge\":\n", "        return \"Stamford Bridge (stadium)\"\n", "    elif stad_name == \"White Hart Lane\":\n", "        return \"Tottenham Hotspur Stadium\"\n", "    elif stad_name == \"Etihad Stadium\":\n", "        return \"City of Manchester Stadium\"\n", "    elif stad_name == \"Ashton Gate\":\n", "        return \"Ashton Gate Stadium\"\n", "    elif stad_name == \"Bramall Land\":\n", "        return \"Bramall Lane\"\n", "    elif stad_name == \"Hillsborough\":\n", "        return \"Hillsborough Stadium\"\n", "    elif stad_name == \"The Valley\":\n", "        return \"The Valley (London)\"\n", "    elif stad_name == \"St Andrews\":\n", "        return \"St Andrew's (Stadium)\"\n", "    elif stad_name == \"County Ground\":\n", "        return \"County Ground (Swindon)\"\n", "    elif stad_name == \"St James Park\":\n", "        return \"St James Park (Exeter)\"\n", "    elif stad_name == \"Broadhill Way\":\n", "        return \"Broadhall Way\"\n", "    elif stad_name == \"Globe Arena\":\n", "        return \"Globe Arena (football stadium)\"\n", "    else:\n", "        return stad_name\n", "\n", "stadiums_data.Stadium_name = stadiums_data.Stadium_name.apply(resolve_names)" ] }, { "cell_type": "code", "execution_count": 27, "metadata": {}, "outputs": [], "source": [ "# Get coordinates from the Wikipedia API\n", "no_match = []\n", 
"\n", "\n", "def get_coords(name):\n", " # Exceptions where no coordinates in page name\n", " if name == \"Vale Park\":\n", " coords = (53.049829, -2.192612)\n", " elif name == \"Memorial Stadium\":\n", " coords = (51.486421, -2.583098)\n", " elif name == \"Bloomfield Road\":\n", " coords = (53.805080, -3.048066)\n", " elif name == \"Moor Lane\":\n", " coords = (53.513631, -2.276775)\n", " elif name == \"Anfield\": #Someone editing the Wikipedia page\n", " coords = (53.4252, -2.9565)\n", " else:\n", " try:\n", " coords = wikipedia.WikipediaPage(title=name).coordinates\n", " except:\n", " no_match.append(name)\n", " coords = \"not found\"\n", " return coords\n", "\n", "\n", "stadiums_data[\"geometryA\"] = stadiums_data.Stadium_name.apply(get_coords)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## YouGov fame data" ] }, { "cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [], "source": [ "# https://yougov.co.uk/ratings/sport/popularity/football-clubs/all\n", "fame = pd.read_csv(\"yougov_fame.csv\")" ] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [], "source": [ "# Clean data\n", "fame.Team = fame.Team.apply(lambda x: x.replace(\" F.C.\", \"\"))\n", "fame.Team = fame.Team.apply(lambda x: x.replace(\" F.C\", \"\"))\n", "fame.Team = fame.Team.apply(lambda x: x.replace(\" F. C.\", \"\"))\n", "fame.Team = fame.Team.apply(lambda x: x.replace(\" A.F.C.\", \"\"))\n", "fame.Team = fame.Team.apply(lambda x: x.replace(\"A.F.C. 
\", \"\"))" ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [], "source": [ "# Merge with main stadiums data\n", "stadiums_data = stadiums_data.merge(fame, how=\"left\")\n", "stadiums_data = stadiums_data.sort_values(\"rank\")\n", "stadiums_data = stadiums_data.reset_index(drop=True)\n", "stadiums_data = stadiums_data.reset_index()\n", "stadiums_data = stadiums_data.rename(columns={\"index\": \"rank_en\"})\n", "stadiums_data.rank_en = stadiums_data.rank_en.apply(lambda x: x+1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Wikipedia capacity data" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [], "source": [ "# Get stadium capacity data from Wikipedia API\n", "website_url = requests.get(\n", " \"https://en.wikipedia.org/wiki/List_of_football_stadiums_in_England\")\n", "soup = BeautifulSoup(website_url.text)\n", "My_table = soup.find('table', {'class': 'wikitable sortable'})" ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [], "source": [ "# Function to get the stadium name\n", "def stad_name(texty):\n", " match = re.search(r'(title=\").+\"{1}', str(texty))\n", " if match:\n", " subbed = re.sub('\"', '', str(match.group()))\n", " subbed = re.sub('title=', '', str(subbed))\n", " return subbed\n", " else:\n", " return \"None\"\n", "\n", "cap_data = pd.DataFrame(pd.Series(My_table.findAll(\"tr\")))\n", "cap_data.columns = [\"parsed\"]\n", "cap_data[\"Stadium_name_alt\"] = cap_data.parsed.apply(stad_name)" ] }, { "cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [], "source": [ "# Remove header row\n", "cap_data = cap_data.drop(0)\n", "# Remove extra rows where ground share\n", "cap_data = cap_data.query(\n", " \"Stadium_name_alt not in ['Reading F.C. 
Women', 'Yeovil Town L.F.C.', 'Portsmouth_F.C._Ladies', 'London Bees']\")" ] }, { "cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [], "source": [ "# Extract capacity from each parsed table row. NB: parts of this cell and the\n", "# two that follow were mangled by HTML stripping; the regex and clean-up steps\n", "# below are a best-effort reconstruction of the original.\n", "def cap(texty):\n", "    match = re.search(r'(\">\d)(\d|,)(\d|,).+?(<)', str(texty))\n", "    if match:\n", "        subbed = re.sub('<.*?>', '', str(match.group()))\n", "        subbed = re.sub('♠\">', '', subbed)\n", "        subbed = re.sub('\">', '', subbed)\n", "        subbed = re.sub(' ', '', subbed)\n", "        subbed = re.sub('<', '', subbed)\n", "        return subbed\n", "    else:\n", "        return \"None\"\n", "\n", "cap_data[\"capacity\"] = cap_data.parsed.apply(cap)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# (Reconstructed) Merge capacities into the main data frame and split the\n", "# Wikipedia (lat, lon) tuples into numeric columns. The exact original merge\n", "# key is unknown; matching on stadium name is assumed here.\n", "stadiums_data = stadiums_data.merge(\n", "    cap_data[[\"Stadium_name_alt\", \"capacity\"]],\n", "    left_on=\"Stadium_name\", right_on=\"Stadium_name_alt\", how=\"left\")\n", "stadiums_data[\"lat\"] = stadiums_data.geometryA.apply(lambda c: float(c[0]))\n", "stadiums_data[\"lon\"] = stadiums_data.geometryA.apply(lambda c: float(c[1]))" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Finite-region Voronoi recipe (the head of this function was lost to HTML\n", "# stripping and is restored from the well-known SciPy recipe)\n", "def voronoi_finite_polygons_2d(vor, radius=None):\n", "    \"\"\"\n", "    Reconstruct infinite voronoi regions in a 2D diagram to finite regions.\n", "    \"\"\"\n", "    if vor.points.shape[1] != 2:\n", "        raise ValueError(\"Requires 2D input\")\n", "\n", "    new_regions = []\n", "    new_vertices = vor.vertices.tolist()\n", "\n", "    center = vor.points.mean(axis=0)\n", "    if radius is None:\n", "        radius = vor.points.ptp().max()\n", "\n", "    # Construct a map containing all ridges for a given point\n", "    all_ridges = {}\n", "    for (p1, p2), (v1, v2) in zip(vor.ridge_points, vor.ridge_vertices):\n", "        all_ridges.setdefault(p1, []).append((p2, v1, v2))\n", "        all_ridges.setdefault(p2, []).append((p1, v1, v2))\n", "\n", "    # Reconstruct infinite regions\n", "    for p1, region in enumerate(vor.point_region):\n", "        vertices = vor.regions[region]\n", "\n", "        if all(v >= 0 for v in vertices):\n", "            # finite region\n", "            new_regions.append(vertices)\n", "            continue\n", "\n", "        # reconstruct a non-finite region\n", "        ridges = all_ridges[p1]\n", "        new_region = [v for v in vertices if v >= 0]\n", "\n", "        for p2, v1, v2 in ridges:\n", "            if v2 < 0:\n", "                v1, v2 = v2, v1\n", "            if v1 >= 0:\n", "                # finite ridge: already in the region\n", "                continue\n", "\n", "            # Compute the missing endpoint of an infinite ridge\n", "            t = vor.points[p2] - vor.points[p1]  # tangent\n", "            t /= np.linalg.norm(t)\n", "            n = np.array([-t[1], t[0]])  # normal\n", "\n", "            midpoint = vor.points[[p1, p2]].mean(axis=0)\n", "            direction = np.sign(np.dot(midpoint - center, n)) * n\n", "            far_point = vor.vertices[v2] + direction * radius\n", "\n", "            new_region.append(len(new_vertices))\n", "            new_vertices.append(far_point.tolist())\n", "\n", "        # sort region counterclockwise\n", "        vs = np.asarray([new_vertices[v] for v in new_region])\n", "        c = vs.mean(axis=0)\n", "        angles = np.arctan2(vs[:, 1] - c[1], vs[:, 0] - c[0])\n", "        new_region = np.array(new_region)[np.argsort(angles)]\n", "\n", "        # finish\n", "        new_regions.append(new_region.tolist())\n", "\n", "    return new_regions, np.asarray(new_vertices)" ] }, { "cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [], "source": [ "# Create normal voronoi\n", "vor = Voronoi(np.c_[stadiums_data.lon, stadiums_data.lat])\n", "\n", "# Finite region voronoi\n", "regions, vertices = voronoi_finite_polygons_2d(vor)\n", "poly = geometry.Polygon(vertices)" ] }, { "cell_type": "code", 
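"execution_count": null, "metadata": {}, "outputs": [], "source": [ "# A minimal sketch (toy coordinates, not the real data) of the clipping idea\n", "# applied below: an oversized Voronoi cell is intersected with a boundary\n", "# polygon, just as the catchments are later cut to the England outline with\n", "# gpd.overlay.\n", "toy_region = Polygon([(-10, -10), (10, -10), (10, 10), (-10, 10)])  # 'infinite' cell, already extended\n", "toy_boundary = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])  # stand-in for the England boundary\n", "toy_region.intersection(toy_boundary).area  # 16.0" ] }, { "cell_type": "code", 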
"execution_count": 40, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/student/anaconda3/envs/cusplondon/lib/python3.6/site-packages/geopandas/base.py:29: UserWarning: GeoSeries crs mismatch: {'init': 'epsg:4326'} and {'init': 'epsg:4326', 'no_defs': True}\n", " other.crs))\n" ] } ], "source": [ "# Create list of vertices in each polygon\n", "cells = [vertices[region] for region in regions]\n", "listy = []\n", "for cell in cells:\n", " polygon = Polygon(cell)\n", " listy.append(polygon)\n", "\n", "# New column in data frame with geometry\n", "stadiums_data[\"geometry\"] = listy\n", "stadiums_data = stadiums_data.set_geometry('geometry')\n", "stadiums_data.crs = {'init': 'epsg:4326'}\n", "\n", "# Use England boundary to cut\n", "stadiums_data = gpd.overlay(stadiums_data, regionz, how='intersection')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Resolving multipolygon issue: https://github.com/geopandas/geopandas/issues/834" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [], "source": [ "upcast_dispatch = {geometry.Point: geometry.MultiPoint,\n", " geometry.LineString: geometry.MultiLineString,\n", " geometry.Polygon: geometry.MultiPolygon}\n", "\n", "\n", "def maybe_cast_to_multigeometry(geom):\n", " caster = upcast_dispatch.get(type(geom), lambda x: x[0])\n", " return caster([geom])\n", "\n", "stadiums_data.geometry = stadiums_data.geometry.apply(maybe_cast_to_multigeometry)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Population attributes" ] }, { "cell_type": "code", "execution_count": 42, "metadata": {}, "outputs": [], "source": [ "# Get Census output area centroids from ONS Open Geography\n", "r = requests.get(\n", " \"https://opendata.arcgis.com/datasets/ba64f679c85f4563bfff7fad79ae57b1_0.geojson\")\n", "oa_centroids_data = r.json()\n", "centroids = gpd.GeoDataFrame.from_features(oa_centroids_data['features'], crs={'init': 'epsg:27700'})" ] }, { 
"cell_type": "code", "execution_count": 43, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/student/anaconda3/envs/cusplondon/lib/python3.6/site-packages/IPython/core/interactiveshell.py:3020: DtypeWarning: Columns (1) have mixed types. Specify dtype option on import or set low_memory=False.\n", " interactivity=interactivity, compiler=compiler, result=result)\n" ] } ], "source": [ "# Import population and age data by output area (downloaded from www.nomisweb.co.uk)\n", "pop = pd.read_csv(\"oa_pop.csv\", header=6)\n", "\n", "# Merge LSOA boundaries with population\n", "centroids = centroids.merge(pop, left_on=\"oa11cd\", right_on=\"2011 output area\")" ] }, { "cell_type": "code", "execution_count": 44, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/student/anaconda3/envs/cusplondon/lib/python3.6/site-packages/geopandas/tools/sjoin.py:44: UserWarning: CRS of frames being joined does not match!\n", " warn('CRS of frames being joined does not match!')\n" ] } ], "source": [ "# Spatial join to allocate output area centroids to club catchment areas\n", "team_geom = stadiums_data[[\"Team\", \"geometry\"]]\n", "centroids = gpd.sjoin(team_geom, centroids)\n", "\n", "# Convert population data to numeric and group by to get single instance for each club\n", "centroids[\"All usual residents\"] = pd.to_numeric(centroids[\"All usual residents\"])\n", "pop_sum = pd.DataFrame(centroids.groupby(by=\"Team\")[\n", " \"All usual residents\"].sum()).reset_index()" ] }, { "cell_type": "code", "execution_count": 45, "metadata": {}, "outputs": [], "source": [ "# Function to get weighted average through groupby\n", "def weighted_average(df, df_col, weight_col, by_col):\n", " df['_df_times_weight'] = df[df_col]*df[weight_col]\n", " df['_weight_where_notnull'] = df[weight_col]*pd.notnull(df[df_col])\n", " g = df.groupby(by_col)\n", " result = g['_df_times_weight'].sum() / 
g['_weight_where_notnull'].sum()\n", " del df['_df_times_weight'], df['_weight_where_notnull']\n", " return result" ] }, { "cell_type": "code", "execution_count": 46, "metadata": {}, "outputs": [], "source": [ "# Group by club with population-weighted average age\n", "mean_age = pd.DataFrame(weighted_average(\n", "    centroids, \"Mean Age\", \"All usual residents\", \"Team\")).reset_index()\n", "mean_age.columns = [\"Team\", \"Mean age\"]" ] }, { "cell_type": "code", "execution_count": 47, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/Users/student/anaconda3/envs/cusplondon/lib/python3.6/site-packages/geopandas/io/file.py:108: FionaDeprecationWarning: Use fiona.Env() instead.\n", " with fiona.drivers():\n" ] } ], "source": [ "# Combine the above into a new data frame and export to GeoJSON\n", "stadiums_data2 = team_geom.merge(pop_sum, how=\"left\")\n", "stadiums_data2 = stadiums_data2.merge(mean_age, how=\"left\")\n", "stadiums_data2.columns = [\"Team\", \"geometry\", \"population\", \"average_age\"]\n", "stadiums_data2.average_age = stadiums_data2.average_age.round(1)\n", "stadiums_data2.to_file(\"club_catch.geojson\", driver='GeoJSON')\n", "club_catch = 'club_catch.geojson'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Build map" ] }, { "cell_type": "code", "execution_count": 48, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "" ] }, "execution_count": 48, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# Folium map\n", "m = folium.Map(location=[52, -1], zoom_start=5, zoom_control=False)\n", "# Stadium markers\n", "for i in range(0, len(stadiums_data2)):\n", "    folium.Marker(stadiums_data.iloc[i]['geometryA'],\n", "                  icon=folium.Icon(color=\"purple\", prefix='fa', icon=\"fa-futbol-o\"),\n", "                  popup=str(\n", "        \"
Club: \" + str(stadiums_data2.iloc[i]['Team']) + \"
Stadium name: \" +\n", " stadiums_data.iloc[i]['Stadium_name']) +\n", " \"
Capacity: \" + str(stadiums_data.iloc[i]['capacity']) +\n", " \"
Population: \" + str(stadiums_data2.iloc[i]['population']) +\n", " \"
Average age: \" + str(stadiums_data2.iloc[i]['average_age'])+\n", " \"
YouGov fame rank: \" + str(stadiums_data.iloc[i]['rank_en'])).add_to(m)\n", "\n", "# Choropleth polygons\n", "\n", "folium.Choropleth(\n", " geo_data=club_catch,\n", " name='Average age',\n", " data=stadiums_data2[[\"Team\", \"average_age\"]],\n", " columns=['Team', 'average_age'],\n", " key_on='feature.properties.Team',\n", " fill_color='YlGn',\n", " fill_opacity=0.8,\n", " line_opacity=0.8,\n", " legend_name='average_age'\n", ").add_to(m)\n", "\n", "folium.Choropleth(\n", " geo_data=club_catch,\n", " name='Population',\n", " data=stadiums_data2[[\"Team\", \"population\"]],\n", " columns=['Team', 'population'],\n", " key_on='feature.properties.Team',\n", " fill_color='YlGn',\n", " fill_opacity=0.8,\n", " line_opacity=0.8,\n", " legend_name='population'\n", ").add_to(m)\n", "\n", "folium.LayerControl().add_to(m)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Google Chrome map display fix: https://github.com/python-visualization/folium/issues/812" ] } ], "metadata": { "kernelspec": { "display_name": "cusplondon", "language": "python", "name": "cusplondon" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.7" }, "toc": { "base_numbering": "1", "nav_menu": {}, "number_sections": false, "sideBar": true, "skip_h1_title": false, "title_cell": "Table of Contents", "title_sidebar": "Contents", "toc_cell": false, "toc_position": {}, "toc_section_display": true, "toc_window_display": true }, "varInspector": { "cols": { "lenName": 16, "lenType": 16, "lenVar": 40 }, "kernels_config": { "python": { "delete_cmd_postfix": "", "delete_cmd_prefix": "del ", "library": "var_list.py", "varRefreshCmd": "print(var_dic_list())" }, "r": { "delete_cmd_postfix": ") ", "delete_cmd_prefix": "rm(", "library": "var_list.r", "varRefreshCmd": "cat(var_dic_list()) " } }, "types_to_exclude": [ "module", 
"function", "builtin_function_or_method", "instance", "_Feature" ], "window_display": false } }, "nbformat": 4, "nbformat_minor": 2 }