{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "A6iFUUQLNDlE" }, "source": [
 "[Run in Google Colab](https://colab.research.google.com/github/yy/dviz-course/blob/master/m01-intro/lab01.ipynb) | [View on Github](https://github.com/yy/dviz-course/blob/master/m01-intro/lab01.ipynb) | [View raw on Github](https://raw.githubusercontent.com/yy/dviz-course/master/m01-intro/lab01.ipynb)\n"
 ] }, { "cell_type": "markdown", "metadata": { "id": "9p-FFN_yNDlH" }, "source": [
 "# Module 1: Python environment setup\n", "\n",
 "## Overview of the Python Visualization Toolkit\n", "\n",
 "These two talks provide a good overview of the Python visualization landscape.\n", "\n",
 "- Jake VanderPlas at PyCon 2017: [The Python Visualization Landscape](https://www.youtube.com/watch?v=FytuB8nFHPQ).\n",
 "- James Bednar at AnacondaCon 2018: [PyViz: Dashboards for Visualizing 1 Billion Datapoints in 30 Lines of Python](https://www.youtube.com/watch?v=k27MJJLJNT4).\n"
 ] }, { "cell_type": "markdown", "metadata": { "id": "oKQOR0agNDlI" }, "source": [
 "## Python environment setup\n", "\n",
 "When you work on data visualizations, there are two main choices: local and cloud. Working locally means installing packages on your own computer; working in the cloud means using services such as Google Colaboratory.\n", "\n",
 "A local installation can be nice and convenient (when it works) because everything lives on your computer. You can work without a network connection, and everything can be configured exactly as you see fit. However, a local installation can also be a huge source of pain: there are multiple ways to install packages (e.g., Anaconda, the system Python, etc.), and they can interfere with each other and break. This \"dependency hell\" can be extremely frustrating. Working locally is also limited by the power of your computer. For instance, you may not be able to run a heavy computational task because your laptop is 10 years old, or because you refuse to delete a game to make room for a huge dataset.\n", "\n",
 "On the other hand, working in a cloud environment can also be nice and convenient (when it works). You don't need to worry about installation and dependency hell, as long as the environment the cloud provider offers is adequate. All computation happens in the cloud, so your computer will not suffer. But that also means you can only work within the boundaries set by the cloud provider. For instance, the free tier of Google Colaboratory has limits on memory and disk usage, and to go beyond them you have to pay.\n", "\n",
 "My recommendation is the following: if you are comfortable managing your computer, multiple Python installations, and packages, or if you want to learn system management, go with the local option. You can use either Anaconda or vanilla Python (with `venv` and `pip`). If you don't want to spend time on this, use Google Colaboratory.\n", "\n",
 "### Local setup\n", "\n",
 "If you don't have much experience with the Python ecosystem, we recommend using Anaconda, which makes most data analysis tasks easier. But it is totally fine not to use Anaconda.\n", "\n",
 "#### Setting up an Anaconda environment\n", "\n",
 "Anaconda is a popular, all-in-one Python package distribution system. If your work is focused on data analysis, machine learning, visualization, etc., then sticking with Anaconda is probably the most convenient way to manage multiple Python versions, packages, and virtual environments. Anaconda comes with a package manager called `conda` that replaces `pip` and `virtualenv`. It allows you to install packages, create virtual environments, and more. It even lets you set a different Python version for each virtual environment. If you use Anaconda, you want to stick to it as much as possible by exclusively using `conda` (use `pip` only when there is no way to install with `conda`).\n",
 "\n",
 "First, download Anaconda for your system from [here](https://www.anaconda.com/download). Follow the instructions and reopen your shell (or use the Anaconda GUI). By default, Anaconda activates the `base` environment. You can install packages by running\n", "\n",
 "```\n",
 "$ conda install packagename\n",
 "```\n", "\n",
 "Let's install the core Python data packages.\n", "\n",
 "```\n",
 "$ conda install numpy scipy pandas scikit-learn matplotlib seaborn jupyter jupyterlab\n",
 "```\n", "\n",
 "You usually want to use a separate virtual environment for each of your projects. By using virtual environments, you can isolate each environment from the others and maintain separate sets (and versions) of packages. `conda` has built-in support for virtual environments.\n", "\n",
 "```\n",
 "$ conda create -n dviz python=3.8\n",
 "```\n", "\n",
 "This command creates a virtual environment named (`-n`) `dviz` with Python 3.8. You can activate the environment (whenever you begin to work on this course) by running\n", "\n",
 "```\n",
 "$ conda activate dviz\n",
 "```\n", "\n",
 "and deactivate it (when you're done) by running\n", "\n",
 "```\n",
 "$ conda deactivate\n",
 "```\n", "\n",
 "For the full documentation, see https://conda.io/docs/user-guide/tasks/manage-environments.html\n", "\n",
 "When the environment is activated, your prompt shows the name of the environment. Now, if you install a package, it will be installed into this environment but not into the other virtual environments. In other words, each virtual environment maintains its own \"world\", and completely different packages can be installed in each \"world\".\n", "\n",
 "But even if you have created a virtual environment and installed all the necessary packages, you can't use this environment in Jupyter Notebook/Lab yet. Jupyter needs a special program called an \"IPython kernel\" to execute code, so for each virtual environment we also need to create a corresponding kernel. We can do this manually with the `ipykernel` package as follows:\n", "\n",
 "```\n",
 "$ conda activate dviz\n",
 "$ conda install ipykernel\n",
 "$ python -m ipykernel install --user --name=dviz\n",
 "```\n", "\n",
 "or we can install the [`nb_conda`](https://github.com/Anaconda-Platform/nb_conda) package.\n", "\n",
 "```\n",
 "$ conda install nb_conda\n",
 "```\n", "\n",
 "Once it is installed, kernels for all virtual environments created by `conda` are registered automatically and show up in Jupyter, which makes this the more convenient option.\n", "\n",
 "#### `pip` + `venv`\n", "\n",
 "If you are not using Anaconda, you can use Python's built-in `venv` module to create virtual environments and `pip` to install packages into them. See https://docs.python.org/3/tutorial/venv.html. For instance, you can run the following to create a virtual environment in the current directory (it will create a `dviz-env` folder).\n", "\n",
 "```\n",
 "$ python3 -m venv dviz-env\n",
 "```\n", "\n",
 "and then activate and deactivate it with\n", "\n",
 "```\n",
 "$ source dviz-env/bin/activate\n",
 "$ deactivate\n",
 "```\n", "\n",
 "Once you activate a virtual environment, you can install packages into it with `pip`.\n", "\n",
 "```\n",
 "$ pip install packagename\n",
 "```\n",
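 "\n",
 "If you are ever unsure whether a virtual environment is actually active (and therefore where `pip` will put packages), you can check from inside Python. Below is a minimal sketch using only the standard library: in a `venv`, `sys.prefix` points at the environment directory, while `sys.base_prefix` points at the interpreter the environment was created from.\n", "\n",
 "```python\n",
 "import sys\n",
 "\n",
 "# In an active venv, sys.prefix differs from sys.base_prefix.\n",
 "in_venv = sys.prefix != sys.base_prefix\n",
 "print(\"virtual environment active:\", in_venv)\n",
 "print(\"packages will be installed under:\", sys.prefix)\n",
 "```\n",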
 "\n",
 "As with `conda`, we need to install a kernel for each virtual environment in order to use it in Jupyter.\n", "\n",
 "```\n",
 "$ source dviz-env/bin/activate\n",
 "$ pip install ipykernel\n",
 "$ python -m ipykernel install --user --name=dviz\n",
 "```\n", "\n",
 "#### Jupyter\n", "\n",
 "Once you have set up your local environment, you can run\n", "\n",
 "```\n",
 "$ jupyter lab\n",
 "```\n", "\n",
 "or use Jupyter Notebook.\n", "\n",
 "```\n",
 "$ jupyter notebook\n",
 "```\n", "\n",
 "JupyterLab is the 'next generation' notebook system with many powerful features. Some packages that we use work more nicely with JupyterLab (although for some lab assignments you may need to use Jupyter Notebook instead).\n", "\n",
 "Jupyter also has a desktop app: https://github.com/jupyterlab/jupyterlab-desktop\n", "\n",
 "#### VS Code\n", "\n",
 "[Visual Studio Code](https://code.visualstudio.com/) is a powerful editor that supports notebooks and interactive Python. If you open a Jupyter notebook file, it works seamlessly within VS Code. With the [Remote - SSH plugin](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-ssh), you can also get an almost identical experience while working on a remote computing server.\n", "\n",
 "### Cloud setup\n", "\n",
 "There are many cloud-based Jupyter notebook services. The best option is probably Google Colaboratory. It allows you to install packages, and you can even use GPUs/TPUs!\n", "\n",
 "#### Google Colaboratory\n", "\n",
 "[Google Colaboratory](https://colab.research.google.com/) is Google's collaborative Jupyter notebook service. This is the recommended cloud option. Most packages that we use are already installed, and you can install additional packages by running\n", "\n",
 "```\n",
 "!pip install packagename\n",
 "```\n", "\n",
 "in a code cell.\n", "\n",
 "#### Other options\n", "\n",
 "- Azure Notebooks: Microsoft also has a cloud notebook service called [Azure Notebooks](https://notebooks.azure.com/). This service also allows installing new packages through `!pip install ...`.\n",
 "- CoCalc (https://cocalc.com/) is a service by [SageMath](http://www.sagemath.org/). You can use it for free, but the free version is slow and can be shut down without warning. Most of the packages that we use are pre-installed. We may be able to provide a subscription through the school.\n",
 "- Kaggle Kernels: The famous machine learning / data science competition service Kaggle offers cloud-based notebooks called [Kaggle Kernels](https://www.kaggle.com/kernels). Because you can directly use all the Kaggle datasets, it is an excellent option for your project if you use one of them. It allows you to upload your own datasets and install some packages, but not all packages are supported.\n", "\n",
 "## Assignment\n", "\n",
 "You can use either a local environment or a cloud environment.\n", "\n",
 "If you want to use a local environment,\n", "\n",
 "1. Set up your local Python environment following the instructions above. You should be using a virtual environment on your local machine.\n",
 "1. Install Jupyter Notebook and JupyterLab.\n",
 "1. Launch Jupyter Notebook (or JupyterLab).\n",
 "1. Create a new notebook and play with it. Print \"Hello world!\".\n", "\n",
 "If you want to use a cloud environment,\n", "\n",
 "1. Play with Google Colaboratory.\n",
 "1. Try installing/importing the packages listed below in the notebook.\n",
 "1. Create a cell that prints \"Hello world!\".\n",
 "1. Submit the notebook.\n",
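 "\n",
 "A note on installing packages from inside a notebook (locally or on Colab): the bare `pip` on your PATH is not always the `pip` that belongs to the kernel you are running, so a common trick is to call pip through the kernel's own interpreter. Here is a minimal sketch; the helper name and the example package are just illustrations.\n", "\n",
 "```python\n",
 "import subprocess\n",
 "import sys\n",
 "\n",
 "def pip_install(package):\n",
 "    \"\"\"Install `package` into the environment of the running kernel.\"\"\"\n",
 "    # sys.executable is the interpreter behind this kernel, so its pip\n",
 "    # installs into the same environment the notebook imports from.\n",
 "    subprocess.check_call([sys.executable, \"-m\", \"pip\", \"install\", package])\n",
 "\n",
 "pip_install(\"wordcloud\")  # e.g., one of the packages used in this course\n",
 "```\n",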
 "\n",
 "Finally, these are the packages that we plan to use, so check out their homepages and figure out what they are about.\n", "\n",
 "- Jupyter Notebook and Lab: https://jupyter.org/\n",
 "- numpy: http://www.numpy.org/\n",
 "- scipy: http://www.scipy.org/\n",
 "- matplotlib: http://matplotlib.org/\n",
 "- seaborn: http://seaborn.pydata.org/\n",
 "- pandas: http://pandas.pydata.org/\n",
 "- scikit-learn: http://scikit-learn.org/stable/\n",
 "- altair: https://github.com/altair-viz/altair\n",
 "- vega_datasets: https://github.com/altair-viz/vega_datasets\n",
 "- bokeh: http://bokeh.pydata.org/en/latest/\n",
 "- datashader: http://datashader.org/\n",
 "- holoviews: http://holoviews.org/\n",
 "- wordcloud: https://github.com/amueller/word_cloud\n",
 "- spacy: https://spacy.io/\n", "\n",
 "Install them using your package manager (conda or pip).\n", "\n",
 "Once you have installed Jupyter locally or succeeded with a cloud environment, run the following cells to make sure that every package is installed successfully. Submit the notebook on Canvas.\n"
 ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "2z4vxHkNOKvz" }, "outputs": [], "source": [
 "!pip install numpy scipy matplotlib seaborn pandas altair vega_datasets scikit-learn bokeh datashader holoviews wordcloud spacy"
 ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "BabmrjoRNDlN" }, "outputs": [], "source": [
 "import numpy\n", "import scipy\n", "import matplotlib\n", "import seaborn\n", "import pandas\n", "import altair\n", "import vega_datasets\n", "import sklearn\n", "import bokeh\n", "import datashader\n", "import holoviews\n", "import wordcloud\n", "import spacy"
 ] }, { "cell_type": "code", "execution_count": null, "metadata": { "id": "nhy-_f8eNDlO", "inputHidden": false, "jupyter": { "outputs_hidden": false }, "outputHidden": false }, "outputs": [], "source": [] } ],
 "metadata": { "anaconda-cloud": {}, "colab": { "provenance": [ { "file_id": "https://github.com/yy/dviz-course/blob/master/m01-intro/lab01.ipynb", "timestamp": 1683497199468 } ] }, "kernel_info": { "name": "dviz" }, "kernelspec": { "display_name": "dviz", "language": "python", "name": "dviz" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.10" }, "nteract": { "version": "0.14.5" } }, "nbformat": 4, "nbformat_minor": 0 }