{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Workshop Mumbai, July 2018\n", "\n", "The goal of this 4h webinar is to familiarize you with some basic neuroimaging software packages. Keep in mind that the approaches we show here are tailored to this workshop. We had to cut some corners and leave some analysis steps out, as they would otherwise take too much time. Remember this when you apply what you learn here to your own data.\n", "\n", "All slides and notebooks of this webinar can be found on the [Workshop Mumbai](https://github.com/miykael/workshop_mumbai) GitHub page. For a much more in-depth tutorial on [Nipype](http://nipype.readthedocs.io/en/latest/), take a look at the [Nipype Tutorial](https://miykael.github.io/nipype_tutorial/). Also, take a look at the [Workshop Cambridge](https://github.com/miykael/workshop_cambridge) GitHub page, which covers many of the presented concepts in much more detail.\n", "\n", "### Important\n", "If you're running this notebook through a Docker container, make sure that you used the `-v` flag to mount a folder on your system inside the container. This way you will keep access to any changes to the notebooks and any output that you create within the container. For more details, take a look at the [Docker information](https://github.com/miykael/workshop_cambridge#docker)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Schedule\n", "\n", "## Conda and Docker (~20min)\n", "\n", "There are many ways to create the right computational environment for your research. But if you want to use the newest technologies, you will hardly get around Conda or Docker.\n", "\n", " - Slides: [Conda and Docker](slides/conda_and_docker.html)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Tutorial Dataset (~10min)\n", "\n", "The dataset we use in this workshop differs from the ones used in previous workshops and focuses on resting-state data. 
We will be using part of the dataset from [Zang et al.](https://doi.org/10.1007/s12021-013-9187-0). The original data can be downloaded [here](http://fcon_1000.projects.nitrc.org/indi/retro/BeijingEOEC.html).\n", "\n", "The Docker container for this workshop already contains some subjects of this dataset. The functional data were reduced to two runs per subject of 100 volumes each. This was done to keep the data size and the later pre-processing time to a minimum.\n", "\n", "- Notebook: [Dataset](notebooks/introduction_dataset.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Nilearn and Nibabel (~30min)\n", "\n", "It's liberating to have direct access to your neuroimaging data. `Nibabel` and `Nilearn` allow exactly that. With those two neuroimaging packages you can treat the brain as a simple 3D/4D matrix of data points and do with it whatever you want.\n", "\n", "- Slides: [Data Manipulation](slides/data_manipulation.html)\n", "- Notebook 1: [Nibabel](notebooks/basic_nibabel.ipynb)\n", "- Notebook 2: [Nilearn](notebooks/basic_nilearn.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Introduction to Nipype (~20min)\n", "\n", "In this short introduction, we will show you what Nipype is and why you should use it. 
The accompanying notebook contains a short but powerful example that demonstrates the strength of Nipype.\n", "\n", " - Slides: [Short introduction to Nipype](https://miykael.github.io/nipype_tutorial/notebooks/introduction_nipype.html)\n", " - Notebook: [Nipype Showcase](notebooks/introduction_showcase.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Nipype Basics (~40min)\n", "\n", "Nipype can be learned very quickly, but it's nonetheless important that you know about some of its main building blocks.\n", "\n", " - Slides: [Interfaces & Workflows](slides/nipype_basics.html)\n", " - Notebook: [Nodes](notebooks/basic_nodes.ipynb)\n", " - Notebook: [Interfaces](notebooks/basic_interfaces.ipynb)\n", " - Notebook: [Workflows](notebooks/basic_workflow.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Nipype Hands-on (~60min)\n", "\n", "The goal of this section is for you to build your own pre-processing pipeline with Nipype. We want to highlight again that this pipeline is only an abbreviated version of a real fMRI pre-processing pipeline. We took some shortcuts to keep the computation time to a minimum. For a more realistic example, check out the hands-on example from the [Nipype Tutorial](https://miykael.github.io/nipype_tutorial/) or the fully automatic, state-of-the-art pre-processing pipeline [fmriprep](http://fmriprep.readthedocs.io/en/latest/).\n", "\n", "- Notebook: [Pre-Processing Hands-On](notebooks/handson_preprocessing.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Machine Learning with Nilearn, PyMVPA and Keras (~60min)\n", "\n", "The goal of this section is to introduce you to some neuroimaging machine learning tools. Keep in mind that the examples are constructed so that they require only little computation time. This means the specific approaches and classifiers used might not be the best for your own research question. 
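To give a flavor of how small such a decoding example can be kept, here is a hedged, purely synthetic sketch: the random NumPy arrays stand in for real fMRI features, and a simple nearest-centroid rule stands in for the classifiers used in the notebooks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 'voxel' data: 20 samples x 50 features, two classes
X = rng.normal(size=(20, 50))
y = np.repeat([0, 1], 10)
X[y == 1] += 1.0  # shift class 1 so the two classes are separable

# Nearest-centroid rule: assign each sample to the closest class mean
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == y).mean()
```

On data this clean the toy rule separates the classes almost perfectly -- which is exactly the caveat: real fMRI decoding is much harder. 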
Also, we are not experts in machine learning and therefore cannot guarantee that everything is done optimally. But we hope to point you in the right direction, so that you can explore this topic further on your own.\n", "\n", "Before we can start with anything, we first need to prepare the dataset for machine learning:\n", "\n", "- Notebook 1: [Preparation](notebooks/machine_learning_preparation.ipynb)\n", "\n", "Now that this is done, let's take a look at the following machine-learning packages:\n", "\n", "- Notebook 2: [Nilearn & PyMVPA](notebooks/machine_learning_nilearn_and_pymvpa.ipynb)\n", "- Notebook 3: [Keras](notebooks/machine_learning_keras.ipynb)" ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python [default]", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.4" } }, "nbformat": 4, "nbformat_minor": 2 }