{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# scona\n", "\n", "scona is a tool to perform network analysis over correlation networks of brain regions. \n", "This tutorial will go through the basic functionality of scona, taking us from our inputs (a matrix of structural regional measures over subjects) to a report of local network measures for each brain region, and network level comparisons to a cohort of random graphs of the same degree. " ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import networkx as nx\n", "import scona as scn\n", "import scona.datasets as datasets" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Importing data\n", "\n", "A scona analysis starts with four inputs.\n", "* __regional_measures__\n", " A pandas DataFrame with subjects as rows. The columns should include structural measures for each brain region, as well as any subject-wise covariates. \n", "* __names__\n", " A list of names of the brain regions. This will be used to specify which columns of the __regional_measures__ matrix to want to correlate over.\n", "* __covars__ _(optional)_ \n", " A list of your covariates. This will be used to specify which columns of __regional_measure__ you wish to correct for. \n", "* __centroids__\n", " A list of tuples representing the cartesian coordinates of brain regions. This list should be in the same order as the list of brain regions to accurately assign coordinates to regions. The coordinates are expected to obey the convention the the x=0 plane is the same plane that separates the left and right hemispheres of the brain. " ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": true }, "outputs": [], "source": [ "# Read in sample data from the NSPN WhitakerVertes PNAS 2016 paper.\n", "df, names, covars, centroids = datasets.NSPN_WhitakerVertes_PNAS2016.import_data()" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
Unnamed: 0nspn_idocccentrestudy_primaryage_scansexmaleage_binmri_centre...rh_supramarginal_part5rh_supramarginal_part6rh_supramarginal_part7rh_frontalpole_part1rh_temporalpole_part1rh_transversetemporal_part1rh_insula_part1rh_insula_part2rh_insula_part3rh_insula_part4
00103560Cambridge2K_Cohort20.761Female0.04WBIC...2.5922.8412.3182.4863.5262.6383.3082.5833.1883.089
11107020Cambridge2K_Cohort16.055Male1.02WBIC...3.4483.2832.7403.2254.0443.0403.8672.9433.4783.609
22107360Cambridge2K_Cohort14.897Female0.01WBIC...3.5263.2693.0763.1333.9002.9143.8942.8983.7203.580
33107780Cambridge2K_Cohort20.022Female0.04WBIC...2.8302.9172.6472.7963.4013.0453.1382.7392.8333.349
44107940Cambridge2K_Cohort14.656Female0.01WBIC...2.6893.2942.8202.5392.1512.7342.7912.9353.5383.403
\n", "

5 rows × 324 columns

\n", "
" ], "text/plain": [ " Unnamed: 0 nspn_id occ centre study_primary age_scan sex male \\\n", "0 0 10356 0 Cambridge 2K_Cohort 20.761 Female 0.0 \n", "1 1 10702 0 Cambridge 2K_Cohort 16.055 Male 1.0 \n", "2 2 10736 0 Cambridge 2K_Cohort 14.897 Female 0.0 \n", "3 3 10778 0 Cambridge 2K_Cohort 20.022 Female 0.0 \n", "4 4 10794 0 Cambridge 2K_Cohort 14.656 Female 0.0 \n", "\n", " age_bin mri_centre ... rh_supramarginal_part5 \\\n", "0 4 WBIC ... 2.592 \n", "1 2 WBIC ... 3.448 \n", "2 1 WBIC ... 3.526 \n", "3 4 WBIC ... 2.830 \n", "4 1 WBIC ... 2.689 \n", "\n", " rh_supramarginal_part6 rh_supramarginal_part7 rh_frontalpole_part1 \\\n", "0 2.841 2.318 2.486 \n", "1 3.283 2.740 3.225 \n", "2 3.269 3.076 3.133 \n", "3 2.917 2.647 2.796 \n", "4 3.294 2.820 2.539 \n", "\n", " rh_temporalpole_part1 rh_transversetemporal_part1 rh_insula_part1 \\\n", "0 3.526 2.638 3.308 \n", "1 4.044 3.040 3.867 \n", "2 3.900 2.914 3.894 \n", "3 3.401 3.045 3.138 \n", "4 2.151 2.734 2.791 \n", "\n", " rh_insula_part2 rh_insula_part3 rh_insula_part4 \n", "0 2.583 3.188 3.089 \n", "1 2.943 3.478 3.609 \n", "2 2.898 3.720 3.580 \n", "3 2.739 2.833 3.349 \n", "4 2.935 3.538 3.403 \n", "\n", "[5 rows x 324 columns]" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "df.head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Create a correlation matrix\n", "We calculate residuals of the matrix df for the columns of names, correcting for the columns in covars." ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": true }, "outputs": [], "source": [ "df_res = scn.create_residuals_df(df, names, covars)" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", 
" \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
lh_bankssts_part1lh_bankssts_part2lh_caudalanteriorcingulate_part1lh_caudalmiddlefrontal_part1lh_caudalmiddlefrontal_part2lh_caudalmiddlefrontal_part3lh_caudalmiddlefrontal_part4lh_cuneus_part1lh_cuneus_part2lh_entorhinal_part1...rh_supramarginal_part5rh_supramarginal_part6rh_supramarginal_part7rh_frontalpole_part1rh_temporalpole_part1rh_transversetemporal_part1rh_insula_part1rh_insula_part2rh_insula_part3rh_insula_part4
0-0.016677-0.0362530.035892-0.0045860.040114-0.005168-0.252694-0.170037-0.192492-0.119414...-0.421125-0.123926-0.37903-0.436609-0.143411-0.103404-0.321707-0.159219-0.19032-0.393141
10.2803230.3547470.4828920.6974140.4061140.4458320.3903060.3269630.3895080.340586...0.4348750.3180740.042970.3023910.3745890.2985960.2372930.2007810.099680.126859
20.1683230.0817470.3658920.4124140.2831140.1878320.576306-0.061037-0.0624920.374586...0.5128750.3040740.378970.2103910.2305890.1725960.2642930.1557810.341680.097859
3-0.086677-0.249253-0.404108-0.362586-0.046886-0.154168-0.1566940.029963-0.246492-0.009414...-0.183125-0.047926-0.05003-0.126609-0.2684110.303596-0.491707-0.003219-0.54532-0.133141
40.4003230.1367470.125892-0.218586-1.003886-0.142168-0.752694-0.091037-0.550492-1.223414...-0.3241250.3290740.12297-0.383609-1.518411-0.007404-0.8387070.1927810.15968-0.079141
50.010323-0.090253-0.248108-0.131586-0.080886-0.110168-0.1426940.146963-0.0474920.155586...-0.097125-0.074926-0.12803-0.114609-0.2414110.2095960.3262930.0827810.268680.099859
6-0.1186770.1337470.0898920.0474140.2681140.017832-0.0376940.0619630.028508-0.214414...0.0808750.0270740.00797-0.0676090.310589-0.107404-0.033707-0.015219-0.091320.048859
70.1063230.0437470.3558920.1744140.2491140.068832-0.0176940.1209630.0955080.333586...0.4278750.1340740.25697-0.229609-0.2144110.329596-0.3557070.015781-0.02732-0.118141
80.4333230.0377470.1918920.1034140.206114-0.0911680.210306-0.1050370.0095080.388586...0.1038750.1820740.31697-0.0836090.1765890.0625960.4042930.024781-0.059320.041859
90.1913230.0817470.0938920.1004140.0121140.1808320.123306-0.062037-0.180492-0.323414...-0.197125-0.022926-0.171030.5273910.0045890.1235960.413293-0.185219-0.216320.065859
10-1.036677-0.759253-0.6211080.058414-0.2988860.005832-0.4196940.3489630.232508-0.997414...-0.4631250.1530740.01097-0.540609-0.1494110.290596-0.0787070.0737810.300680.333859
110.1833230.155747-0.1001080.186414-0.0368860.2068320.1543060.020963-0.0534920.004586...0.149875-0.113926-0.112030.0153910.2335890.134596-0.206707-0.039219-0.351320.220859
12-0.058677-0.222253-0.2301080.3734140.2191140.2348320.0183060.0269630.0905080.301586...0.1028750.059074-0.052030.061391-0.1934110.1275960.292293-0.0272190.229680.481859
130.200323-0.0482530.032892-0.1065860.048114-0.502168-0.1086940.127963-0.104492-0.043414...0.6348750.1500740.307970.413391-0.4634110.3885960.2162930.0747810.235680.192859
140.2213230.3747470.107892-0.1155860.152114-0.0891680.0323060.1809630.287508-0.422414...-0.086125-0.217926-0.060030.1983910.1715890.0265960.1242930.050781-0.02332-0.113141
150.075323-0.141253-0.1611080.162414-0.100886-0.026168-0.1456940.209963-0.070492-0.518414...0.2668750.019074-0.212030.296391-0.047411-0.364404-0.3807070.2107810.230680.030859
16-0.0476770.226747-0.0671080.0194140.165114-0.168168-0.124694-0.312037-0.0354920.112586...0.056875-0.091926-0.031030.3013910.3925890.1155960.0272930.2347810.111680.111859
170.055323-0.0552530.164892-0.322586-0.029886-0.139168-0.103694-0.0040370.0765080.069586...-0.157125-0.1179260.03697-0.2866090.2435890.1625960.0452930.1487810.221680.050859
18-0.3276770.128747-0.2601080.251414-0.1008860.145832-0.0916940.0029630.0385080.231586...0.2618750.3410740.207970.0313910.052589-0.2044040.224293-0.111219-0.085320.079859
19-0.012677-0.0902530.499892-0.0715860.1111140.0238320.103306-0.2780370.0535080.236586...-0.0031250.033074-0.08003-0.0836090.0695890.0855960.416293-0.0492190.272680.344859
200.052323-0.0962530.879892-0.218586-0.087886-0.296168-0.231694-0.080037-0.212492-0.596414...0.255875-0.167926-0.181030.095391-0.579411-0.3604040.013293-0.129219-0.256320.101859
21-0.028677-0.236253-0.091108-0.280586-0.182886-0.3201680.0363060.1039630.198508-0.637414...-0.242125-0.060926-0.35703-0.273609-0.645411-0.026404-0.2407070.0467810.162680.029859
220.2353230.1027470.2358920.1034140.131114-0.0831680.1643060.3119630.240508-0.716414...0.3748750.073074-0.163030.3033910.0115890.2075960.2382930.056781-0.277320.288859
230.1203230.066747-0.1441080.1954140.0971140.1228320.1393060.1669630.0335080.176586...0.3218750.009074-0.230030.3203910.7495890.3005960.0692930.1037810.02568-0.048141
240.0843230.094747-0.203108-0.025586-0.0428860.1278320.3323060.0059630.178508-0.394414...0.3568750.0790740.105970.071391-1.2644110.001596-0.204707-0.084219-0.05932-0.320141
250.0983230.0307470.3048920.1604140.1971140.2758320.1153060.1859630.2225080.053586...-0.1411250.200074-0.050030.1623910.155589-0.1534040.4032930.0267810.330680.282859
260.024323-0.1122530.3358920.3324140.2851140.3488320.3333060.2149630.1205080.118586...0.2638750.1270740.173970.1913910.2495890.2145960.0292930.2337810.141680.342859
270.215323-0.008253-0.0411080.0834140.159114-0.0891680.3003060.1569630.0885080.425586...-0.0981250.2800740.281970.0003910.8085890.006596-0.0367070.3317810.16468-0.137141
28-0.170677-0.324253-0.035108-0.098586-0.1788860.071832-0.251694-0.172037-0.1094920.553586...-0.1191250.0940740.13297-0.610609-0.0634110.0015960.1412930.1067810.479680.154859
29-0.102677-0.035253-0.1171080.3184140.1141140.0358320.131306-0.1080370.0295080.241586...0.008875-0.023926-0.34803-0.1586090.240589-0.3924040.222293-0.063219-0.24132-0.042141
..................................................................
267-0.604677-0.283253-0.478108-0.215586-0.244886-0.302168-0.239694-0.234037-0.143492-0.014414...-0.118125-0.104926-0.22403-0.305609-0.048411-0.260404-0.130707-0.320219-0.17832-0.153141
268-0.090677-0.174253-0.005108-0.201586-0.215886-0.309168-0.170694-0.136037-0.316492-0.344414...0.110875-0.069926-0.21103-0.029609-0.027411-0.401404-0.1517070.0147810.04168-0.192141
269-0.357677-0.1192530.010892-0.0545860.0061140.097832-0.164694-0.214037-0.3634920.059586...-0.299125-0.240926-0.48403-0.392609-0.081411-0.217404-0.334707-0.1902190.187680.153859
2700.1663230.2257470.3578920.109414-0.0178860.1098320.233306-0.187037-0.090492-0.414414...0.089875-0.1499260.093970.096391-0.332411-0.0224040.0402930.0667810.147680.005859
271-0.0566770.0327470.074892-0.0725860.1401140.0238320.040306-0.314037-0.106492-0.014414...-0.155125-0.173926-0.160030.0533910.0915890.1075960.3122930.226781-0.02432-0.068141
272-0.159677-0.130253-0.410108-0.154586-0.204886-0.074168-0.299694-0.113037-0.4174920.117586...-0.390125-0.220926-0.15603-0.479609-0.070411-0.258404-0.035707-0.082219-0.24732-0.106141
273-0.044677-0.066253-0.0541080.2894140.0551140.1438320.033306-0.118037-0.057492-0.447414...-0.160125-0.059926-0.21903-0.042609-0.973411-0.063404-0.1177070.0057810.16668-0.132141
2740.5913230.2047470.9648920.0834140.2471140.0608320.2433060.2699630.017508-0.211414...0.078875-0.030926-0.10203-0.0026090.3775890.035596-0.2977070.018781-0.000320.275859
2750.2903230.1047470.129892-0.1125860.0081140.0158320.0173060.1649630.0925080.136586...0.151875-0.0449260.03697-0.1316090.2055890.0245960.012293-0.011219-0.047320.044859
276-0.197677-0.123253-0.140108-0.063586-0.268886-0.137168-0.176694-0.029037-0.158492-1.103414...-0.183125-0.317926-0.23103-0.011609-0.146411-0.106404-0.307707-0.149219-0.46132-0.449141
277-0.115677-0.187253-0.1631080.037414-0.1778860.092832-0.1586940.042963-0.0524920.311586...-0.040125-0.1449260.04497-0.0746090.241589-0.0984040.155293-0.269219-0.115320.224859
278-0.0206770.0377470.184892-0.2315860.115114-0.2161680.078306-0.1190370.0265080.500586...0.208875-0.269926-0.075030.1093910.752589-0.303404-0.075707-0.149219-0.27532-0.155141
279-0.197677-0.128253-0.1231080.2214140.141114-0.0521680.001306-0.256037-0.088492-0.018414...-0.1791250.056074-0.25603-0.3126090.540589-0.2814040.288293-0.092219-0.173320.202859
2800.0403230.115747-0.322108-0.1635860.034114-0.075168-0.051694-0.0720370.161508-0.070414...0.193875-0.043926-0.00603-0.0256090.1295890.148596-0.181707-0.1662190.119680.239859
281-0.081677-0.254253-0.144108-0.035586-0.2218860.103832-0.140694-0.201037-0.285492-0.566414...0.021875-0.259926-0.06303-0.563609-0.486411-0.4034040.208293-0.200219-0.38132-0.004141
2820.3043230.2757470.0508920.406414-0.2608860.197832-0.0606940.0719630.2725080.086586...-0.065125-0.018926-0.05403-0.079609-0.3454110.0035960.1902930.157781-0.07332-0.016141
283-0.305677-0.238253-0.178108-0.360586-0.077886-0.201168-0.061694-0.185037-0.128492-0.393414...0.1828750.4550740.07497-0.132609-0.289411-0.1274040.108293-0.408219-0.02232-0.117141
284-0.027677-0.1582530.325892-0.156586-0.180886-0.258168-0.495694-0.088037-0.003492-0.279414...-0.426125-0.618926-0.58403-0.0956090.2345890.208596-0.040707-0.3102190.08468-0.363141
285-0.216677-0.194253-0.189108-0.320586-0.349886-0.364168-0.040694-0.165037-0.418492-0.879414...0.2868750.135074-0.15703-0.095609-0.6834110.179596-0.602707-0.162219-0.05432-0.057141
286-0.073677-0.0382530.1998920.1694140.149114-0.143168-0.025694-0.1270370.317508-0.432414...0.125875-0.206926-0.10503-0.109609-0.4894110.1105960.186293-0.040219-0.093320.113859
2870.136323-0.043253-0.317108-0.010586-0.026886-0.0011680.083306-0.012037-0.007492-0.286414...0.0498750.157074-0.143030.1713910.224589-0.2214040.475293-0.159219-0.05332-0.012141
288-0.147677-0.0592530.028892-0.108586-0.0908860.235832-0.1016940.1779630.3385080.024586...-0.1631250.239074-0.006030.1873910.0665890.212596-0.5257070.0037810.26568-0.061141
289-0.2096770.0447470.043892-0.130586-0.178886-0.134168-0.152694-0.034037-0.0114920.174586...0.039875-0.199926-0.02903-0.0806090.426589-0.293404-0.201707-0.118219-0.17732-0.079141
290-0.4036770.044747-0.360108-0.476586-0.218886-0.0991680.142306-0.432037-0.2744920.156586...-0.027125-0.110926-0.178030.0673910.109589-0.089404-0.283707-0.212219-0.072320.050859
291-0.053677-0.0792530.1228920.079414-0.1238860.2248320.0183060.1449630.0005080.409586...-0.1141250.290074-0.06003-0.3316090.1945890.309596-0.2967070.5597810.36268-0.069141
292-0.065677-0.112253-0.013108-0.2335860.178114-0.049168-0.1036940.0249630.088508-0.169414...-0.417125-0.1989260.018970.4303910.0055890.1265960.3272930.2507810.184680.247859
2930.0143230.237747-0.0931080.1004140.068114-0.174168-0.1136940.0139630.121508-0.030414...0.227875-0.383926-0.027030.099391-0.410411-0.010404-0.0757070.1817810.297680.126859
2940.0663230.071747-0.301108-0.0215860.082114-0.131168-0.339694-0.174037-0.3324920.478586...0.3298750.0220740.23197-0.3356090.068589-0.0074040.1652930.018781-0.209320.309859
2950.053323-0.2692530.174892-0.041586-0.318886-0.093168-0.1996940.185963-0.2724920.304586...-0.215125-0.4669260.036970.109391-0.486411-0.272404-0.329707-0.2042190.12468-0.165141
296-0.2066770.0447470.205892-0.1595860.087114-0.107168-0.1916940.220963-0.106492-0.066414...-0.0221250.2800740.143970.2253910.3125890.3665960.0282930.1787810.47468-0.205141
\n", "

297 rows × 308 columns

\n", "
" ], "text/plain": [ " lh_bankssts_part1 lh_bankssts_part2 lh_caudalanteriorcingulate_part1 \\\n", "0 -0.016677 -0.036253 0.035892 \n", "1 0.280323 0.354747 0.482892 \n", "2 0.168323 0.081747 0.365892 \n", "3 -0.086677 -0.249253 -0.404108 \n", "4 0.400323 0.136747 0.125892 \n", "5 0.010323 -0.090253 -0.248108 \n", "6 -0.118677 0.133747 0.089892 \n", "7 0.106323 0.043747 0.355892 \n", "8 0.433323 0.037747 0.191892 \n", "9 0.191323 0.081747 0.093892 \n", "10 -1.036677 -0.759253 -0.621108 \n", "11 0.183323 0.155747 -0.100108 \n", "12 -0.058677 -0.222253 -0.230108 \n", "13 0.200323 -0.048253 0.032892 \n", "14 0.221323 0.374747 0.107892 \n", "15 0.075323 -0.141253 -0.161108 \n", "16 -0.047677 0.226747 -0.067108 \n", "17 0.055323 -0.055253 0.164892 \n", "18 -0.327677 0.128747 -0.260108 \n", "19 -0.012677 -0.090253 0.499892 \n", "20 0.052323 -0.096253 0.879892 \n", "21 -0.028677 -0.236253 -0.091108 \n", "22 0.235323 0.102747 0.235892 \n", "23 0.120323 0.066747 -0.144108 \n", "24 0.084323 0.094747 -0.203108 \n", "25 0.098323 0.030747 0.304892 \n", "26 0.024323 -0.112253 0.335892 \n", "27 0.215323 -0.008253 -0.041108 \n", "28 -0.170677 -0.324253 -0.035108 \n", "29 -0.102677 -0.035253 -0.117108 \n", ".. ... ... ... \n", "267 -0.604677 -0.283253 -0.478108 \n", "268 -0.090677 -0.174253 -0.005108 \n", "269 -0.357677 -0.119253 0.010892 \n", "270 0.166323 0.225747 0.357892 \n", "271 -0.056677 0.032747 0.074892 \n", "272 -0.159677 -0.130253 -0.410108 \n", "273 -0.044677 -0.066253 -0.054108 \n", "274 0.591323 0.204747 0.964892 \n", "275 0.290323 0.104747 0.129892 \n", "276 -0.197677 -0.123253 -0.140108 \n", "277 -0.115677 -0.187253 -0.163108 \n", "278 -0.020677 0.037747 0.184892 \n", "279 -0.197677 -0.128253 -0.123108 \n", "280 0.040323 0.115747 -0.322108 \n", "281 -0.081677 -0.254253 -0.144108 \n", "282 0.304323 0.275747 0.050892 \n", "283 -0.305677 -0.238253 -0.178108 \n", "284 -0.027677 -0.158253 0.325892 \n", "285 -0.216677 -0.194253 -0.189108 \n", "286 -0.073677 -0.038253 0.199892 \n", "287 0.136323 -0.043253 -0.317108 \n", "288 -0.147677 -0.059253 0.028892 \n", "289 -0.209677 0.044747 0.043892 \n", "290 -0.403677 0.044747 -0.360108 \n", "291 -0.053677 -0.079253 0.122892 \n", "292 -0.065677 -0.112253 -0.013108 \n", "293 0.014323 0.237747 -0.093108 \n", "294 0.066323 0.071747 -0.301108 \n", "295 0.053323 -0.269253 0.174892 \n", "296 -0.206677 0.044747 0.205892 \n", "\n", " lh_caudalmiddlefrontal_part1 lh_caudalmiddlefrontal_part2 \\\n", "0 -0.004586 0.040114 \n", "1 0.697414 0.406114 \n", "2 0.412414 0.283114 \n", "3 -0.362586 -0.046886 \n", "4 -0.218586 -1.003886 \n", "5 -0.131586 -0.080886 \n", "6 0.047414 0.268114 \n", "7 0.174414 0.249114 \n", "8 0.103414 0.206114 \n", "9 0.100414 0.012114 \n", "10 0.058414 -0.298886 \n", "11 0.186414 -0.036886 \n", "12 0.373414 0.219114 \n", "13 -0.106586 0.048114 \n", "14 -0.115586 0.152114 \n", "15 0.162414 -0.100886 \n", "16 0.019414 0.165114 \n", "17 -0.322586 -0.029886 \n", "18 0.251414 -0.100886 \n", "19 -0.071586 0.111114 \n", "20 -0.218586 -0.087886 \n", "21 -0.280586 -0.182886 \n", "22 0.103414 0.131114 \n", "23 0.195414 0.097114 \n", "24 -0.025586 -0.042886 \n", "25 0.160414 0.197114 \n", "26 0.332414 0.285114 \n", "27 0.083414 0.159114 \n", "28 -0.098586 -0.178886 \n", "29 0.318414 0.114114 \n", ".. ... ... 
\n", "267 -0.215586 -0.244886 \n", "268 -0.201586 -0.215886 \n", "269 -0.054586 0.006114 \n", "270 0.109414 -0.017886 \n", "271 -0.072586 0.140114 \n", "272 -0.154586 -0.204886 \n", "273 0.289414 0.055114 \n", "274 0.083414 0.247114 \n", "275 -0.112586 0.008114 \n", "276 -0.063586 -0.268886 \n", "277 0.037414 -0.177886 \n", "278 -0.231586 0.115114 \n", "279 0.221414 0.141114 \n", "280 -0.163586 0.034114 \n", "281 -0.035586 -0.221886 \n", "282 0.406414 -0.260886 \n", "283 -0.360586 -0.077886 \n", "284 -0.156586 -0.180886 \n", "285 -0.320586 -0.349886 \n", "286 0.169414 0.149114 \n", "287 -0.010586 -0.026886 \n", "288 -0.108586 -0.090886 \n", "289 -0.130586 -0.178886 \n", "290 -0.476586 -0.218886 \n", "291 0.079414 -0.123886 \n", "292 -0.233586 0.178114 \n", "293 0.100414 0.068114 \n", "294 -0.021586 0.082114 \n", "295 -0.041586 -0.318886 \n", "296 -0.159586 0.087114 \n", "\n", " lh_caudalmiddlefrontal_part3 lh_caudalmiddlefrontal_part4 \\\n", "0 -0.005168 -0.252694 \n", "1 0.445832 0.390306 \n", "2 0.187832 0.576306 \n", "3 -0.154168 -0.156694 \n", "4 -0.142168 -0.752694 \n", "5 -0.110168 -0.142694 \n", "6 0.017832 -0.037694 \n", "7 0.068832 -0.017694 \n", "8 -0.091168 0.210306 \n", "9 0.180832 0.123306 \n", "10 0.005832 -0.419694 \n", "11 0.206832 0.154306 \n", "12 0.234832 0.018306 \n", "13 -0.502168 -0.108694 \n", "14 -0.089168 0.032306 \n", "15 -0.026168 -0.145694 \n", "16 -0.168168 -0.124694 \n", "17 -0.139168 -0.103694 \n", "18 0.145832 -0.091694 \n", "19 0.023832 0.103306 \n", "20 -0.296168 -0.231694 \n", "21 -0.320168 0.036306 \n", "22 -0.083168 0.164306 \n", "23 0.122832 0.139306 \n", "24 0.127832 0.332306 \n", "25 0.275832 0.115306 \n", "26 0.348832 0.333306 \n", "27 -0.089168 0.300306 \n", "28 0.071832 -0.251694 \n", "29 0.035832 0.131306 \n", ".. ... ... \n", "267 -0.302168 -0.239694 \n", "268 -0.309168 -0.170694 \n", "269 0.097832 -0.164694 \n", "270 0.109832 0.233306 \n", "271 0.023832 0.040306 \n", "272 -0.074168 -0.299694 \n", "273 0.143832 0.033306 \n", "274 0.060832 0.243306 \n", "275 0.015832 0.017306 \n", "276 -0.137168 -0.176694 \n", "277 0.092832 -0.158694 \n", "278 -0.216168 0.078306 \n", "279 -0.052168 0.001306 \n", "280 -0.075168 -0.051694 \n", "281 0.103832 -0.140694 \n", "282 0.197832 -0.060694 \n", "283 -0.201168 -0.061694 \n", "284 -0.258168 -0.495694 \n", "285 -0.364168 -0.040694 \n", "286 -0.143168 -0.025694 \n", "287 -0.001168 0.083306 \n", "288 0.235832 -0.101694 \n", "289 -0.134168 -0.152694 \n", "290 -0.099168 0.142306 \n", "291 0.224832 0.018306 \n", "292 -0.049168 -0.103694 \n", "293 -0.174168 -0.113694 \n", "294 -0.131168 -0.339694 \n", "295 -0.093168 -0.199694 \n", "296 -0.107168 -0.191694 \n", "\n", " lh_cuneus_part1 lh_cuneus_part2 lh_entorhinal_part1 ... \\\n", "0 -0.170037 -0.192492 -0.119414 ... \n", "1 0.326963 0.389508 0.340586 ... \n", "2 -0.061037 -0.062492 0.374586 ... \n", "3 0.029963 -0.246492 -0.009414 ... \n", "4 -0.091037 -0.550492 -1.223414 ... \n", "5 0.146963 -0.047492 0.155586 ... \n", "6 0.061963 0.028508 -0.214414 ... \n", "7 0.120963 0.095508 0.333586 ... \n", "8 -0.105037 0.009508 0.388586 ... \n", "9 -0.062037 -0.180492 -0.323414 ... \n", "10 0.348963 0.232508 -0.997414 ... \n", "11 0.020963 -0.053492 0.004586 ... \n", "12 0.026963 0.090508 0.301586 ... \n", "13 0.127963 -0.104492 -0.043414 ... \n", "14 0.180963 0.287508 -0.422414 ... \n", "15 0.209963 -0.070492 -0.518414 ... \n", "16 -0.312037 -0.035492 0.112586 ... \n", "17 -0.004037 0.076508 0.069586 ... \n", "18 0.002963 0.038508 0.231586 ... 
\n", "19 -0.278037 0.053508 0.236586 ... \n", "20 -0.080037 -0.212492 -0.596414 ... \n", "21 0.103963 0.198508 -0.637414 ... \n", "22 0.311963 0.240508 -0.716414 ... \n", "23 0.166963 0.033508 0.176586 ... \n", "24 0.005963 0.178508 -0.394414 ... \n", "25 0.185963 0.222508 0.053586 ... \n", "26 0.214963 0.120508 0.118586 ... \n", "27 0.156963 0.088508 0.425586 ... \n", "28 -0.172037 -0.109492 0.553586 ... \n", "29 -0.108037 0.029508 0.241586 ... \n", ".. ... ... ... ... \n", "267 -0.234037 -0.143492 -0.014414 ... \n", "268 -0.136037 -0.316492 -0.344414 ... \n", "269 -0.214037 -0.363492 0.059586 ... \n", "270 -0.187037 -0.090492 -0.414414 ... \n", "271 -0.314037 -0.106492 -0.014414 ... \n", "272 -0.113037 -0.417492 0.117586 ... \n", "273 -0.118037 -0.057492 -0.447414 ... \n", "274 0.269963 0.017508 -0.211414 ... \n", "275 0.164963 0.092508 0.136586 ... \n", "276 -0.029037 -0.158492 -1.103414 ... \n", "277 0.042963 -0.052492 0.311586 ... \n", "278 -0.119037 0.026508 0.500586 ... \n", "279 -0.256037 -0.088492 -0.018414 ... \n", "280 -0.072037 0.161508 -0.070414 ... \n", "281 -0.201037 -0.285492 -0.566414 ... \n", "282 0.071963 0.272508 0.086586 ... \n", "283 -0.185037 -0.128492 -0.393414 ... \n", "284 -0.088037 -0.003492 -0.279414 ... \n", "285 -0.165037 -0.418492 -0.879414 ... \n", "286 -0.127037 0.317508 -0.432414 ... \n", "287 -0.012037 -0.007492 -0.286414 ... \n", "288 0.177963 0.338508 0.024586 ... \n", "289 -0.034037 -0.011492 0.174586 ... \n", "290 -0.432037 -0.274492 0.156586 ... \n", "291 0.144963 0.000508 0.409586 ... \n", "292 0.024963 0.088508 -0.169414 ... \n", "293 0.013963 0.121508 -0.030414 ... \n", "294 -0.174037 -0.332492 0.478586 ... \n", "295 0.185963 -0.272492 0.304586 ... \n", "296 0.220963 -0.106492 -0.066414 ... \n", "\n", " rh_supramarginal_part5 rh_supramarginal_part6 rh_supramarginal_part7 \\\n", "0 -0.421125 -0.123926 -0.37903 \n", "1 0.434875 0.318074 0.04297 \n", "2 0.512875 0.304074 0.37897 \n", "3 -0.183125 -0.047926 -0.05003 \n", "4 -0.324125 0.329074 0.12297 \n", "5 -0.097125 -0.074926 -0.12803 \n", "6 0.080875 0.027074 0.00797 \n", "7 0.427875 0.134074 0.25697 \n", "8 0.103875 0.182074 0.31697 \n", "9 -0.197125 -0.022926 -0.17103 \n", "10 -0.463125 0.153074 0.01097 \n", "11 0.149875 -0.113926 -0.11203 \n", "12 0.102875 0.059074 -0.05203 \n", "13 0.634875 0.150074 0.30797 \n", "14 -0.086125 -0.217926 -0.06003 \n", "15 0.266875 0.019074 -0.21203 \n", "16 0.056875 -0.091926 -0.03103 \n", "17 -0.157125 -0.117926 0.03697 \n", "18 0.261875 0.341074 0.20797 \n", "19 -0.003125 0.033074 -0.08003 \n", "20 0.255875 -0.167926 -0.18103 \n", "21 -0.242125 -0.060926 -0.35703 \n", "22 0.374875 0.073074 -0.16303 \n", "23 0.321875 0.009074 -0.23003 \n", "24 0.356875 0.079074 0.10597 \n", "25 -0.141125 0.200074 -0.05003 \n", "26 0.263875 0.127074 0.17397 \n", "27 -0.098125 0.280074 0.28197 \n", "28 -0.119125 0.094074 0.13297 \n", "29 0.008875 -0.023926 -0.34803 \n", ".. ... ... ... 
\n", "267 -0.118125 -0.104926 -0.22403 \n", "268 0.110875 -0.069926 -0.21103 \n", "269 -0.299125 -0.240926 -0.48403 \n", "270 0.089875 -0.149926 0.09397 \n", "271 -0.155125 -0.173926 -0.16003 \n", "272 -0.390125 -0.220926 -0.15603 \n", "273 -0.160125 -0.059926 -0.21903 \n", "274 0.078875 -0.030926 -0.10203 \n", "275 0.151875 -0.044926 0.03697 \n", "276 -0.183125 -0.317926 -0.23103 \n", "277 -0.040125 -0.144926 0.04497 \n", "278 0.208875 -0.269926 -0.07503 \n", "279 -0.179125 0.056074 -0.25603 \n", "280 0.193875 -0.043926 -0.00603 \n", "281 0.021875 -0.259926 -0.06303 \n", "282 -0.065125 -0.018926 -0.05403 \n", "283 0.182875 0.455074 0.07497 \n", "284 -0.426125 -0.618926 -0.58403 \n", "285 0.286875 0.135074 -0.15703 \n", "286 0.125875 -0.206926 -0.10503 \n", "287 0.049875 0.157074 -0.14303 \n", "288 -0.163125 0.239074 -0.00603 \n", "289 0.039875 -0.199926 -0.02903 \n", "290 -0.027125 -0.110926 -0.17803 \n", "291 -0.114125 0.290074 -0.06003 \n", "292 -0.417125 -0.198926 0.01897 \n", "293 0.227875 -0.383926 -0.02703 \n", "294 0.329875 0.022074 0.23197 \n", "295 -0.215125 -0.466926 0.03697 \n", "296 -0.022125 0.280074 0.14397 \n", "\n", " rh_frontalpole_part1 rh_temporalpole_part1 rh_transversetemporal_part1 \\\n", "0 -0.436609 -0.143411 -0.103404 \n", "1 0.302391 0.374589 0.298596 \n", "2 0.210391 0.230589 0.172596 \n", "3 -0.126609 -0.268411 0.303596 \n", "4 -0.383609 -1.518411 -0.007404 \n", "5 -0.114609 -0.241411 0.209596 \n", "6 -0.067609 0.310589 -0.107404 \n", "7 -0.229609 -0.214411 0.329596 \n", "8 -0.083609 0.176589 0.062596 \n", "9 0.527391 0.004589 0.123596 \n", "10 -0.540609 -0.149411 0.290596 \n", "11 0.015391 0.233589 0.134596 \n", "12 0.061391 -0.193411 0.127596 \n", "13 0.413391 -0.463411 0.388596 \n", "14 0.198391 0.171589 0.026596 \n", "15 0.296391 -0.047411 -0.364404 \n", "16 0.301391 0.392589 0.115596 \n", "17 -0.286609 0.243589 0.162596 \n", "18 0.031391 0.052589 -0.204404 \n", "19 -0.083609 0.069589 0.085596 \n", "20 0.095391 -0.579411 -0.360404 \n", "21 -0.273609 -0.645411 -0.026404 \n", "22 0.303391 0.011589 0.207596 \n", "23 0.320391 0.749589 0.300596 \n", "24 0.071391 -1.264411 0.001596 \n", "25 0.162391 0.155589 -0.153404 \n", "26 0.191391 0.249589 0.214596 \n", "27 0.000391 0.808589 0.006596 \n", "28 -0.610609 -0.063411 0.001596 \n", "29 -0.158609 0.240589 -0.392404 \n", ".. ... ... ... 
\n", "267 -0.305609 -0.048411 -0.260404 \n", "268 -0.029609 -0.027411 -0.401404 \n", "269 -0.392609 -0.081411 -0.217404 \n", "270 0.096391 -0.332411 -0.022404 \n", "271 0.053391 0.091589 0.107596 \n", "272 -0.479609 -0.070411 -0.258404 \n", "273 -0.042609 -0.973411 -0.063404 \n", "274 -0.002609 0.377589 0.035596 \n", "275 -0.131609 0.205589 0.024596 \n", "276 -0.011609 -0.146411 -0.106404 \n", "277 -0.074609 0.241589 -0.098404 \n", "278 0.109391 0.752589 -0.303404 \n", "279 -0.312609 0.540589 -0.281404 \n", "280 -0.025609 0.129589 0.148596 \n", "281 -0.563609 -0.486411 -0.403404 \n", "282 -0.079609 -0.345411 0.003596 \n", "283 -0.132609 -0.289411 -0.127404 \n", "284 -0.095609 0.234589 0.208596 \n", "285 -0.095609 -0.683411 0.179596 \n", "286 -0.109609 -0.489411 0.110596 \n", "287 0.171391 0.224589 -0.221404 \n", "288 0.187391 0.066589 0.212596 \n", "289 -0.080609 0.426589 -0.293404 \n", "290 0.067391 0.109589 -0.089404 \n", "291 -0.331609 0.194589 0.309596 \n", "292 0.430391 0.005589 0.126596 \n", "293 0.099391 -0.410411 -0.010404 \n", "294 -0.335609 0.068589 -0.007404 \n", "295 0.109391 -0.486411 -0.272404 \n", "296 0.225391 0.312589 0.366596 \n", "\n", " rh_insula_part1 rh_insula_part2 rh_insula_part3 rh_insula_part4 \n", "0 -0.321707 -0.159219 -0.19032 -0.393141 \n", "1 0.237293 0.200781 0.09968 0.126859 \n", "2 0.264293 0.155781 0.34168 0.097859 \n", "3 -0.491707 -0.003219 -0.54532 -0.133141 \n", "4 -0.838707 0.192781 0.15968 -0.079141 \n", "5 0.326293 0.082781 0.26868 0.099859 \n", "6 -0.033707 -0.015219 -0.09132 0.048859 \n", "7 -0.355707 0.015781 -0.02732 -0.118141 \n", "8 0.404293 0.024781 -0.05932 0.041859 \n", "9 0.413293 -0.185219 -0.21632 0.065859 \n", "10 -0.078707 0.073781 0.30068 0.333859 \n", "11 -0.206707 -0.039219 -0.35132 0.220859 \n", "12 0.292293 -0.027219 0.22968 0.481859 \n", "13 0.216293 0.074781 0.23568 0.192859 \n", "14 0.124293 0.050781 -0.02332 -0.113141 \n", "15 -0.380707 0.210781 0.23068 0.030859 \n", "16 0.027293 0.234781 0.11168 0.111859 \n", "17 0.045293 0.148781 0.22168 0.050859 \n", "18 0.224293 -0.111219 -0.08532 0.079859 \n", "19 0.416293 -0.049219 0.27268 0.344859 \n", "20 0.013293 -0.129219 -0.25632 0.101859 \n", "21 -0.240707 0.046781 0.16268 0.029859 \n", "22 0.238293 0.056781 -0.27732 0.288859 \n", "23 0.069293 0.103781 0.02568 -0.048141 \n", "24 -0.204707 -0.084219 -0.05932 -0.320141 \n", "25 0.403293 0.026781 0.33068 0.282859 \n", "26 0.029293 0.233781 0.14168 0.342859 \n", "27 -0.036707 0.331781 0.16468 -0.137141 \n", "28 0.141293 0.106781 0.47968 0.154859 \n", "29 0.222293 -0.063219 -0.24132 -0.042141 \n", ".. ... ... ... ... 
\n", "267 -0.130707 -0.320219 -0.17832 -0.153141 \n", "268 -0.151707 0.014781 0.04168 -0.192141 \n", "269 -0.334707 -0.190219 0.18768 0.153859 \n", "270 0.040293 0.066781 0.14768 0.005859 \n", "271 0.312293 0.226781 -0.02432 -0.068141 \n", "272 -0.035707 -0.082219 -0.24732 -0.106141 \n", "273 -0.117707 0.005781 0.16668 -0.132141 \n", "274 -0.297707 0.018781 -0.00032 0.275859 \n", "275 0.012293 -0.011219 -0.04732 0.044859 \n", "276 -0.307707 -0.149219 -0.46132 -0.449141 \n", "277 0.155293 -0.269219 -0.11532 0.224859 \n", "278 -0.075707 -0.149219 -0.27532 -0.155141 \n", "279 0.288293 -0.092219 -0.17332 0.202859 \n", "280 -0.181707 -0.166219 0.11968 0.239859 \n", "281 0.208293 -0.200219 -0.38132 -0.004141 \n", "282 0.190293 0.157781 -0.07332 -0.016141 \n", "283 0.108293 -0.408219 -0.02232 -0.117141 \n", "284 -0.040707 -0.310219 0.08468 -0.363141 \n", "285 -0.602707 -0.162219 -0.05432 -0.057141 \n", "286 0.186293 -0.040219 -0.09332 0.113859 \n", "287 0.475293 -0.159219 -0.05332 -0.012141 \n", "288 -0.525707 0.003781 0.26568 -0.061141 \n", "289 -0.201707 -0.118219 -0.17732 -0.079141 \n", "290 -0.283707 -0.212219 -0.07232 0.050859 \n", "291 -0.296707 0.559781 0.36268 -0.069141 \n", "292 0.327293 0.250781 0.18468 0.247859 \n", "293 -0.075707 0.181781 0.29768 0.126859 \n", "294 0.165293 0.018781 -0.20932 0.309859 \n", "295 -0.329707 -0.204219 0.12468 -0.165141 \n", "296 0.028293 0.178781 0.47468 -0.205141 \n", "\n", "[297 rows x 308 columns]" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "df_res" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we create a correlation matrix over the columns of df_res" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": true, "scrolled": false }, "outputs": [], "source": [ "M = scn.create_corrmat(df_res, method='pearson')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create a weighted graph\n", "\n", "A short sidenote on the BrainNetwork class: This is a very lightweight subclass of the [`Networkx.Graph`](https://networkx.github.io/documentation/stable/reference/classes/graph.html) class. This means that any methods you can use on a `Networkx.Graph` object can also be used on a `BrainNetwork` object, although the reverse is not true. We have added various methods which allow us to keep track of measures that have already been calculated, which, especially later on when one is dealing with 10^3 random graphs, saves a lot of time. \n", "All scona measures are implemented in such a way that they can be used on a regular `Networkx.Graph` object. For example, instead of `G.threshold(10)` you can use `scn.threshold_graph(G, 10)`. \n", "Also you can create a `BrainNetwork` from a `Networkx.Graph` `G`, using `scn.BrainNetwork(network=G)`" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Initialise a weighted graph `G` from the correlation matrix `M`. The `parcellation` and `centroids` arguments are used to label nodes with names and coordinates respectively. " ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": true }, "outputs": [], "source": [ "G = scn.BrainNetwork(network=M, parcellation=names, centroids=centroids)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Threshold to create a binary graph\n", "\n", "We threshold G at cost 10 to create a binary graph with 10% as many edges as the complete graph G. Ordinarily when thresholding one takes the 10% of edges with the highest weight. 
In our case, because we want the resulting graph to be connected, we calculate a minimum spanning tree first. If you want to omit this step, you can pass the argument `mst=False` to `threshold`.\n", "The `threshold` method does not edit objects in place; it returns a new, thresholded graph." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": true }, "outputs": [], "source": [ "H = G.threshold(10)" ] },
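{ "cell_type": "markdown", "metadata": {}, "source": [ "Because the minimum spanning tree is kept, `H` should be connected, and it should contain roughly 10% as many edges as the complete graph `G`. As a quick sanity check (plain networkx, just for illustration):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# H is a BrainNetwork, which subclasses networkx.Graph, so networkx functions apply directly\n", "print(nx.is_connected(H))\n", "print(H.number_of_edges() / G.number_of_edges())" ] },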
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
centroidsnamexyz
0[-27.965157, -19.013702, 17.919528]lh_bankssts_part1-27.9652-19.013717.9195
1[-14.455663, -13.693461, 13.713674]lh_bankssts_part2-14.4557-13.693513.7137
2[-33.906934, -22.284672, -15.821168]lh_caudalanteriorcingulate_part1-33.9069-22.2847-15.8212
3[-17.305373, -53.431573, -36.017154]lh_caudalmiddlefrontal_part1-17.3054-53.4316-36.0172
4[-22.265823, -64.366296, -37.674831]lh_caudalmiddlefrontal_part2-22.2658-64.3663-37.6748
\n", "
" ], "text/plain": [ " centroids name \\\n", "0 [-27.965157, -19.013702, 17.919528] lh_bankssts_part1 \n", "1 [-14.455663, -13.693461, 13.713674] lh_bankssts_part2 \n", "2 [-33.906934, -22.284672, -15.821168] lh_caudalanteriorcingulate_part1 \n", "3 [-17.305373, -53.431573, -36.017154] lh_caudalmiddlefrontal_part1 \n", "4 [-22.265823, -64.366296, -37.674831] lh_caudalmiddlefrontal_part2 \n", "\n", " x y z \n", "0 -27.9652 -19.0137 17.9195 \n", "1 -14.4557 -13.6935 13.7137 \n", "2 -33.9069 -22.2847 -15.8212 \n", "3 -17.3054 -53.4316 -36.0172 \n", "4 -22.2658 -64.3663 -37.6748 " ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "H.report_nodal_measures().head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Use `calculate_nodal_measures` to fill in a bunch of nodal measures" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " Calculating participation coefficient - may take a little while\n" ] } ], "source": [ "H.calculate_nodal_measures()" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "scrolled": false }, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
betweennesscentroidsclosenessclusteringdegreemodulenameparticipation_coefficientshortest_path_lengthxyz
00.00824713[-27.965157, -19.013702, 17.919528]0.4959610.3358470lh_bankssts_part10.7170670.00824713-27.9652-19.013717.9195
10.0124798[-14.455663, -13.693461, 13.713674]0.5074380.278788550lh_bankssts_part20.8095870.0124798-14.4557-13.693513.7137
20[-33.906934, -22.284672, -15.821168]0.336254121lh_caudalanteriorcingulate_part10.750-33.9069-22.2847-15.8212
30.0120765[-17.305373, -53.431573, -36.017154]0.5256850.383485832lh_caudalmiddlefrontal_part10.4598640.0120765-17.3054-53.4316-36.0172
40.0292617[-22.265823, -64.366296, -37.674831]0.5491950.293617952lh_caudalmiddlefrontal_part20.6887530.0292617-22.2658-64.3663-37.6748
\n", "
" ], "text/plain": [ " betweenness centroids closeness clustering \\\n", "0 0.00824713 [-27.965157, -19.013702, 17.919528] 0.495961 0.3358 \n", "1 0.0124798 [-14.455663, -13.693461, 13.713674] 0.507438 0.278788 \n", "2 0 [-33.906934, -22.284672, -15.821168] 0.336254 1 \n", "3 0.0120765 [-17.305373, -53.431573, -36.017154] 0.525685 0.383485 \n", "4 0.0292617 [-22.265823, -64.366296, -37.674831] 0.549195 0.293617 \n", "\n", " degree module name participation_coefficient \\\n", "0 47 0 lh_bankssts_part1 0.717067 \n", "1 55 0 lh_bankssts_part2 0.809587 \n", "2 2 1 lh_caudalanteriorcingulate_part1 0.75 \n", "3 83 2 lh_caudalmiddlefrontal_part1 0.459864 \n", "4 95 2 lh_caudalmiddlefrontal_part2 0.688753 \n", "\n", " shortest_path_length x y z \n", "0 0.00824713 -27.9652 -19.0137 17.9195 \n", "1 0.0124798 -14.4557 -13.6935 13.7137 \n", "2 0 -33.9069 -22.2847 -15.8212 \n", "3 0.0120765 -17.3054 -53.4316 -36.0172 \n", "4 0.0292617 -22.2658 -64.3663 -37.6748 " ] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "H.report_nodal_measures().head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can also add measures as one might normally add nodal attributes to a networkx graph" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "collapsed": true }, "outputs": [], "source": [ "nx.set_node_attributes(H, name=\"hat\", values={x: x**2 for x in H.nodes})" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "These show up in our DataFrame too" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
degreehatname
0470lh_bankssts_part1
1551lh_bankssts_part2
224lh_caudalanteriorcingulate_part1
3839lh_caudalmiddlefrontal_part1
49516lh_caudalmiddlefrontal_part2
\n", "
" ], "text/plain": [ " degree hat name\n", "0 47 0 lh_bankssts_part1\n", "1 55 1 lh_bankssts_part2\n", "2 2 4 lh_caudalanteriorcingulate_part1\n", "3 83 9 lh_caudalmiddlefrontal_part1\n", "4 95 16 lh_caudalmiddlefrontal_part2" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "H.report_nodal_measures(columns=['name', 'degree', 'hat']).head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Calculate Global measures" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'assortativity': 0.09076922258276784,\n", " 'average_clustering': 0.4498887255891581,\n", " 'average_shortest_path_length': 2.376242649858285,\n", " 'efficiency': 0.47983958611582617,\n", " 'modularity': 0.3828553111606414}" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "H.calculate_global_measures()" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "scrolled": false }, "outputs": [], "source": [ "H.rich_club();" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Create a GraphBundle\n", "\n", "The `GraphBundle` object is the scona way to handle across network comparisons. What is it? Essentially it's a python dictionary with `BrainNetwork` objects as values. " ] }, { "cell_type": "code", "execution_count": 21, "metadata": { "collapsed": true }, "outputs": [], "source": [ "brain_bundle = scn.GraphBundle([H], ['NSPN_cost=10'])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This creates a dictionary-like object with BrainNetwork `H` keyed by `'NSPN_cost=10'`" ] }, { "cell_type": "code", "execution_count": 22, "metadata": { "scrolled": true }, "outputs": [ { "data": { "text/plain": [ "{'NSPN_cost=10': }" ] }, "execution_count": 22, "metadata": {}, "output_type": "execute_result" } ], "source": [ "brain_bundle" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now add a series of random_graphs created by edge swap randomisation of H (keyed by `'NSPN_cost=10'`)" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " Creating 10 random graphs - may take a little while\n" ] } ], "source": [ "# Note that 10 is not usually a sufficient number of random graphs to do meaningful analysis,\n", "# it is used here for time considerations\n", "brain_bundle.create_random_graphs('NSPN_cost=10', 10)" ] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "{'NSPN_cost=10': ,\n", " 'NSPN_cost=10_R1': ,\n", " 'NSPN_cost=10_R10': ,\n", " 'NSPN_cost=10_R2': ,\n", " 'NSPN_cost=10_R3': ,\n", " 'NSPN_cost=10_R4': ,\n", " 'NSPN_cost=10_R5': ,\n", " 'NSPN_cost=10_R6': ,\n", " 'NSPN_cost=10_R7': ,\n", " 'NSPN_cost=10_R8': ,\n", " 'NSPN_cost=10_R9': }" ] }, "execution_count": 24, "metadata": {}, "output_type": "execute_result" } ], "source": [ "brain_bundle" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Report on a GraphBundle\n", "\n", "The following method will calculate global measures ( if they have not already been calculated) for all of the graphs in `graph_bundle` and report the results in a DataFrame. We can do the same for rich club coefficients below." ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
assortativityaverage_clusteringaverage_shortest_path_lengthefficiencymodularity
NSPN_cost=100.0907690.4498892.3762430.4798400.382855
NSPN_cost=10_R1-0.0839930.2225712.0867210.5191470.000000
NSPN_cost=10_R10-0.0876520.2224622.0878210.5189180.000000
NSPN_cost=10_R2-0.0808970.2303842.0904230.5186150.000000
NSPN_cost=10_R3-0.1026260.2218942.0827450.5198030.000000
NSPN_cost=10_R4-0.0721220.2219242.0878420.5189020.000000
NSPN_cost=10_R5-0.0869520.2327062.0847540.5193570.000000
NSPN_cost=10_R6-0.0844380.2237272.0853250.5194400.000000
NSPN_cost=10_R7-0.0802850.2249302.0875250.5190030.000000
NSPN_cost=10_R8-0.0703320.2257302.0851350.5192810.000000
NSPN_cost=10_R9-0.0865100.2271592.0907820.5185270.000000
\n", "
" ], "text/plain": [ " assortativity average_clustering \\\n", "NSPN_cost=10 0.090769 0.449889 \n", "NSPN_cost=10_R1 -0.083993 0.222571 \n", "NSPN_cost=10_R10 -0.087652 0.222462 \n", "NSPN_cost=10_R2 -0.080897 0.230384 \n", "NSPN_cost=10_R3 -0.102626 0.221894 \n", "NSPN_cost=10_R4 -0.072122 0.221924 \n", "NSPN_cost=10_R5 -0.086952 0.232706 \n", "NSPN_cost=10_R6 -0.084438 0.223727 \n", "NSPN_cost=10_R7 -0.080285 0.224930 \n", "NSPN_cost=10_R8 -0.070332 0.225730 \n", "NSPN_cost=10_R9 -0.086510 0.227159 \n", "\n", " average_shortest_path_length efficiency modularity \n", "NSPN_cost=10 2.376243 0.479840 0.382855 \n", "NSPN_cost=10_R1 2.086721 0.519147 0.000000 \n", "NSPN_cost=10_R10 2.087821 0.518918 0.000000 \n", "NSPN_cost=10_R2 2.090423 0.518615 0.000000 \n", "NSPN_cost=10_R3 2.082745 0.519803 0.000000 \n", "NSPN_cost=10_R4 2.087842 0.518902 0.000000 \n", "NSPN_cost=10_R5 2.084754 0.519357 0.000000 \n", "NSPN_cost=10_R6 2.085325 0.519440 0.000000 \n", "NSPN_cost=10_R7 2.087525 0.519003 0.000000 \n", "NSPN_cost=10_R8 2.085135 0.519281 0.000000 \n", "NSPN_cost=10_R9 2.090782 0.518527 0.000000 " ] }, "execution_count": 25, "metadata": {}, "output_type": "execute_result" } ], "source": [ "brain_bundle.report_global_measures()" ] }, { "cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
0123456789...96979899100101102103104105
NSPN_cost=100.1000040.1032280.1072440.1120390.1178420.1223980.1279750.1318990.1368200.141069...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R10.1000040.1032280.1071750.1119200.1175640.1219500.1272260.1310920.1358250.139877...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R100.1000040.1032280.1071750.1119200.1175640.1219500.1272260.1311500.1358850.139908...0.00.00.00.00.00.00.00.00.00.0
NSPN_cost=10_R20.1000040.1032280.1071750.1119200.1175640.1219500.1272260.1310920.1358550.139971...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R30.1000040.1032280.1071750.1119200.1175890.1220030.1272820.1311500.1358850.139940...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R40.1000040.1032280.1071750.1119200.1175640.1219760.1272820.1311500.1358850.139940...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R50.1000040.1032280.1071750.1119200.1175640.1219500.1272540.1311500.1359150.139940...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R60.1000040.1032280.1071750.1119200.1175640.1219500.1272260.1310920.1358850.139908...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R70.1000040.1032280.1071750.1119200.1175640.1219760.1272540.1311500.1359150.139971...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R80.1000040.1032280.1071750.1119200.1175640.1219500.1272260.1311210.1359150.139940...1.01.01.01.01.01.01.01.01.01.0
NSPN_cost=10_R90.1000040.1032280.1071750.1119200.1175640.1219760.1272540.1311500.1359150.140034...0.00.00.00.00.00.00.00.00.00.0
\n", "

11 rows × 106 columns

\n", "
" ], "text/plain": [ " 0 1 2 3 4 5 \\\n", "NSPN_cost=10 0.100004 0.103228 0.107244 0.112039 0.117842 0.122398 \n", "NSPN_cost=10_R1 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R10 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R2 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R3 0.100004 0.103228 0.107175 0.111920 0.117589 0.122003 \n", "NSPN_cost=10_R4 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 \n", "NSPN_cost=10_R5 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R6 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R7 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 \n", "NSPN_cost=10_R8 0.100004 0.103228 0.107175 0.111920 0.117564 0.121950 \n", "NSPN_cost=10_R9 0.100004 0.103228 0.107175 0.111920 0.117564 0.121976 \n", "\n", " 6 7 8 9 ... 96 97 98 \\\n", "NSPN_cost=10 0.127975 0.131899 0.136820 0.141069 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R1 0.127226 0.131092 0.135825 0.139877 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R10 0.127226 0.131150 0.135885 0.139908 ... 0.0 0.0 0.0 \n", "NSPN_cost=10_R2 0.127226 0.131092 0.135855 0.139971 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R3 0.127282 0.131150 0.135885 0.139940 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R4 0.127282 0.131150 0.135885 0.139940 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R5 0.127254 0.131150 0.135915 0.139940 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R6 0.127226 0.131092 0.135885 0.139908 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R7 0.127254 0.131150 0.135915 0.139971 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R8 0.127226 0.131121 0.135915 0.139940 ... 1.0 1.0 1.0 \n", "NSPN_cost=10_R9 0.127254 0.131150 0.135915 0.140034 ... 0.0 0.0 0.0 \n", "\n", " 99 100 101 102 103 104 105 \n", "NSPN_cost=10 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R1 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R10 0.0 0.0 0.0 0.0 0.0 0.0 0.0 \n", "NSPN_cost=10_R2 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R3 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R4 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R5 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R6 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R7 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R8 1.0 1.0 1.0 1.0 1.0 1.0 1.0 \n", "NSPN_cost=10_R9 0.0 0.0 0.0 0.0 0.0 0.0 0.0 \n", "\n", "[11 rows x 106 columns]" ] }, "execution_count": 26, "metadata": {}, "output_type": "execute_result" } ], "source": [ "brain_bundle.report_rich_club()" ] } ], "metadata": { "anaconda-cloud": {}, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.3" } }, "nbformat": 4, "nbformat_minor": 1 }