{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Complexity in fitting Linear Mixed Models" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Linear mixed-effects models are increasingly used for the analysis of data from experiments in fields like psychology where several subjects are each exposed to each of several different items.\n", "In addition to a response, which here will be assumed to be on a continuous scale, such as a _response time_, a number of experimental conditions are systematically varied during the experiment.\n", "In the language of statistical experimental design the latter variables are called _experimental factors_ whereas factors like `Subject` and `Item` are _blocking factors_.\n", "That is, these are known sources of variation that usually are not of interest by themselves but still should be accounted for when looking for systematic variation in the response." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## An example data set" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The data from experiment 2 in [_Kronmueller and Barr (2007)_](https://doi.org/10.1016/j.jml.2006.05.002) are available in `.rds` (R Data Set) format in the file `kb07_exp2_rt.rds` in the [github repository](https://github.com/dalejbarr/kronmueller-barr-2007) provided by Dale Barr.\n", "Files in this format can be loaded using the [_RData_ package](https://github.com/JuliaData/RData.jl) for [__Julia__](https://julialang.org)." 
] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "┌ Info: Loading DataFrames support into Gadfly.jl\n", "└ @ Gadfly /home/bates/.julia/packages/Gadfly/09PWZ/src/mapping.jl:228\n" ] } ], "source": [ "using BenchmarkTools, DataFrames, Distributions, FreqTables\n", "using MixedModels, RData, Statistics, StatsBase, StatsModels\n", "using Gadfly" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "data": { "text/html": [ "

<p><i>HTML display omitted: DataFrame with 1,789 rows × 7 columns (subj, item, spkr, prec, load, rt_trunc, rt_raw); the first 30 rows were shown.</i></p>
" ], "text/latex": [ "\\begin{tabular}{r|ccccccc}\n", "\t& subj & item & spkr & prec & load & rt\\_trunc & rt\\_raw\\\\\n", "\t\\hline\n", "\t& Categorical… & Categorical… & Categorical… & Categorical… & Categorical… & Int32 & Int32\\\\\n", "\t\\hline\n", "\t1 & 30 & 1 & new & break & yes & 2267 & 2267 \\\\\n", "\t2 & 30 & 2 & old & maintain & no & 3856 & 3856 \\\\\n", "\t3 & 30 & 3 & old & break & no & 1567 & 1567 \\\\\n", "\t4 & 30 & 4 & new & maintain & no & 1732 & 1732 \\\\\n", "\t5 & 30 & 5 & new & break & no & 2660 & 2660 \\\\\n", "\t6 & 30 & 6 & old & maintain & yes & 2763 & 2763 \\\\\n", "\t7 & 30 & 7 & old & break & yes & 3528 & 3528 \\\\\n", "\t8 & 30 & 8 & new & maintain & yes & 1741 & 1741 \\\\\n", "\t9 & 30 & 9 & new & break & yes & 3692 & 3692 \\\\\n", "\t10 & 30 & 10 & old & maintain & no & 1949 & 1949 \\\\\n", "\t11 & 30 & 11 & old & break & no & 2189 & 2189 \\\\\n", "\t12 & 30 & 12 & new & maintain & no & 2207 & 2207 \\\\\n", "\t13 & 30 & 13 & new & break & no & 2078 & 2078 \\\\\n", "\t14 & 30 & 14 & old & maintain & yes & 1901 & 1901 \\\\\n", "\t15 & 30 & 15 & old & break & yes & 4015 & 4015 \\\\\n", "\t16 & 30 & 16 & new & maintain & yes & 1880 & 1880 \\\\\n", "\t17 & 30 & 17 & new & break & yes & 1444 & 1444 \\\\\n", "\t18 & 30 & 18 & old & maintain & no & 1683 & 1683 \\\\\n", "\t19 & 30 & 19 & old & break & no & 2037 & 2037 \\\\\n", "\t20 & 30 & 20 & new & maintain & no & 1168 & 1168 \\\\\n", "\t21 & 30 & 21 & new & break & no & 1930 & 1930 \\\\\n", "\t22 & 30 & 22 & old & maintain & yes & 1843 & 1843 \\\\\n", "\t23 & 30 & 23 & old & break & yes & 4969 & 4969 \\\\\n", "\t24 & 30 & 24 & new & maintain & yes & 1798 & 1798 \\\\\n", "\t25 & 30 & 25 & new & break & yes & 2436 & 2436 \\\\\n", "\t26 & 30 & 26 & old & maintain & no & 2018 & 2018 \\\\\n", "\t27 & 30 & 27 & old & break & no & 2278 & 2278 \\\\\n", "\t28 & 30 & 28 & new & maintain & no & 1866 & 1866 \\\\\n", "\t29 & 30 & 29 & new & break & no & 1743 & 1743 \\\\\n", "\t30 & 30 & 30 & old & 
maintain & yes & 1963 & 1963 \\\\\n", "\t$\\dots$ & $\\dots$ & $\\dots$ & $\\dots$ & $\\dots$ & $\\dots$ & $\\dots$ & $\\dots$ \\\\\n", "\\end{tabular}\n" ], "text/plain": [ "1789×7 DataFrame. Omitted printing of 3 columns\n", "│ Row │ subj │ item │ spkr │ prec │\n", "│ │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mCategorical…\u001b[39m │\n", "├──────┼──────────────┼──────────────┼──────────────┼──────────────┤\n", "│ 1 │ 30 │ 1 │ new │ break │\n", "│ 2 │ 30 │ 2 │ old │ maintain │\n", "│ 3 │ 30 │ 3 │ old │ break │\n", "│ 4 │ 30 │ 4 │ new │ maintain │\n", "│ 5 │ 30 │ 5 │ new │ break │\n", "│ 6 │ 30 │ 6 │ old │ maintain │\n", "│ 7 │ 30 │ 7 │ old │ break │\n", "│ 8 │ 30 │ 8 │ new │ maintain │\n", "│ 9 │ 30 │ 9 │ new │ break │\n", "│ 10 │ 30 │ 10 │ old │ maintain │\n", "⋮\n", "│ 1779 │ 103 │ 22 │ new │ break │\n", "│ 1780 │ 103 │ 23 │ old │ maintain │\n", "│ 1781 │ 103 │ 24 │ old │ break │\n", "│ 1782 │ 103 │ 25 │ new │ maintain │\n", "│ 1783 │ 103 │ 26 │ new │ break │\n", "│ 1784 │ 103 │ 27 │ old │ maintain │\n", "│ 1785 │ 103 │ 28 │ old │ break │\n", "│ 1786 │ 103 │ 29 │ new │ maintain │\n", "│ 1787 │ 103 │ 30 │ new │ break │\n", "│ 1788 │ 103 │ 31 │ old │ maintain │\n", "│ 1789 │ 103 │ 32 │ old │ break │" ] }, "execution_count": 2, "metadata": {}, "output_type": "execute_result" } ], "source": [ "kb07 = load(\"../kronmueller-barr-2007/kb07_exp2_rt.rds\")" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "data": { "text/html": [ "

<p><i>HTML display omitted: describe(kb07), 7 rows × 8 columns (variable, mean, min, median, max, nunique, nmissing, eltype); the same summary appears in the text/plain rendering.</i></p>
" ], "text/latex": [ "\\begin{tabular}{r|cccccccc}\n", "\t& variable & mean & min & median & max & nunique & nmissing & eltype\\\\\n", "\t\\hline\n", "\t& Symbol & Union… & Any & Union… & Any & Union… & Nothing & DataType\\\\\n", "\t\\hline\n", "\t1 & subj & & 30 & & 103 & 56 & & CategoricalString\\{UInt8\\} \\\\\n", "\t2 & item & & 1 & & 32 & 32 & & CategoricalString\\{UInt8\\} \\\\\n", "\t3 & spkr & & new & & old & 2 & & CategoricalString\\{UInt8\\} \\\\\n", "\t4 & prec & & break & & maintain & 2 & & CategoricalString\\{UInt8\\} \\\\\n", "\t5 & load & & no & & yes & 2 & & CategoricalString\\{UInt8\\} \\\\\n", "\t6 & rt\\_trunc & 2182.2 & 579 & 1940.0 & 5171 & & & Int32 \\\\\n", "\t7 & rt\\_raw & 2226.24 & 579 & 1940.0 & 15923 & & & Int32 \\\\\n", "\\end{tabular}\n" ], "text/plain": [ "7×8 DataFrame. Omitted printing of 1 columns\n", "│ Row │ variable │ mean │ min │ median │ max │ nunique │ nmissing │\n", "│ │ \u001b[90mSymbol\u001b[39m │ \u001b[90mUnion…\u001b[39m │ \u001b[90mAny\u001b[39m │ \u001b[90mUnion…\u001b[39m │ \u001b[90mAny\u001b[39m │ \u001b[90mUnion…\u001b[39m │ \u001b[90mNothing\u001b[39m │\n", "├─────┼──────────┼─────────┼───────┼────────┼──────────┼─────────┼──────────┤\n", "│ 1 │ subj │ │ 30 │ │ 103 │ 56 │ │\n", "│ 2 │ item │ │ 1 │ │ 32 │ 32 │ │\n", "│ 3 │ spkr │ │ new │ │ old │ 2 │ │\n", "│ 4 │ prec │ │ break │ │ maintain │ 2 │ │\n", "│ 5 │ load │ │ no │ │ yes │ 2 │ │\n", "│ 6 │ rt_trunc │ 2182.2 │ 579 │ 1940.0 │ 5171 │ │ │\n", "│ 7 │ rt_raw │ 2226.24 │ 579 │ 1940.0 │ 15923 │ │ │" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "describe(kb07)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The blocking factors are `subj` and `item` with 56 and 32 levels respectively.\n", "There are three experimental factors each with two levels: `spkr` (speaker), `prec` (precedence), and `load` (cognitive load).\n", "The response time, `rt_raw`, is measured in milliseconds.\n", "A few very large values, 
e.g. the maximum which is nearly 16 seconds, which could skew the results, are truncated in the `rt_trunc` column.\n", "In addition, three erroneously recorded responses (values of 300 ms.) have been dropped, resulting in a slight imbalance in the data." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "56×32 Named Array{Int64,2}\n", "subj ╲ item │ 1 2 3 4 5 6 7 8 … 25 26 27 28 29 30 31 32\n", "────────────┼──────────────────────────────────────────────────────────────────\n", "30 │ 1 1 1 1 1 1 1 1 … 1 1 1 1 1 1 1 1\n", "31 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "34 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "35 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "36 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "37 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "38 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "39 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "41 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "42 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "43 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "44 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋱ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮ ⋮\n", "89 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "91 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "93 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "94 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "95 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "96 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "97 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "98 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "100 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "101 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "102 │ 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1\n", "103 │ 1 1 1 1 1 1 1 1 … 1 1 1 1 1 1 1 1" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "subjitemtbl = freqtable(kb07, :subj, :item)" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Dict{Int64,Int64} with 2 entries:\n", " 0 => 3\n", " 1 => 1789" ] }, "execution_count": 5, "metadata": {}, "output_type": "execute_result" } ], "source": [ 
"countmap(vec(subjitemtbl))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "All of the experimental factors vary within subject and within item, as can be verified by examining the frequency tables for the experimental and grouping factors. For example" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "2×56 Named Array{Int64,2}\n", "spkr ╲ subj │ 30 31 34 35 36 37 … 97 98 100 101 102 103\n", "────────────┼──────────────────────────────────────────────────────────────\n", "new │ 16 16 16 16 16 16 … 16 16 16 16 16 16\n", "old │ 16 16 16 16 16 16 … 16 16 16 16 15 16" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "freqtable(kb07, :spkr, :subj)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A table of mean responses and standard deviations for combinations of the experimental factors, as shown in Table 3 of the paper and on the data repository can be reproduced as" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "data": { "text/html": [ "

<p><i>HTML display omitted: cell means DataFrame, 8 rows × 6 columns (spkr, prec, load, meanRT, sdRT, n); the same values appear in the text/plain rendering.</i></p>
" ], "text/latex": [ "\\begin{tabular}{r|cccccc}\n", "\t& spkr & prec & load & meanRT & sdRT & n\\\\\n", "\t\\hline\n", "\t& Categorical… & Categorical… & Categorical… & Float64 & Float64 & Int64\\\\\n", "\t\\hline\n", "\t1 & new & break & no & 2345.17 & 972.693 & 223 \\\\\n", "\t2 & new & break & yes & 2505.34 & 1012.04 & 224 \\\\\n", "\t3 & new & maintain & no & 1760.66 & 587.445 & 224 \\\\\n", "\t4 & new & maintain & yes & 1843.28 & 609.672 & 224 \\\\\n", "\t5 & old & break & no & 2536.29 & 1058.78 & 224 \\\\\n", "\t6 & old & break & yes & 2674.32 & 1119.67 & 224 \\\\\n", "\t7 & old & maintain & no & 1770.76 & 669.901 & 222 \\\\\n", "\t8 & old & maintain & yes & 2018.83 & 725.359 & 224 \\\\\n", "\\end{tabular}\n" ], "text/plain": [ "8×6 DataFrame\n", "│ Row │ spkr │ prec │ load │ meanRT │ sdRT │ n │\n", "│ │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mCategorical…\u001b[39m │ \u001b[90mFloat64\u001b[39m │ \u001b[90mFloat64\u001b[39m │ \u001b[90mInt64\u001b[39m │\n", "├─────┼──────────────┼──────────────┼──────────────┼─────────┼─────────┼───────┤\n", "│ 1 │ new │ break │ no │ 2345.17 │ 972.693 │ 223 │\n", "│ 2 │ new │ break │ yes │ 2505.34 │ 1012.04 │ 224 │\n", "│ 3 │ new │ maintain │ no │ 1760.66 │ 587.445 │ 224 │\n", "│ 4 │ new │ maintain │ yes │ 1843.28 │ 609.672 │ 224 │\n", "│ 5 │ old │ break │ no │ 2536.29 │ 1058.78 │ 224 │\n", "│ 6 │ old │ break │ yes │ 2674.32 │ 1119.67 │ 224 │\n", "│ 7 │ old │ maintain │ no │ 1770.76 │ 669.901 │ 222 │\n", "│ 8 │ old │ maintain │ yes │ 2018.83 │ 725.359 │ 224 │" ] }, "execution_count": 7, "metadata": {}, "output_type": "execute_result" } ], "source": [ "cellmeans = by(kb07, [:spkr, :prec, :load], meanRT = :rt_trunc => mean, sdRT = :rt_trunc => std, n = :rt_trunc => length)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "An interaction plot of these means" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "data": { "image/svg+xml": [ "\n", 
"<!-- SVG omitted: Gadfly interaction plot of meanRT vs. load, faceted by spkr, colored by prec -->\n" ], "text/html": [ "<p><i>Gadfly interaction plot omitted: meanRT vs. load, faceted by spkr, colored by prec.</i></p>" ], "text/plain": [ "Plot(...)" ] }, "execution_count": 8, "metadata": {}, "output_type": "execute_result" } ], "source": [ "plot(cellmeans, x=:load, y=:meanRT, color=:prec, xgroup=:spkr, Geom.subplot_grid(Geom.point))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "shows that the dominant factor is precedence (`prec`), with the response times under _break_ being considerably larger than under _maintain_." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Formulating a simple model" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A simple model with main effects for each of the experimental factors and with random effects for subject and for item is described by the formula `rt_trunc ~ 1 + spkr + prec + load + (1|subj) + (1|item)`.\n", "In the _MixedModels_ package, which uses the formula specifications from the [_StatsModels_ package](https://github.com/JuliaStats/StatsModels.jl), a formula must be wrapped in a call to the `@formula` macro.\n", "The model is created as an instance of a `LinearMixedModel` type and then fit with a call to the `fit!` generic.\n", "\n", "(By convention, the names in Julia of _mutating functions_, which modify the value of one or more of their arguments, end in `!` as a warning to the user that arguments, usually just the first argument, can be overwritten with new values.)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Linear 
mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + (1 | subj) + (1 | item)\n", " logLik -2 logLik AIC BIC \n", " -1.44115694×10⁴ 2.88231387×10⁴ 2.88371387×10⁴ 2.88755646×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. \n", " subj 101529.32 318.63666\n", " item 132314.19 363.75018\n", " Residual 521145.70 721.90422\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2369.24 84.3435 28.0903 <1e-99\n", "spkr: old 136.167 34.1367 3.98886 <1e-4\n", "prec: maintain -667.173 34.1367 -19.5441 <1e-84\n", "load: yes 156.709 34.1366 4.59065 <1e-5\n" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m1 = fit!(LinearMixedModel(@formula(rt_trunc ~ 1 + spkr + prec + load + (1|subj) + (1|item)), kb07))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The first fit of such a model can take several seconds because the Just-In-Time (JIT) compiler must analyze and compile a considerable amount of code. (All of the code in the _MixedModels_ package is Julia code.)\n", "Subsequent fits of this or similar models are much faster.\n", "\n", "When timing fits that take a few seconds or less the `@btime` macro will be used, as it performs the evaluation of the expression several times and takes the minimum of these evaluation times. \n", "This is to eliminate the noise introduced by garbage collection and compilation.\n", "\n", "Furthermore, in what follows the formulas and models will be declared `const`, which in Julia means that the type of the value cannot be changed, although the values in, say, a vector can be.\n", "\n", "Also the expression is written using `$` before an argument name. This is an interpolation in a macro call. The effect is to time only the expression evaluation and not the name lookup for the arguments." 
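, "\n",
"\n",
"As a small illustration of this interpolation (a sketch using a hypothetical vector `x`, separate from the model fits):\n",
"\n",
"```julia\n",
"using BenchmarkTools\n",
"x = rand(1000);\n",
"@btime sum(x);    # includes the cost of looking up the global `x`\n",
"@btime sum($x);   # interpolation: times only the call to `sum`\n",
"```"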
] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 2.324 ms (3082 allocations: 841.83 KiB)\n" ] } ], "source": [ "const f1 = @formula(rt_trunc ~ 1 + spkr + prec + load + (1|subj) + (1|item));\n", "@btime fit!(LinearMixedModel($f1, $kb07));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The coding for the experimental factors in the model matrix for this model is the default or \"dummy\" coding, where the first level is the reference level and the estimated coefficients represent differences of the response at the indicated level relative to the first.\n", "\n", "In this case the model matrix for the fixed effects is" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "1789×4 Array{Float64,2}:\n", " 1.0 0.0 0.0 1.0\n", " 1.0 1.0 1.0 0.0\n", " 1.0 1.0 0.0 0.0\n", " 1.0 0.0 1.0 0.0\n", " 1.0 0.0 0.0 0.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 0.0 1.0\n", " 1.0 0.0 1.0 1.0\n", " 1.0 0.0 0.0 1.0\n", " 1.0 1.0 1.0 0.0\n", " 1.0 1.0 0.0 0.0\n", " 1.0 0.0 1.0 0.0\n", " 1.0 0.0 0.0 0.0\n", " ⋮ \n", " 1.0 0.0 1.0 0.0\n", " 1.0 0.0 0.0 0.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 0.0 1.0\n", " 1.0 0.0 1.0 1.0\n", " 1.0 0.0 0.0 1.0\n", " 1.0 1.0 1.0 0.0\n", " 1.0 1.0 0.0 0.0\n", " 1.0 0.0 1.0 0.0\n", " 1.0 0.0 0.0 0.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 0.0 1.0" ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m1.X" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In two-level factorial experiments like this it is an advantage to use a $\\pm1$ coding, which can be derived from `EffectsCoding` or `HelmertCoding` (these codings are the same for a two-level factor)." 
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "const contrasts = Dict(:spkr=>HelmertCoding(), :prec=>HelmertCoding(), :load=>HelmertCoding());" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 2.397 ms (3300 allocations: 849.63 KiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + (1 | subj) + (1 | item)\n", " logLik -2 logLik AIC BIC \n", " -1.44115694×10⁴ 2.88231387×10⁴ 2.88371387×10⁴ 2.88755646×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. \n", " subj 101529.32 318.63666\n", " item 132314.19 363.75018\n", " Residual 521145.70 721.90422\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2182.09 78.9884 27.6254 <1e-99\n", "spkr: old 68.0833 17.0684 3.98886 <1e-4\n", "prec: maintain -333.586 17.0684 -19.5441 <1e-84\n", "load: yes 78.3546 17.0683 4.59065 <1e-5\n" ] }, "execution_count": 13, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m1 = @btime fit!(LinearMixedModel($f1, $kb07, $contrasts))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The estimate of the `(Intercept)` coefficient is now approximately the sample mean. 
(It may be slightly different from the sample mean due to the \n", "slight imbalance in the design because of two unusually long response times being dropped.)\n", "\n", "The coefficients for the experimental factors are half the previously estimated effects, because of the $\\pm1$ coding.\n", "Their standard errors are also half those from the previous fit.\n", "The estimates of the variance components, the log-likelihood and other goodness of fit criteria, and the z statistics and p-values for the effects remain the same.\n", "\n", "The model matrix for the fixed effects is now" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "1789×4 Array{Float64,2}:\n", " 1.0 -1.0 -1.0 1.0\n", " 1.0 1.0 1.0 -1.0\n", " 1.0 1.0 -1.0 -1.0\n", " 1.0 -1.0 1.0 -1.0\n", " 1.0 -1.0 -1.0 -1.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 -1.0 1.0\n", " 1.0 -1.0 1.0 1.0\n", " 1.0 -1.0 -1.0 1.0\n", " 1.0 1.0 1.0 -1.0\n", " 1.0 1.0 -1.0 -1.0\n", " 1.0 -1.0 1.0 -1.0\n", " 1.0 -1.0 -1.0 -1.0\n", " ⋮ \n", " 1.0 -1.0 1.0 -1.0\n", " 1.0 -1.0 -1.0 -1.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 -1.0 1.0\n", " 1.0 -1.0 1.0 1.0\n", " 1.0 -1.0 -1.0 1.0\n", " 1.0 1.0 1.0 -1.0\n", " 1.0 1.0 -1.0 -1.0\n", " 1.0 -1.0 1.0 -1.0\n", " 1.0 -1.0 -1.0 -1.0\n", " 1.0 1.0 1.0 1.0\n", " 1.0 1.0 -1.0 1.0" ] }, "execution_count": 14, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m1.X" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Model construction versus model optimization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `m1` object is created in the call to the constructor function, `LinearMixedModel`, then the parameters are optimized or fit in the call to `fit!`.\n", "Usually the process of fitting a model will take longer than creating the numerical representation but, for simple models like this, the creation time can be a significant portion of the overall running time."
] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 923.172 μs (1583 allocations: 805.44 KiB)\n" ] } ], "source": [ "@btime LinearMixedModel($f1, $kb07, $contrasts);" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 1.370 ms (1716 allocations: 44.09 KiB)\n" ] } ], "source": [ "@btime fit!($m1);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Factors affecting the time to optimize the parameters" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The optimization process is summarized in the `optsum` property of the model." ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Initial parameter vector: [1.0, 1.0]\n", "Initial objective value: 28881.249730604657\n", "\n", "Optimizer (from NLopt): LN_BOBYQA\n", "Lower bounds: [0.0, 0.0]\n", "ftol_rel: 1.0e-12\n", "ftol_abs: 1.0e-8\n", "xtol_rel: 0.0\n", "xtol_abs: [1.0e-10, 1.0e-10]\n", "initial_step: [0.75, 0.75]\n", "maxfeval: -1\n", "\n", "Function evaluations: 28\n", "Final parameter vector: [0.441384, 0.503876]\n", "Final objective value: 28823.13873586134\n", "Return code: FTOL_REACHED\n" ] }, "execution_count": 17, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m1.optsum" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For this model there are two parameters to be optimized because the objective function, negative twice the log-likelihood, can be _profiled_ with respect to all the other parameters.\n", "(See section 3 of [_Bates et al. 2015_](https://www.jstatsoft.org/article/view/v067i01) for details.)\n", "Both these parameters must be non-negative (i.e. 
both have a lower bound of zero) and both have an initial value of one.\n", "After 28 function evaluations an optimum is declared according to the function value tolerance, either $10^{-8}$ in absolute terms or $10^{-12}$ relative to the current value.\n", "\n", "The optimization itself has a certain amount of setup and summary time but the majority of the time is spent in the evaluation of the objective - the profiled log-likelihood.\n", "\n", "Each function evaluation is of the form" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [], "source": [ "const θ1 = m1.θ;" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 42.720 μs (57 allocations: 1.42 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m1, $θ1)));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Remember that this is a \"best case\" time. During the course of the optimization some of the function evaluations could take longer due to garbage collection or other factors.\n", "\n", "On this machine 28 function evaluations, each taking 42 microseconds or more, gives a total function evaluation time of at least 1.2 ms, which is practically all of the time to fit the model.\n", "\n", "The majority of the time for the function evaluation for this model is in the call to `updateL!`" ] }, { "cell_type": "code", "execution_count": 20, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 32.016 μs (17 allocations: 320 bytes)\n" ] } ], "source": [ "@btime updateL!($m1);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is an operation that updates the lower Cholesky factor (often written as `L`) of a blocked sparse matrix.\n", "\n", "There are 4 rows and columns of blocks.\n", "The first row and column correspond to the random effects for subject, the second to the random effects for item, the third to the fixed-effects 
parameters and the fourth to the response.\n", "Their sizes and types are" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "1,1: LinearAlgebra.Diagonal{Float64,Array{Float64,1}} (56, 56) LinearAlgebra.Diagonal{Float64,Array{Float64,1}}\n", "2,1: Array{Float64,2} (32, 56) Array{Float64,2}\n", "2,2: LinearAlgebra.Diagonal{Float64,Array{Float64,1}} (32, 32) Array{Float64,2}\n", "3,1: Array{Float64,2} (4, 56) Array{Float64,2}\n", "3,2: Array{Float64,2} (4, 32) Array{Float64,2}\n", "3,3: Array{Float64,2} (4, 4) Array{Float64,2}\n", "4,1: Array{Float64,2} (1, 56) Array{Float64,2}\n", "4,2: Array{Float64,2} (1, 32) Array{Float64,2}\n", "4,3: Array{Float64,2} (1, 4) Array{Float64,2}\n", "4,4: Array{Float64,2} (1, 1) Array{Float64,2}\n" ] } ], "source": [ "describeblocks(m1)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are two lower-triangular blocked matrices: `A` with fixed entries determined by the model and data, and `L` which is updated for each evaluation of the objective function.\n", "The type of the `A` block is given before the size and the type of the `L` block is after the size.\n", "For scalar random effects, generated by a random-effects term like `(1|G)`, the (1,1) block is always diagonal for both `A` and `L`.\n", "Its size is the number of levels of the grouping factor, `G`.\n", "\n", "Because subject and item are crossed, the (2,1) block of `A` is dense, as is the (2,1) block of `L`.\n", "The (2,2) block of `A` is diagonal because, like the (1,1) block, it is generated from a scalar random effects term.\n", "However, the (2,2) block of `L` ends up being dense as a result of \"fill-in\" in the sparse Cholesky factorization.\n", "All the blocks associated with the fixed-effects or the response are stored as dense matrices but their dimensions are (relatively) small." 
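, "\n", "\n",
"The \"fill-in\" can be seen in a small dense example, using a hypothetical 3×3 symmetric matrix that is not taken from this model: a zero entry in the matrix becomes nonzero in its lower Cholesky factor.\n", "\n",
"```julia\n",
"using LinearAlgebra\n",
"\n",
"# illustrative matrix: A[3, 2] is zero, but rows 2 and 3 are coupled through row 1\n",
"A = [4.0  1.0  1.0\n",
"     1.0  4.0  0.0\n",
"     1.0  0.0  4.0]\n",
"L = cholesky(A).L\n",
"L[3, 2]   # nonzero even though A[3, 2] == 0 -- this is fill-in\n",
"```\n",
"\n",
"In the model the analogous coupling occurs because subject and item are crossed, which is why the diagonal (2,2) block of `A` fills in to a dense (2,2) block of `L`."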
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Increasing the complexity" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In general, adding more terms to a model will increase the time required to fit the model.\n", "However, there is a big difference between adding fixed-effects terms and adding complexity to the random effects.\n", "\n", "Adding the two- and three-factor interactions to the fixed-effects terms increases the time required to fit the model." ] }, { "cell_type": "code", "execution_count": 22, "metadata": {}, "outputs": [], "source": [ "const f2 = @formula(rt_trunc ~ 1 + spkr*prec*load + (1|subj) + (1|item));" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 3.902 ms (5685 allocations: 1.65 MiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + spkr & prec + spkr & load + prec & load + spkr & prec & load + (1 | subj) + (1 | item)\n", " logLik -2 logLik AIC BIC \n", " -1.4409243×10⁴ 2.88184861×10⁴ 2.88404861×10⁴ 2.89008696×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. 
\n", " subj 101581.912 318.71917\n", " item 132326.015 363.76643\n", " Residual 519722.754 720.91799\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2182.0 78.9917 27.6231 <1e-99\n", "spkr: old 68.0177 17.0451 3.99045 <1e-4\n", "prec: maintain -333.652 17.0451 -19.5747 <1e-84\n", "load: yes 78.4464 17.0451 4.6023 <1e-5\n", "spkr: old & prec: maintain -21.6417 17.0451 -1.26968 0.2042\n", "spkr: old & load: yes 18.1151 17.0451 1.06278 0.2879\n", "prec: maintain & load: yes 4.2647 17.0451 0.250201 0.8024\n", "spkr: old & prec: maintain & load: yes 23.2834 17.0451 1.36599 0.1719\n" ] }, "execution_count": 23, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const m2 = @btime fit!(LinearMixedModel($f2, $kb07, $contrasts))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "(Notice that none of the interactions are statistically significant.)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this case, the increase in fitting time is more because the number of function evaluations to determine the optimum increases than because of increased evaluation time for the objective function." 
] }, { "cell_type": "code", "execution_count": 24, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "32" ] }, "execution_count": 24, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m2.optsum.feval" ] }, { "cell_type": "code", "execution_count": 25, "metadata": {}, "outputs": [], "source": [ "const θ2 = m2.θ;" ] }, { "cell_type": "code", "execution_count": 26, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 48.086 μs (57 allocations: 1.42 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m2, $θ2)));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Increasing complexity of the random effects" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Another way in which the model can be extended is to switch to vector-valued random effects.\n", "Sometimes this is described as having _random slopes_, so that a subject not only brings their own shift in the typical response but also their own shift in the change due to, say, `Load` versus `No Load`.\n", "Instead of just one, scalar, change associated with each subject there is an entire vector of changes in the coefficients.\n", "\n", "A model with a random slopes for each of the experimental factors for both subject and item is specified as" ] }, { "cell_type": "code", "execution_count": 27, "metadata": {}, "outputs": [], "source": [ "const f3 = @formula(rt_trunc ~ 1 + spkr*prec*load + (1+spkr+prec+load|subj) + (1+spkr+prec+load|item));" ] }, { "cell_type": "code", "execution_count": 28, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 517.561 ms (715239 allocations: 31.42 MiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + spkr & prec + spkr & load + prec & load + spkr & prec & load + (1 + spkr + prec + load | subj) + (1 + spkr + prec + load | item)\n", " logLik -2 logLik AIC BIC \n", " -1.43189855×10⁴ 
2.8637971×10⁴ 2.8695971×10⁴ 2.8855164×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. Corr.\n", " subj 91130.9567 301.879043\n", " 1092.7366 33.056567 1.00\n", " 3463.3273 58.850041 -0.62 -0.62\n", " 4481.7460 66.945844 0.36 0.36 0.51\n", " item 131377.4183 362.460230\n", " 1690.3829 41.114266 0.42\n", " 61168.0277 247.321709 -0.69 0.37\n", " 1882.8841 43.392212 0.29 0.14 -0.13\n", " Residual 447630.8098 669.052173\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2181.64 77.3507 28.2046 <1e-99\n", "spkr: old 67.7496 17.9606 3.77213 0.0002\n", "prec: maintain -333.92 47.155 -7.08133 <1e-11\n", "load: yes 78.8007 19.726 3.99476 <1e-4\n", "spkr: old & prec: maintain -21.996 15.8191 -1.39047 0.1644\n", "spkr: old & load: yes 18.3832 15.8191 1.16209 0.2452\n", "prec: maintain & load: yes 4.53273 15.8191 0.286535 0.7745\n", "spkr: old & prec: maintain & load: yes 23.6377 15.8191 1.49425 0.1351\n" ] }, "execution_count": 28, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const m3 = @btime fit!(LinearMixedModel($f3, $kb07, $contrasts))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are several interesting aspects of this model fit.\n", "\n", "First, the number of parameters optimized directly has increased substantially.\n", "What was previously a 2-dimensional optimization has now become 20 dimensional." 
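, "\n", "\n",
"The count of 20 comes from the two lower-triangular 4 by 4 relative covariance factors, each contributing 4(4+1)/2 = 10 free parameters. A quick check with a small helper (`ltripar` is a hypothetical name, not part of _MixedModels_):\n", "\n",
"```julia\n",
"# number of free parameters in a lower-triangular k×k factor\n",
"ltripar(k) = k * (k + 1) ÷ 2\n",
"ltripar(1) + ltripar(1)   # two scalar random-effects terms, as in the first model: 2\n",
"ltripar(4) + ltripar(4)   # two vector-valued terms of dimension 4, as here: 20\n",
"```"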
] }, { "cell_type": "code", "execution_count": 29, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "Initial parameter vector: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 1.0]\n", "Initial objective value: 29303.38411137385\n", "\n", "Optimizer (from NLopt): LN_BOBYQA\n", "Lower bounds: [0.0, -Inf, -Inf, -Inf, 0.0, -Inf, -Inf, 0.0, -Inf, 0.0, 0.0, -Inf, -Inf, -Inf, 0.0, -Inf, -Inf, 0.0, -Inf, 0.0]\n", "ftol_rel: 1.0e-12\n", "ftol_abs: 1.0e-8\n", "xtol_rel: 0.0\n", "xtol_abs: [1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10, 1.0e-10]\n", "initial_step: [0.75, 1.0, 1.0, 1.0, 0.75, 1.0, 1.0, 0.75, 1.0, 0.75, 0.75, 1.0, 1.0, 1.0, 0.75, 1.0, 1.0, 0.75, 1.0, 0.75]\n", "maxfeval: -1\n", "\n", "Function evaluations: 781\n", "Final parameter vector: [0.451204, 0.0494081, -0.0547485, 0.0358954, 0.0, -0.0657285, -0.0891811, 0.0204788, 0.0277562, 0.0, 0.541752, 0.0259224, -0.253773, 0.0189352, 0.0557164, 0.267834, 0.000872977, 0.0226436, 0.0620244, 0.0]\n", "Final objective value: 28637.9710101938\n", "Return code: FTOL_REACHED\n" ] }, "execution_count": 29, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m3.optsum" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "and the number of function evaluations to convergence has gone from under 40 to over 650.\n", "\n", "The time required for each function evaluation has also increased considerably," ] }, { "cell_type": "code", "execution_count": 30, "metadata": {}, "outputs": [], "source": [ "const θ3 = m3.θ;" ] }, { "cell_type": "code", "execution_count": 31, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 592.043 μs (559 allocations: 29.98 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m3, $θ3)));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "resulting in 
much longer times for model fitting - about half a second for the \"best case\" on this machine." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice that the estimates of the fixed-effects coefficients and their standard errors have not changed substantially except for the standard error of `prec` (_precedent_), which is also the largest effect." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The parameters in the optimization for this model can be arranged as two lower-triangular 4 by 4 matrices." ] }, { "cell_type": "code", "execution_count": 32, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "4×4 LinearAlgebra.LowerTriangular{Float64,Array{Float64,2}}:\n", " 0.451204 ⋅ ⋅ ⋅ \n", " 0.0494081 0.0 ⋅ ⋅ \n", " -0.0547485 -0.0657285 0.0204788 ⋅ \n", " 0.0358954 -0.0891811 0.0277562 0.0" ] }, "execution_count": 32, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m3.λ[1]" ] }, { "cell_type": "code", "execution_count": 33, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "4×4 LinearAlgebra.LowerTriangular{Float64,Array{Float64,2}}:\n", " 0.541752 ⋅ ⋅ ⋅ \n", " 0.0259224 0.0557164 ⋅ ⋅ \n", " -0.253773 0.267834 0.0226436 ⋅ \n", " 0.0189352 0.000872977 0.0620244 0.0" ] }, "execution_count": 33, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m3.λ[2]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "which generate the covariance matrices for the random effects.\n", "The cumulative proportion of the variance in the principal components of these covariance matrices, available as" ] }, { "cell_type": "code", "execution_count": 34, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "2-element Array{Array{Float64,1},1}:\n", " [0.93984, 1.0, 1.0, 1.0] \n", " [0.853217, 0.991263, 1.0, 1.0]" ] }, "execution_count": 34, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m3.rePCA" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "show that about 94% of the variation in 
the random effects for subject is in the first principal direction, with essentially all of it in the first two principal directions.\n", "The random effects for item have 99% of the variation in the first two principal directions.\n", "\n", "Furthermore, the estimates of the standard deviations of the \"slope\" random effects are much smaller than those of the intercept random effects, except for the `prec` coefficient random effect for `item`, which suggests that the model could be reduced to `rt_trunc ~ 1 + spkr*prec*load + (1|subj) + (1+prec|item)` or even `rt_trunc ~ 1 + spkr+prec+load + (1|subj) + (1+prec|item)`." ] }, { "cell_type": "code", "execution_count": 35, "metadata": {}, "outputs": [], "source": [ "const f4 = @formula(rt_trunc ~ 1 + spkr+prec+load + (1|subj) + (1+prec|item));" ] }, { "cell_type": "code", "execution_count": 36, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 17.886 ms (75285 allocations: 3.31 MiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + (1 | subj) + (1 + prec | item)\n", " logLik -2 logLik AIC BIC \n", " -1.43319251×10⁴ 2.86638501×10⁴ 2.86818501×10⁴ 2.87312548×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. 
Corr.\n", " item 133015.242 364.71255\n", " 63766.936 252.52116 -0.70\n", " subj 88819.437 298.02590\n", " Residual 462443.388 680.03190\n", " Number of obs: 1789; levels of grouping factors: 32, 56\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2181.85 77.4681 28.1645 <1e-99\n", "spkr: old 67.879 16.0785 4.22172 <1e-4\n", "prec: maintain -333.791 47.4472 -7.03499 <1e-11\n", "load: yes 78.5904 16.0785 4.88792 <1e-5\n" ] }, "execution_count": 36, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const m4 = @btime fit!(LinearMixedModel($f4, $kb07, $contrasts))" ] }, { "cell_type": "code", "execution_count": 37, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "93" ] }, "execution_count": 37, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m4.optsum.feval" ] }, { "cell_type": "code", "execution_count": 38, "metadata": {}, "outputs": [], "source": [ "const θ4 = m4.θ;" ] }, { "cell_type": "code", "execution_count": 39, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 136.383 μs (289 allocations: 14.30 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m4, $θ4)));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "These two model fits can be compared with one of the information criteria, `AIC` or `BIC`, for which \"smaller is better\".\n", "They both indicate a preference for the smaller model, `m4`.\n", "\n", "These criteria are values of the objective, negative twice the log-likelihood at convergence, plus a penalty that depends on the number of parameters being estimated.\n", "\n", "Because model `m4` is a special case of model `m3`, a likelihood ratio test could also be used.\n", "The alternative model, `m3`, will always produce an objective that is less than or equal to that from the null model, `m4`.\n", "The difference in these value is similar to the change in the residual sum of squares in a linear model fit.\n", 
"This objective would be called the _deviance_ if there was a way of defining a saturated model but it is not clear what this should be.\n", "However, if there was a way to define a deviance then the difference in the deviances would be the same as the differences in these objectives, which is" ] }, { "cell_type": "code", "execution_count": 40, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "1-element Array{Float64,1}:\n", " 25.879117466858588" ] }, "execution_count": 40, "metadata": {}, "output_type": "execute_result" } ], "source": [ "diff(objective.([m3, m4]))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This difference is compared to a $\\chi^2$ distribution with degrees of freedom corresponding to the difference in the number of parameters" ] }, { "cell_type": "code", "execution_count": 41, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "1-element Array{Int64,1}:\n", " 20" ] }, "execution_count": 41, "metadata": {}, "output_type": "execute_result" } ], "source": [ "diff(dof.([m4, m3]))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "producing a p-value of" ] }, { "cell_type": "code", "execution_count": 42, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "0.16984148520447773" ] }, "execution_count": 42, "metadata": {}, "output_type": "execute_result" } ], "source": [ "ccdf(Chisq(20), first(diff(objective.([m3,m4]))))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The \"maximal\" model as proposed by [_Barr et al., 2013_](https://www.sciencedirect.com/science/article/pii/S0749596X12001180) would include all possible interactions of experimental and grouping factors." 
] }, { "cell_type": "code", "execution_count": 43, "metadata": {}, "outputs": [], "source": [ "const f5 = @formula(rt_trunc ~ 1 + spkr*prec*load + (1+spkr*prec*load|subj) + (1+spkr*prec*load|item));" ] }, { "cell_type": "code", "execution_count": 44, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 10.297 s (2568234 allocations: 107.73 MiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + spkr & prec + spkr & load + prec & load + spkr & prec & load + (1 + spkr + prec + spkr & prec + load + spkr & load + prec & load + spkr & prec & load | subj) + (1 + spkr + prec + spkr & prec + load + spkr & load + prec & load + spkr & prec & load | item)\n", " logLik -2 logLik AIC BIC \n", " -1.42891729×10⁴ 2.85783458×10⁴ 2.87403458×10⁴ 2.91849882×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. Corr.\n", " subj 90608.5644 301.012565\n", " 5143.6952 71.719559 0.43\n", " 5644.8543 75.132245 -0.48 -0.09\n", " 9013.7657 94.940854 -0.20 -0.76 0.55\n", " 7685.1682 87.665091 0.22 0.22 0.39 0.20\n", " 1817.5847 42.633141 -0.45 -0.51 0.10 0.29 0.45\n", " 7501.2782 86.609919 -0.10 -0.14 -0.04 0.04 -0.86 -0.72\n", " 3885.1757 62.331178 0.48 0.41 0.37 0.16 0.09 -0.78 0.39\n", " item 130494.7324 361.240546\n", " 1749.7106 41.829542 0.36\n", " 62347.1395 249.694092 -0.68 0.43\n", " 1036.0759 32.188132 -0.57 -0.79 -0.03\n", " 2871.3208 53.584706 0.20 0.06 -0.17 -0.03\n", " 1472.8731 38.378029 -0.30 0.03 0.31 -0.20 -0.53\n", " 4738.0323 68.833366 0.07 0.26 0.22 0.25 -0.15 -0.03\n", " 4990.5418 70.643767 -0.05 -0.48 -0.31 0.65 0.69 -0.72 0.11\n", " Residual 401618.2118 633.733550\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2181.86 76.9448 28.3562 <1e-99\n", "spkr: old 67.7443 19.263 3.5168 0.0004\n", "prec: maintain -333.925 47.6832 -7.003 <1e-11\n", "load: yes 
78.581 21.2485 3.69819 0.0002\n", "spkr: old & prec: maintain -21.7763 20.4419 -1.06528 0.2868\n", "spkr: old & load: yes 18.3885 17.4074 1.05636 0.2908\n", "prec: maintain & load: yes 4.53807 22.5067 0.201632 0.8402\n", "spkr: old & prec: maintain & load: yes 23.4181 21.2101 1.1041 0.2695\n" ] }, "execution_count": 44, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const m5 = @btime fit!(LinearMixedModel($f5, $kb07, $contrasts))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As is common in models with high-dimensional vector-valued random effects, the dominant portion of the variation is in the first few principal components" ] }, { "cell_type": "code", "execution_count": 45, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "2-element Array{Array{Float64,1},1}:\n", " [0.724878, 0.846927, 0.94087, 0.999998, 1.0, 1.0, 1.0, 1.0] \n", " [0.796978, 0.94188, 0.975067, 0.996119, 0.999996, 1.0, 1.0, 1.0]" ] }, "execution_count": 45, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m5.rePCA" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For both the subjects and the items practically all the variation of these 8-dimensional random effects is in the first 4 principal components.\n", "\n", "The dimension of $\\theta$, the parameters in the optimization, increases considerably" ] }, { "cell_type": "code", "execution_count": 46, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "72" ] }, "execution_count": 46, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const θ5 = m5.θ;\n", "length(θ5)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Of these 72 parameters, 36 are estimated from variation between items, yet there are only 32 items.\n", "\n", "Because the dimension of the optimization problem has gotten much larger the number of function evaluations to convergence increases correspondingly." 
] }, { "cell_type": "code", "execution_count": 47, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "2810" ] }, "execution_count": 47, "metadata": {}, "output_type": "execute_result" } ], "source": [ "m5.optsum.feval" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Also, each function evaluation requires more time" ] }, { "cell_type": "code", "execution_count": 48, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 2.284 ms (561 allocations: 30.02 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m5, $θ5)));" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "almost all of which is for the call to `updateL!`." ] }, { "cell_type": "code", "execution_count": 49, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 2.228 ms (519 allocations: 28.88 KiB)\n" ] } ], "source": [ "@btime updateL!($m5);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To provide more granularity in the plots of execution time shown below, fit one more model without random effects for the third-order interaction of the experimental factors." 
] }, { "cell_type": "code", "execution_count": 50, "metadata": {}, "outputs": [], "source": [ "const f6 = @formula(rt_trunc ~ 1 + spkr*prec*load + \n", " (1+spkr+prec+load+spkr&prec+spkr&load+prec&load|subj) + \n", " (1+spkr+prec+load+spkr&prec+spkr&load+prec&load|item));" ] }, { "cell_type": "code", "execution_count": 51, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 2.950 s (1526668 allocations: 59.76 MiB)\n" ] }, { "data": { "text/plain": [ "Linear mixed model fit by maximum likelihood\n", " rt_trunc ~ 1 + spkr + prec + load + spkr & prec + spkr & load + prec & load + spkr & prec & load + (1 + spkr + prec + load + spkr & prec + spkr & load + prec & load | subj) + (1 + spkr + prec + load + spkr & prec + spkr & load + prec & load | item)\n", " logLik -2 logLik AIC BIC \n", " -1.43019526×10⁴ 2.86039053×10⁴ 2.87339053×10⁴ 2.9090717×10⁴\n", "\n", "Variance components:\n", " Variance Std.Dev. Corr.\n", " subj 92941.31347 304.862778\n", " 5017.82659 70.836619 0.49\n", " 5026.50804 70.897871 -0.53 -0.18\n", " 7197.78240 84.839745 0.24 0.24 0.33\n", " 8159.97984 90.332607 -0.19 -0.76 0.55 0.17\n", " 1310.88763 36.206182 -0.58 -0.52 0.43 0.52 0.43\n", " 7451.93456 86.324588 -0.17 -0.23 -0.09 -0.93 0.03 -0.60\n", " item 130595.36541 361.379808\n", " 229.13521 15.137213 1.00\n", " 62451.29181 249.902565 -0.68 -0.68\n", " 2103.67100 45.865793 0.21 0.21 -0.10\n", " 708.89408 26.625065 -0.59 -0.59 -0.04 -0.35\n", " 335.37064 18.313128 -0.78 -0.78 0.69 -0.41 0.54\n", " 4296.57645 65.548276 0.11 0.11 0.27 -0.32 0.10 0.42\n", " Residual 417488.17855 646.133251\n", " Number of obs: 1789; levels of grouping factors: 56, 32\n", "\n", " Fixed-effects parameters:\n", " Estimate Std.Error z value P(>|z|)\n", "(Intercept) 2181.91 77.2928 28.2292 <1e-99\n", "spkr: old 67.7697 18.1705 3.72966 0.0002\n", "prec: maintain -333.9 47.6945 -7.00081 <1e-11\n", "load: yes 78.5329 20.6802 3.79749 0.0001\n", "spkr: old & prec: maintain -21.7283 20.0317 
-1.08469 0.2781\n", "spkr: old & load: yes 18.3631 16.349 1.12319 0.2614\n", "prec: maintain & load: yes 4.51266 22.3772 0.201663 0.8402\n", "spkr: old & prec: maintain & load: yes 23.37 15.2775 1.5297 0.1261\n" ] }, "execution_count": 51, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const m6 = @btime fit!(LinearMixedModel($f6, $kb07, $contrasts))" ] }, { "cell_type": "code", "execution_count": 52, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "56" ] }, "execution_count": 52, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const θ6 = m6.θ;\n", "length(θ6)" ] }, { "cell_type": "code", "execution_count": 53, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 1.568 ms (561 allocations: 30.02 KiB)\n" ] } ], "source": [ "@btime objective(updateL!(setθ!($m6, $θ6)));" ] }, { "cell_type": "code", "execution_count": 54, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ " 1.538 ms (519 allocations: 28.88 KiB)\n" ] } ], "source": [ "@btime updateL!($m6);" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Summary of goodness of fit" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Apply the goodness of fit measures to `m1` to `m6` creating a data frame" ] }, { "cell_type": "code", "execution_count": 55, "metadata": {}, "outputs": [ { "data": { "text/html": [ "

<p>6 rows × 5 columns</p><table class=\"data-frame\"><thead><tr><th></th><th>dof</th><th>deviance</th><th>AIC</th><th>AICc</th><th>BIC</th></tr><tr><th></th><th>Int64</th><th>Float64</th><th>Float64</th><th>Float64</th><th>Float64</th></tr></thead><tbody><tr><th>1</th><td>7</td><td>28823.1</td><td>28837.1</td><td>28837.2</td><td>28875.6</td></tr><tr><th>2</th><td>11</td><td>28818.5</td><td>28840.5</td><td>28840.6</td><td>28900.9</td></tr><tr><th>3</th><td>29</td><td>28638.0</td><td>28696.0</td><td>28697.0</td><td>28855.2</td></tr><tr><th>4</th><td>9</td><td>28663.9</td><td>28681.9</td><td>28682.0</td><td>28731.3</td></tr><tr><th>5</th><td>81</td><td>28578.3</td><td>28740.3</td><td>28748.1</td><td>29185.0</td></tr><tr><th>6</th><td>65</td><td>28603.9</td><td>28733.9</td><td>28738.9</td><td>29090.7</td></tr></tbody></table>
" ], "text/latex": [ "\\begin{tabular}{r|ccccc}\n", "\t& dof & deviance & AIC & AICc & BIC\\\\\n", "\t\\hline\n", "\t& Int64 & Float64 & Float64 & Float64 & Float64\\\\\n", "\t\\hline\n", "\t1 & 7 & 28823.1 & 28837.1 & 28837.2 & 28875.6 \\\\\n", "\t2 & 11 & 28818.5 & 28840.5 & 28840.6 & 28900.9 \\\\\n", "\t3 & 29 & 28638.0 & 28696.0 & 28697.0 & 28855.2 \\\\\n", "\t4 & 9 & 28663.9 & 28681.9 & 28682.0 & 28731.3 \\\\\n", "\t5 & 81 & 28578.3 & 28740.3 & 28748.1 & 29185.0 \\\\\n", "\t6 & 65 & 28603.9 & 28733.9 & 28738.9 & 29090.7 \\\\\n", "\\end{tabular}\n" ], "text/plain": [ "6×5 DataFrame\n", "│ Row │ dof │ deviance │ AIC │ AICc │ BIC │\n", "│ │ \u001b[90mInt64\u001b[39m │ \u001b[90mFloat64\u001b[39m │ \u001b[90mFloat64\u001b[39m │ \u001b[90mFloat64\u001b[39m │ \u001b[90mFloat64\u001b[39m │\n", "├─────┼───────┼──────────┼─────────┼─────────┼─────────┤\n", "│ 1 │ 7 │ 28823.1 │ 28837.1 │ 28837.2 │ 28875.6 │\n", "│ 2 │ 11 │ 28818.5 │ 28840.5 │ 28840.6 │ 28900.9 │\n", "│ 3 │ 29 │ 28638.0 │ 28696.0 │ 28697.0 │ 28855.2 │\n", "│ 4 │ 9 │ 28663.9 │ 28681.9 │ 28682.0 │ 28731.3 │\n", "│ 5 │ 81 │ 28578.3 │ 28740.3 │ 28748.1 │ 29185.0 │\n", "│ 6 │ 65 │ 28603.9 │ 28733.9 │ 28738.9 │ 29090.7 │" ] }, "execution_count": 55, "metadata": {}, "output_type": "execute_result" } ], "source": [ "const mods = [m1, m2, m3, m4, m5, m6];\n", "gofsumry = DataFrame(dof=dof.(mods), deviance=deviance.(mods),\n", " AIC = aic.(mods), AICc = aicc.(mods), BIC = bic.(mods))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here `dof` or degrees of freedom is the total number of parameters estimated in the model and `deviance` is simply negative twice the log-likelihood at convergence, without a correction for a saturated model. All the information criteria are on a scale of \"smaller is better\" and all would select model 4 as \"best\"." 
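] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a quick sanity check (a sketch, not part of the original analysis; it assumes the columns of `gofsumry` are accessible as properties), the criteria satisfy `AIC = deviance + 2 * dof` and `BIC = deviance + dof * log(n)` with `n = 1789` observations:\n", "\n", "```julia\n", "n = 1789    # number of observations in the kb07 data\n", "# AIC is the deviance plus twice the parameter count\n", "all(gofsumry.deviance .+ 2 .* gofsumry.dof .≈ gofsumry.AIC)\n", "# BIC replaces the factor of 2 by log(n)\n", "all(gofsumry.deviance .+ gofsumry.dof .* log(n) .≈ gofsumry.BIC)\n", "```"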
] }, { "cell_type": "code", "execution_count": 56, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(AIC = 4, AICc = 4, BIC = 4)" ] }, "execution_count": 56, "metadata": {}, "output_type": "execute_result" } ], "source": [ "gofnms = (:AIC, :AICc, :BIC)\n", "map(nm -> argmin(gofsumry[nm]), NamedTuple{gofnms, NTuple{3, Symbol}}(gofnms))" ] }, { "cell_type": "code", "execution_count": 57, "metadata": {}, "outputs": [ { "data": { "text/html": [ "

<p>6 rows × 10 columns</p><table class=\"data-frame\"><thead><tr><th></th><th>p</th><th>q1</th><th>n1</th><th>q2</th><th>n2</th><th>nθ</th><th>npar</th><th>nev</th><th>μsev</th><th>fit</th></tr><tr><th></th><th>Int64</th><th>Int64</th><th>Int64</th><th>Int64</th><th>Int64</th><th>Int64</th><th>Int64</th><th>Int64</th><th>Float64</th><th>Float64</th></tr></thead><tbody><tr><th>1</th><td>4</td><td>1</td><td>56</td><td>1</td><td>32</td><td>2</td><td>7</td><td>28</td><td>42.035</td><td>0.002072</td></tr><tr><th>2</th><td>8</td><td>1</td><td>56</td><td>1</td><td>32</td><td>2</td><td>11</td><td>32</td><td>46.95</td><td>0.004233</td></tr><tr><th>3</th><td>8</td><td>4</td><td>56</td><td>4</td><td>32</td><td>20</td><td>29</td><td>781</td><td>589.784</td><td>0.428509</td></tr><tr><th>4</th><td>4</td><td>2</td><td>32</td><td>1</td><td>56</td><td>4</td><td>9</td><td>93</td><td>134.539</td><td>0.015163</td></tr><tr><th>5</th><td>8</td><td>8</td><td>56</td><td>8</td><td>32</td><td>72</td><td>81</td><td>2810</td><td>2248.0</td><td>5.536</td></tr><tr><th>6</th><td>8</td><td>7</td><td>56</td><td>7</td><td>32</td><td>56</td><td>65</td><td>1356</td><td>1552.0</td><td>2.423</td></tr></tbody></table>
" ], "text/latex": [ "\\begin{tabular}{r|cccccccccc}\n", "\t& p & q1 & n1 & q2 & n2 & nθ & npar & nev & μsev & fit\\\\\n", "\t\\hline\n", "\t& Int64 & Int64 & Int64 & Int64 & Int64 & Int64 & Int64 & Int64 & Float64 & Float64\\\\\n", "\t\\hline\n", "\t1 & 4 & 1 & 56 & 1 & 32 & 2 & 7 & 28 & 42.035 & 0.002072 \\\\\n", "\t2 & 8 & 1 & 56 & 1 & 32 & 2 & 11 & 32 & 46.95 & 0.004233 \\\\\n", "\t3 & 8 & 4 & 56 & 4 & 32 & 20 & 29 & 781 & 589.784 & 0.428509 \\\\\n", "\t4 & 4 & 2 & 32 & 1 & 56 & 4 & 9 & 93 & 134.539 & 0.015163 \\\\\n", "\t5 & 8 & 8 & 56 & 8 & 32 & 72 & 81 & 2810 & 2248.0 & 5.536 \\\\\n", "\t6 & 8 & 7 & 56 & 7 & 32 & 56 & 65 & 1356 & 1552.0 & 2.423 \\\\\n", "\\end{tabular}\n" ], "text/plain": [ "6×10 DataFrame. Omitted printing of 2 columns\n", "│ Row │ p │ q1 │ n1 │ q2 │ n2 │ nθ │ npar │ nev │\n", "│ │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │ \u001b[90mInt64\u001b[39m │\n", "├─────┼───────┼───────┼───────┼───────┼───────┼───────┼───────┼───────┤\n", "│ 1 │ 4 │ 1 │ 56 │ 1 │ 32 │ 2 │ 7 │ 28 │\n", "│ 2 │ 8 │ 1 │ 56 │ 1 │ 32 │ 2 │ 11 │ 32 │\n", "│ 3 │ 8 │ 4 │ 56 │ 4 │ 32 │ 20 │ 29 │ 781 │\n", "│ 4 │ 4 │ 2 │ 32 │ 1 │ 56 │ 4 │ 9 │ 93 │\n", "│ 5 │ 8 │ 8 │ 56 │ 8 │ 32 │ 72 │ 81 │ 2810 │\n", "│ 6 │ 8 │ 7 │ 56 │ 7 │ 32 │ 56 │ 65 │ 1356 │" ] }, "execution_count": 57, "metadata": {}, "output_type": "execute_result" } ], "source": [ "nfe(m) = length(coef(m));\n", "nre1(m) = MixedModels.vsize(first(m.reterms));\n", "nlv1(m) = MixedModels.nlevs(first(m.reterms));\n", "nre2(m) = MixedModels.vsize(last(m.reterms));\n", "nlv2(m) = MixedModels.nlevs(last(m.reterms));\n", "nθ(m) = sum(MixedModels.nθ, m.reterms);\n", "nev = map(m -> m.optsum.feval, mods);\n", "μsev = [42.035, 46.950, 589.784, 134.539, 2248, 1552];\n", "fit = [0.002072, 0.004233, 0.428509, 0.015163, 5.536, 2.423]\n", "dimsumry = DataFrame(p = nfe.(mods), q1 = 
nre1.(mods), n1 = nlv1.(mods),\n", " q2 = nre2.(mods), n2 = nlv2.(mods), nθ = nθ.(mods),\n", " npar = dof.(mods), nev = nev, μsev = μsev, \n", " fit = fit)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this table, `p` is the dimension of the fixed-effects vector, `q1` is the dimension of the random-effects for each of the `n1` levels of the first grouping factor while `q2` and `n2` are similar for the second grouping factor.\n", "`nθ` is the dimension of the parameter vector and `npar` is the total number of parameters being estimated, and is equal to `nθ + p + 1`.\n", "\n", "`nev` is the number of function evaluations to convergence.\n", "Because this number will depend on the number of parameters in the model and several other factors such as the setting of the convergence criteria, only its magnitude should be considered reproducible.\n", "`μsev` is the best-case time, in microseconds, per function evaluation and `fit` is the best-case time, in seconds, to fit the model." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Relationships between model complexity and fitting time" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As would be expected, the time required to fit a model increases with the complexity of the model.\n", "In particular, the number of function evaluations to convergence increases with the number of parameters being optimized." 
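] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The parameter counts in the `dimsumry` table above satisfy the identity `npar = nθ + p + 1`, which can be checked numerically; this is a sketch (not part of the original analysis) that assumes the columns of `dimsumry` are accessible as properties:\n", "\n", "```julia\n", "# npar counts the covariance parameters (nθ), the fixed effects (p),\n", "# and the residual standard deviation (the + 1)\n", "all(dimsumry.nθ .+ dimsumry.p .+ 1 .== dimsumry.npar)\n", "```"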
] }, { "cell_type": "code", "execution_count": 58, "metadata": {}, "outputs": [ { "data": { "image/svg+xml": [ "[scatter plot: Function evaluations to convergence vs. # of parameters in the optimization]\n" ], "text/plain": [ "Plot(...)" ] }, "execution_count": 58, "metadata": {}, "output_type": "execute_result" } ], "source": [ "plot(dimsumry, x=:nθ, y=:nev, Geom.point,\n", "    Guide.xlabel(\"# of parameters in the optimization\"),\n", "    Guide.ylabel(\"Function evaluations to convergence\"))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The relationship is more-or-less linear, based on this small sample.\n", "\n", "The time to evaluate the objective also increases with `nθ`, the number of parameters in the random-effects covariance matrices."
] }, { "cell_type": "code", "execution_count": 59, "metadata": {}, "outputs": [ { "data": { "image/svg+xml": [ "[scatter plot: Microseconds per function evaluation vs. # of parameters in the optimization]\n" ], "text/html": [ "[scatter plot: Microseconds per function evaluation vs. # of parameters in the optimization]\n", " 
\n", " 155\n", " \n", " \n", " \n", " \n", " 160\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", 
" \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 
h,j,k,l,arrows,drag to pan\n", " \n", " \n", " \n", " \n", " i,o,+,-,scroll,shift-drag to zoom\n", " \n", " \n", " \n", " \n", " r,dbl-click to reset\n", " \n", " \n", " \n", " \n", " c for coordinates\n", " \n", " \n", " \n", " \n", " ? for help\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " ?\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " -3000\n", " \n", " \n", " \n", " \n", " -2500\n", " \n", " \n", " \n", " \n", " -2000\n", " \n", " \n", " \n", " \n", " -1500\n", " \n", " \n", " \n", " \n", " -1000\n", " \n", " \n", " \n", " \n", " -500\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 500\n", " \n", " \n", " \n", " \n", " 1000\n", " \n", " \n", " \n", " \n", " 1500\n", " \n", " \n", " \n", " \n", " 2000\n", " \n", " \n", " \n", " \n", " 2500\n", " \n", " \n", " \n", " \n", " 3000\n", " \n", " \n", " \n", " \n", " 3500\n", " \n", " \n", " \n", " \n", " 4000\n", " \n", " \n", " \n", " \n", " 4500\n", " \n", " \n", " \n", " \n", " 5000\n", " \n", " \n", " \n", " \n", " 5500\n", " \n", " \n", " \n", " \n", " -2500\n", " \n", " \n", " \n", " \n", " -2400\n", " \n", " \n", " \n", " \n", " -2300\n", " \n", " \n", " \n", " \n", " -2200\n", " \n", " \n", " \n", " \n", " -2100\n", " \n", " \n", " \n", " \n", " -2000\n", " \n", " \n", " \n", " \n", " -1900\n", " \n", " \n", " \n", " \n", " -1800\n", " \n", " \n", " \n", " \n", " -1700\n", " \n", " \n", " \n", " \n", " -1600\n", " \n", " \n", " \n", " \n", " -1500\n", " \n", " \n", " \n", " \n", " -1400\n", " \n", " \n", " \n", " \n", " -1300\n", " \n", " \n", " \n", " \n", " -1200\n", " \n", " \n", " \n", " \n", " -1100\n", " \n", " \n", " \n", " \n", " -1000\n", " \n", " \n", " \n", " \n", " -900\n", " \n", " \n", " \n", " \n", " -800\n", " \n", " \n", " \n", " \n", " -700\n", " \n", " \n", " \n", " \n", " -600\n", " \n", " \n", " \n", " \n", " -500\n", " \n", " \n", " \n", " \n", " -400\n", " \n", " \n", " \n", " \n", " -300\n", " \n", " \n", " \n", " 
\n", " -200\n", " \n", " \n", " \n", " \n", " -100\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 100\n", " \n", " \n", " \n", " \n", " 200\n", " \n", " \n", " \n", " \n", " 300\n", " \n", " \n", " \n", " \n", " 400\n", " \n", " \n", " \n", " \n", " 500\n", " \n", " \n", " \n", " \n", " 600\n", " \n", " \n", " \n", " \n", " 700\n", " \n", " \n", " \n", " \n", " 800\n", " \n", " \n", " \n", " \n", " 900\n", " \n", " \n", " \n", " \n", " 1000\n", " \n", " \n", " \n", " \n", " 1100\n", " \n", " \n", " \n", " \n", " 1200\n", " \n", " \n", " \n", " \n", " 1300\n", " \n", " \n", " \n", " \n", " 1400\n", " \n", " \n", " \n", " \n", " 1500\n", " \n", " \n", " \n", " \n", " 1600\n", " \n", " \n", " \n", " \n", " 1700\n", " \n", " \n", " \n", " \n", " 1800\n", " \n", " \n", " \n", " \n", " 1900\n", " \n", " \n", " \n", " \n", " 2000\n", " \n", " \n", " \n", " \n", " 2100\n", " \n", " \n", " \n", " \n", " 2200\n", " \n", " \n", " \n", " \n", " 2300\n", " \n", " \n", " \n", " \n", " 2400\n", " \n", " \n", " \n", " \n", " 2500\n", " \n", " \n", " \n", " \n", " 2600\n", " \n", " \n", " \n", " \n", " 2700\n", " \n", " \n", " \n", " \n", " 2800\n", " \n", " \n", " \n", " \n", " 2900\n", " \n", " \n", " \n", " \n", " 3000\n", " \n", " \n", " \n", " \n", " 3100\n", " \n", " \n", " \n", " \n", " 3200\n", " \n", " \n", " \n", " \n", " 3300\n", " \n", " \n", " \n", " \n", " 3400\n", " \n", " \n", " \n", " \n", " 3500\n", " \n", " \n", " \n", " \n", " 3600\n", " \n", " \n", " \n", " \n", " 3700\n", " \n", " \n", " \n", " \n", " 3800\n", " \n", " \n", " \n", " \n", " 3900\n", " \n", " \n", " \n", " \n", " 4000\n", " \n", " \n", " \n", " \n", " 4100\n", " \n", " \n", " \n", " \n", " 4200\n", " \n", " \n", " \n", " \n", " 4300\n", " \n", " \n", " \n", " \n", " 4400\n", " \n", " \n", " \n", " \n", " 4500\n", " \n", " \n", " \n", " \n", " 4600\n", " \n", " \n", " \n", " \n", " 4700\n", " \n", " \n", " \n", " \n", " 4800\n", " \n", " \n", " \n", " \n", " 4900\n", " \n", 
" \n", " \n", " \n", " 5000\n", " \n", " \n", " \n", " \n", " -2500\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 2500\n", " \n", " \n", " \n", " \n", " 5000\n", " \n", " \n", " \n", " \n", " -2600\n", " \n", " \n", " \n", " \n", " -2400\n", " \n", " \n", " \n", " \n", " -2200\n", " \n", " \n", " \n", " \n", " -2000\n", " \n", " \n", " \n", " \n", " -1800\n", " \n", " \n", " \n", " \n", " -1600\n", " \n", " \n", " \n", " \n", " -1400\n", " \n", " \n", " \n", " \n", " -1200\n", " \n", " \n", " \n", " \n", " -1000\n", " \n", " \n", " \n", " \n", " -800\n", " \n", " \n", " \n", " \n", " -600\n", " \n", " \n", " \n", " \n", " -400\n", " \n", " \n", " \n", " \n", " -200\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 200\n", " \n", " \n", " \n", " \n", " 400\n", " \n", " \n", " \n", " \n", " 600\n", " \n", " \n", " \n", " \n", " 800\n", " \n", " \n", " \n", " \n", " 1000\n", " \n", " \n", " \n", " \n", " 1200\n", " \n", " \n", " \n", " \n", " 1400\n", " \n", " \n", " \n", " \n", " 1600\n", " \n", " \n", " \n", " \n", " 1800\n", " \n", " \n", " \n", " \n", " 2000\n", " \n", " \n", " \n", " \n", " 2200\n", " \n", " \n", " \n", " \n", " 2400\n", " \n", " \n", " \n", " \n", " 2600\n", " \n", " \n", " \n", " \n", " 2800\n", " \n", " \n", " \n", " \n", " 3000\n", " \n", " \n", " \n", " \n", " 3200\n", " \n", " \n", " \n", " \n", " 3400\n", " \n", " \n", " \n", " \n", " 3600\n", " \n", " \n", " \n", " \n", " 3800\n", " \n", " \n", " \n", " \n", " 4000\n", " \n", " \n", " \n", " \n", " 4200\n", " \n", " \n", " \n", " \n", " 4400\n", " \n", " \n", " \n", " \n", " 4600\n", " \n", " \n", " \n", " \n", " 4800\n", " \n", " \n", " \n", " \n", " 5000\n", " \n", " \n", " \n", " \n", " \n", " \n", " Microseconds per function evaluation\n", " \n", " \n", " \n", "\n", "\n", " \n", " \n", " \n", "\n", "\n", "\n", "\n" ], "text/plain": [ "Plot(...)" ] }, "execution_count": 59, "metadata": {}, "output_type": "execute_result" } ], "source": [ 
"plot(dimsumry, x=:nθ, y=:μsev, Geom.point, \n", " Guide.xlabel(\"# of parameters in the optimization\"),\n", " Guide.ylabel(\"Microseconds per function evaluation\"))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This relationship also appears linear, based on this small sample.\n", "\n", "Finally, consider the time to fit the model versus the number of parameters in the optimization," ] }, { "cell_type": "code", "execution_count": 60, "metadata": {}, "outputs": [ { "data": { "image/svg+xml": [ "\n", "\n", "\n", " \n", " \n", " \n", "\n", "\n", " \n", " \n", " \n", " # of parameters in the optimization\n", " \n", " \n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 20\n", " \n", " \n", " \n", " \n", " 40\n", " \n", " \n", " \n", " \n", " 60\n", " \n", " \n", " \n", " \n", " 80\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 1\n", " \n", " \n", " \n", " \n", " 2\n", " \n", " \n", " \n", " \n", " 3\n", " \n", " \n", " \n", " \n", " 4\n", " \n", " \n", " \n", " \n", " 5\n", " \n", " \n", " \n", " \n", " 6\n", " \n", " \n", " \n", " \n", " \n", " \n", " Time (s) to fit the model\n", " \n", " \n", " \n", "\n", "\n", " \n", " \n", " \n", "\n", "\n" ], "text/html": [ "\n", "\n", "\n", " \n", " \n", " \n", 
"\n", "\n", " \n", " \n", " \n", " # of parameters in the optimization\n", " \n", " \n", " \n", " \n", " \n", " \n", " -100\n", " \n", " \n", " \n", " \n", " -80\n", " \n", " \n", " \n", " \n", " -60\n", " \n", " \n", " \n", " \n", " -40\n", " \n", " \n", " \n", " \n", " -20\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 20\n", " \n", " \n", " \n", " \n", " 40\n", " \n", " \n", " \n", " \n", " 60\n", " \n", " \n", " \n", " \n", " 80\n", " \n", " \n", " \n", " \n", " 100\n", " \n", " \n", " \n", " \n", " 120\n", " \n", " \n", " \n", " \n", " 140\n", " \n", " \n", " \n", " \n", " 160\n", " \n", " \n", " \n", " \n", " 180\n", " \n", " \n", " \n", " \n", " -80\n", " \n", " \n", " \n", " \n", " -75\n", " \n", " \n", " \n", " \n", " -70\n", " \n", " \n", " \n", " \n", " -65\n", " \n", " \n", " \n", " \n", " -60\n", " \n", " \n", " \n", " \n", " -55\n", " \n", " \n", " \n", " \n", " -50\n", " \n", " \n", " \n", " \n", " -45\n", " \n", " \n", " \n", " \n", " -40\n", " \n", " \n", " \n", " \n", " -35\n", " \n", " \n", " \n", " \n", " -30\n", " \n", " \n", " \n", " \n", " -25\n", " \n", " \n", " \n", " \n", " -20\n", " \n", " \n", " \n", " \n", " -15\n", " \n", " \n", " \n", " \n", " -10\n", " \n", " \n", " \n", " \n", " -5\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 5\n", " \n", " \n", " \n", " \n", " 10\n", " \n", " \n", " \n", " \n", " 15\n", " \n", " \n", " \n", " \n", " 20\n", " \n", " \n", " \n", " \n", " 25\n", " \n", " \n", " \n", " \n", " 30\n", " \n", " \n", " \n", " \n", " 35\n", " \n", " \n", " \n", " \n", " 40\n", " \n", " \n", " \n", " \n", " 45\n", " \n", " \n", " \n", " \n", " 50\n", " \n", " \n", " \n", " \n", " 55\n", " \n", " \n", " \n", " \n", " 60\n", " \n", " \n", " \n", " \n", " 65\n", " \n", " \n", " \n", " \n", " 70\n", " \n", " \n", " \n", " \n", " 75\n", " \n", " \n", " \n", " \n", " 80\n", " \n", " \n", " \n", " \n", " 85\n", " \n", " \n", " \n", " \n", " 90\n", " \n", " \n", " \n", " \n", " 95\n", 
" \n", " \n", " \n", " \n", " 100\n", " \n", " \n", " \n", " \n", " 105\n", " \n", " \n", " \n", " \n", " 110\n", " \n", " \n", " \n", " \n", " 115\n", " \n", " \n", " \n", " \n", " 120\n", " \n", " \n", " \n", " \n", " 125\n", " \n", " \n", " \n", " \n", " 130\n", " \n", " \n", " \n", " \n", " 135\n", " \n", " \n", " \n", " \n", " 140\n", " \n", " \n", " \n", " \n", " 145\n", " \n", " \n", " \n", " \n", " 150\n", " \n", " \n", " \n", " \n", " 155\n", " \n", " \n", " \n", " \n", " 160\n", " \n", " \n", " \n", " \n", " -100\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 100\n", " \n", " \n", " \n", " \n", " 200\n", " \n", " \n", " \n", " \n", " -80\n", " \n", " \n", " \n", " \n", " -75\n", " \n", " \n", " \n", " \n", " -70\n", " \n", " \n", " \n", " \n", " -65\n", " \n", " \n", " \n", " \n", " -60\n", " \n", " \n", " \n", " \n", " -55\n", " \n", " \n", " \n", " \n", " -50\n", " \n", " \n", " \n", " \n", " -45\n", " \n", " \n", " \n", " \n", " -40\n", " \n", " \n", " \n", " \n", " -35\n", " \n", " \n", " \n", " \n", " -30\n", " \n", " \n", " \n", " \n", " -25\n", " \n", " \n", " \n", " \n", " -20\n", " \n", " \n", " \n", " \n", " -15\n", " \n", " \n", " \n", " \n", " -10\n", " \n", " \n", " \n", " \n", " -5\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 5\n", " \n", " \n", " \n", " \n", " 10\n", " \n", " \n", " \n", " \n", " 15\n", " \n", " \n", " \n", " \n", " 20\n", " \n", " \n", " \n", " \n", " 25\n", " \n", " \n", " \n", " \n", " 30\n", " \n", " \n", " \n", " \n", " 35\n", " \n", " \n", " \n", " \n", " 40\n", " \n", " \n", " \n", " \n", " 45\n", " \n", " \n", " \n", " \n", " 50\n", " \n", " \n", " \n", " \n", " 55\n", " \n", " \n", " \n", " \n", " 60\n", " \n", " \n", " \n", " \n", " 65\n", " \n", " \n", " \n", " \n", " 70\n", " \n", " \n", " \n", " \n", " 75\n", " \n", " \n", " \n", " \n", " 80\n", " \n", " \n", " \n", " \n", " 85\n", " \n", " \n", " \n", " \n", " 90\n", " \n", " \n", " \n", " \n", " 95\n", " \n", " 
\n", " \n", " \n", " 100\n", " \n", " \n", " \n", " \n", " 105\n", " \n", " \n", " \n", " \n", " 110\n", " \n", " \n", " \n", " \n", " 115\n", " \n", " \n", " \n", " \n", " 120\n", " \n", " \n", " \n", " \n", " 125\n", " \n", " \n", " \n", " \n", " 130\n", " \n", " \n", " \n", " \n", " 135\n", " \n", " \n", " \n", " \n", " 140\n", " \n", " \n", " \n", " \n", " 145\n", " \n", " \n", " \n", " \n", " 150\n", " \n", " \n", " \n", " \n", " 155\n", " \n", " \n", " \n", " \n", " 160\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", 
" \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " 
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " h,j,k,l,arrows,drag to pan\n", " \n", " \n", " \n", " \n", " i,o,+,-,scroll,shift-drag to zoom\n", " \n", " \n", " \n", " \n", " r,dbl-click to reset\n", " \n", " \n", " \n", " \n", " c for coordinates\n", " \n", " \n", " \n", " \n", " ? for help\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " ?\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " -7\n", " \n", " \n", " \n", " \n", " -6\n", " \n", " \n", " \n", " \n", " -5\n", " \n", " \n", " \n", " \n", " -4\n", " \n", " \n", " \n", " \n", " -3\n", " \n", " \n", " \n", " \n", " -2\n", " \n", " \n", " \n", " \n", " -1\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 1\n", " \n", " \n", " \n", " \n", " 2\n", " \n", " \n", " \n", " \n", " 3\n", " \n", " \n", " \n", " \n", " 4\n", " \n", " \n", " \n", " \n", " 5\n", " \n", " \n", " \n", " \n", " 6\n", " \n", " \n", " \n", " \n", " 7\n", " \n", " \n", " \n", " \n", " 8\n", " \n", " \n", " \n", " \n", " 9\n", " \n", " \n", " \n", " \n", " 10\n", " \n", " \n", " \n", " \n", " 11\n", " \n", " \n", " \n", " \n", " 12\n", " \n", " \n", " \n", " \n", " 13\n", " \n", " \n", " \n", " \n", " -6.0\n", " \n", " \n", " \n", " \n", " -5.8\n", " \n", " \n", " \n", " \n", " -5.6\n", " 
\n", " \n", " \n", " \n", " -5.4\n", " \n", " \n", " \n", " \n", " -5.2\n", " \n", " \n", " \n", " \n", " -5.0\n", " \n", " \n", " \n", " \n", " -4.8\n", " \n", " \n", " \n", " \n", " -4.6\n", " \n", " \n", " \n", " \n", " -4.4\n", " \n", " \n", " \n", " \n", " -4.2\n", " \n", " \n", " \n", " \n", " -4.0\n", " \n", " \n", " \n", " \n", " -3.8\n", " \n", " \n", " \n", " \n", " -3.6\n", " \n", " \n", " \n", " \n", " -3.4\n", " \n", " \n", " \n", " \n", " -3.2\n", " \n", " \n", " \n", " \n", " -3.0\n", " \n", " \n", " \n", " \n", " -2.8\n", " \n", " \n", " \n", " \n", " -2.6\n", " \n", " \n", " \n", " \n", " -2.4\n", " \n", " \n", " \n", " \n", " -2.2\n", " \n", " \n", " \n", " \n", " -2.0\n", " \n", " \n", " \n", " \n", " -1.8\n", " \n", " \n", " \n", " \n", " -1.6\n", " \n", " \n", " \n", " \n", " -1.4\n", " \n", " \n", " \n", " \n", " -1.2\n", " \n", " \n", " \n", " \n", " -1.0\n", " \n", " \n", " \n", " \n", " -0.8\n", " \n", " \n", " \n", " \n", " -0.6\n", " \n", " \n", " \n", " \n", " -0.4\n", " \n", " \n", " \n", " \n", " -0.2\n", " \n", " \n", " \n", " \n", " 0.0\n", " \n", " \n", " \n", " \n", " 0.2\n", " \n", " \n", " \n", " \n", " 0.4\n", " \n", " \n", " \n", " \n", " 0.6\n", " \n", " \n", " \n", " \n", " 0.8\n", " \n", " \n", " \n", " \n", " 1.0\n", " \n", " \n", " \n", " \n", " 1.2\n", " \n", " \n", " \n", " \n", " 1.4\n", " \n", " \n", " \n", " \n", " 1.6\n", " \n", " \n", " \n", " \n", " 1.8\n", " \n", " \n", " \n", " \n", " 2.0\n", " \n", " \n", " \n", " \n", " 2.2\n", " \n", " \n", " \n", " \n", " 2.4\n", " \n", " \n", " \n", " \n", " 2.6\n", " \n", " \n", " \n", " \n", " 2.8\n", " \n", " \n", " \n", " \n", " 3.0\n", " \n", " \n", " \n", " \n", " 3.2\n", " \n", " \n", " \n", " \n", " 3.4\n", " \n", " \n", " \n", " \n", " 3.6\n", " \n", " \n", " \n", " \n", " 3.8\n", " \n", " \n", " \n", " \n", " 4.0\n", " \n", " \n", " \n", " \n", " 4.2\n", " \n", " \n", " \n", " \n", " 4.4\n", " \n", " \n", " \n", " \n", " 4.6\n", " \n", " \n", " \n", " \n", " 
4.8\n", " \n", " \n", " \n", " \n", " 5.0\n", " \n", " \n", " \n", " \n", " 5.2\n", " \n", " \n", " \n", " \n", " 5.4\n", " \n", " \n", " \n", " \n", " 5.6\n", " \n", " \n", " \n", " \n", " 5.8\n", " \n", " \n", " \n", " \n", " 6.0\n", " \n", " \n", " \n", " \n", " 6.2\n", " \n", " \n", " \n", " \n", " 6.4\n", " \n", " \n", " \n", " \n", " 6.6\n", " \n", " \n", " \n", " \n", " 6.8\n", " \n", " \n", " \n", " \n", " 7.0\n", " \n", " \n", " \n", " \n", " 7.2\n", " \n", " \n", " \n", " \n", " 7.4\n", " \n", " \n", " \n", " \n", " 7.6\n", " \n", " \n", " \n", " \n", " 7.8\n", " \n", " \n", " \n", " \n", " 8.0\n", " \n", " \n", " \n", " \n", " 8.2\n", " \n", " \n", " \n", " \n", " 8.4\n", " \n", " \n", " \n", " \n", " 8.6\n", " \n", " \n", " \n", " \n", " 8.8\n", " \n", " \n", " \n", " \n", " 9.0\n", " \n", " \n", " \n", " \n", " 9.2\n", " \n", " \n", " \n", " \n", " 9.4\n", " \n", " \n", " \n", " \n", " 9.6\n", " \n", " \n", " \n", " \n", " 9.8\n", " \n", " \n", " \n", " \n", " 10.0\n", " \n", " \n", " \n", " \n", " 10.2\n", " \n", " \n", " \n", " \n", " 10.4\n", " \n", " \n", " \n", " \n", " 10.6\n", " \n", " \n", " \n", " \n", " 10.8\n", " \n", " \n", " \n", " \n", " 11.0\n", " \n", " \n", " \n", " \n", " 11.2\n", " \n", " \n", " \n", " \n", " 11.4\n", " \n", " \n", " \n", " \n", " 11.6\n", " \n", " \n", " \n", " \n", " 11.8\n", " \n", " \n", " \n", " \n", " 12.0\n", " \n", " \n", " \n", " \n", " -10\n", " \n", " \n", " \n", " \n", " 0\n", " \n", " \n", " \n", " \n", " 10\n", " \n", " \n", " \n", " \n", " 20\n", " \n", " \n", " \n", " \n", " -6.0\n", " \n", " \n", " \n", " \n", " -5.5\n", " \n", " \n", " \n", " \n", " -5.0\n", " \n", " \n", " \n", " \n", " -4.5\n", " \n", " \n", " \n", " \n", " -4.0\n", " \n", " \n", " \n", " \n", " -3.5\n", " \n", " \n", " \n", " \n", " -3.0\n", " \n", " \n", " \n", " \n", " -2.5\n", " \n", " \n", " \n", " \n", " -2.0\n", " \n", " \n", " \n", " \n", " -1.5\n", " \n", " \n", " \n", " \n", " -1.0\n", " \n", " \n", " \n", " \n", " 
-0.5\n", " \n", " \n", " \n", " \n", " 0.0\n", " \n", " \n", " \n", " \n", " 0.5\n", " \n", " \n", " \n", " \n", " 1.0\n", " \n", " \n", " \n", " \n", " 1.5\n", " \n", " \n", " \n", " \n", " 2.0\n", " \n", " \n", " \n", " \n", " 2.5\n", " \n", " \n", " \n", " \n", " 3.0\n", " \n", " \n", " \n", " \n", " 3.5\n", " \n", " \n", " \n", " \n", " 4.0\n", " \n", " \n", " \n", " \n", " 4.5\n", " \n", " \n", " \n", " \n", " 5.0\n", " \n", " \n", " \n", " \n", " 5.5\n", " \n", " \n", " \n", " \n", " 6.0\n", " \n", " \n", " \n", " \n", " 6.5\n", " \n", " \n", " \n", " \n", " 7.0\n", " \n", " \n", " \n", " \n", " 7.5\n", " \n", " \n", " \n", " \n", " 8.0\n", " \n", " \n", " \n", " \n", " 8.5\n", " \n", " \n", " \n", " \n", " 9.0\n", " \n", " \n", " \n", " \n", " 9.5\n", " \n", " \n", " \n", " \n", " 10.0\n", " \n", " \n", " \n", " \n", " 10.5\n", " \n", " \n", " \n", " \n", " 11.0\n", " \n", " \n", " \n", " \n", " 11.5\n", " \n", " \n", " \n", " \n", " 12.0\n", " \n", " \n", " \n", " \n", " \n", " \n", " Time (s) to fit the model\n", " \n", " \n", " \n", "\n", "\n", " \n", " \n", " \n", "\n", "\n", "\n", "\n" ], "text/plain": [ "Plot(...)" ] }, "execution_count": 60, "metadata": {}, "output_type": "execute_result" } ], "source": [ "plot(dimsumry, x=:nθ, y=:fit, Geom.point, \n", " Guide.xlabel(\"# of parameters in the optimization\"),\n", " Guide.ylabel(\"Time (s) to fit the model\"))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "which, given the shape of the previous plots, would be expected to be approximately quadratic." ] } ], "metadata": { "kernelspec": { "display_name": "Julia 1.1.0", "language": "julia", "name": "julia-1.1" }, "language_info": { "file_extension": ".jl", "mimetype": "application/julia", "name": "julia", "version": "1.1.0" } }, "nbformat": 4, "nbformat_minor": 2 }