---
title: 'Vectors (BV Chapters 1, 3, 5)'
subtitle: Biostat 216
author: Dr. Hua Zhou @ UCLA
date: today
format:
  html:
    theme: cosmo
    embed-resources: true
    number-sections: true
    toc: true
    toc-depth: 4
    toc-location: left
    code-fold: false
    link-external-icon: true
    link-external-newwindow: true
comments:
  hypothesis: true
jupyter:
  jupytext:
    formats: 'ipynb,qmd'
    text_representation:
      extension: .qmd
      format_name: quarto
      format_version: '1.0'
      jupytext_version: 1.15.2
  kernelspec:
    display_name: Julia 1.10.5
    language: julia
    name: julia-1.10
---

System information (for reproducibility):

```{julia}
versioninfo()
```

Load packages:

```{julia}
using Pkg
Pkg.activate(pwd())
Pkg.instantiate()
Pkg.status()
```

```{julia}
using BenchmarkTools, LaTeXStrings, LinearAlgebra, Plots, Random, Statistics

Random.seed!(216)
```

# Vectors (BV Chapters 1, 3, 5)

## Notation

- A (column) vector $\mathbf{x} \in \mathbb{R}^n$:
$$
\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = (x_1, x_2, \ldots, x_n).
$$

- $n$ is the **dimension** (or **length** or **size**) of the vector.

- $x_i$ is the $i$-th **entry** (or **element** or **component** or **coefficient**) of $\mathbf{x}$.

- A vector of length $n$ is called an **$n$-vector**.

- A row vector $\mathbf{x}' = (x_1 \, x_2 \, \ldots \, x_n)$ (without commas).

```{julia}
# a (column) vector in Julia
x = [1, -1, 2, 0, 3]
```

```{julia}
# row vector
x'
```

- Subvector: $\mathbf{x}_{2:4} = \begin{pmatrix} x_2 \\ x_3 \\ x_4 \end{pmatrix}$.

```{julia}
x[2:4]
```

- Stacked vector:
$$
\mathbf{x} = \begin{pmatrix} \mathbf{y} \\ \mathbf{z} \end{pmatrix} = (\mathbf{y}, \mathbf{z}) = \begin{pmatrix} y_1 \\ \vdots \\ y_m \\ z_1 \\ \vdots \\ z_n \end{pmatrix}.
$$

```{julia}
y = [1, 2, 3]
z = [4, 5]
x = [y; z]
```

## Some special vectors

- Zero vector:
$$
\mathbf{0}_n = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}.
$$
The subscript denotes the length of the vector; it is sometimes omitted when obvious from context.

```{julia}
zeros(5)
```

- One vector:
$$
\mathbf{1}_n = \begin{pmatrix} 1 \\ \vdots \\ 1 \end{pmatrix}.
$$

```{julia}
ones(5)
```

- The unit vector (or elementary basis vector) $\mathbf{e}_i$ has all zero entries except the $i$-th entry, which is 1. The length of $\mathbf{e}_i$ is often implied by context.
$$
\mathbf{e}_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad \mathbf{e}_2 = \begin{pmatrix} 0 \\ 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad \ldots, \quad \mathbf{e}_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ 1 \end{pmatrix}.
$$

```{julia}
# define a unit vector by comprehension
e3 = [i == 3 ? 1 : 0 for i in 1:5]
```

## Vector operations

- **Vector addition** (or **elementwise addition**) and **vector subtraction** (or **elementwise subtraction**). For two vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ (of the same length),
$$
\mathbf{x} + \mathbf{y} = \begin{pmatrix} x_1 + y_1 \\ \vdots \\ x_n + y_n \end{pmatrix}, \quad \mathbf{x} - \mathbf{y} = \begin{pmatrix} x_1 - y_1 \\ \vdots \\ x_n - y_n \end{pmatrix}.
$$

```{julia}
x = [1, 2, 3, 4, 5]
y = [6, 7, 8, 9, 10]
x + y
```

```{julia}
x - y
```

- **Scalar-vector multiplication**. For a scalar $\alpha \in \mathbb{R}$ and a vector $\mathbf{x} \in \mathbb{R}^n$,
$$
\alpha \mathbf{x} = \begin{pmatrix} \alpha x_1 \\ \alpha x_2 \\ \vdots \\ \alpha x_n \end{pmatrix}.
$$

```{julia}
α = 0.5
x = [1, 2, 3, 4, 5]
α * x
```

- **Elementwise multiplication** or **Hadamard product**.
For two vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ (of the same length),
$$
\mathbf{x} \circ \mathbf{y} = \begin{pmatrix} x_1 y_1 \\ \vdots \\ x_n y_n \end{pmatrix}.
$$

```{julia}
# in Julia, a dot operation is an elementwise operation
x = [1, 2, 3, 4, 5]
y = [6, 7, 8, 9, 10]
x .* y
```

- For scalars $\alpha_1, \ldots, \alpha_k \in \mathbb{R}$ and vectors $\mathbf{x}_1, \ldots, \mathbf{x}_k \in \mathbb{R}^n$, the **linear combination**
$$
\sum_{i=1}^k \alpha_i \mathbf{x}_i = \alpha_1 \mathbf{x}_1 + \cdots + \alpha_k \mathbf{x}_k
$$
is a sum of scalar-vector products.

```{julia}
x = [1, 2, 3, 4, 5]
y = [6, 7, 8, 9, 10]
1 * x + 0.5 * y
```

- **Vector transpose**:
$$
\mathbf{x}' = \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}' = (x_1 \, \ldots \, x_n).
$$
Properties:
    1. $(\mathbf{x}')' = \mathbf{x}$.
    2. Transposition is a linear operator: $(\alpha \mathbf{x} + \beta \mathbf{y})' = \alpha \mathbf{x}' + \beta \mathbf{y}'$.

    Superscript $^t$ or $^T$ is also commonly used to denote the transpose: $\mathbf{x}^t$, $\mathbf{x}^T$.

- The **inner product** or **dot product** between two vectors $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ is
$$
\mathbf{x}' \mathbf{y} = (x_1 \, \ldots \, x_n) \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} = x_1 y_1 + \cdots + x_n y_n.
$$
Other notations for the inner product: $\langle \mathbf{x}, \mathbf{y} \rangle$, $\sum_{i=1}^n x_i y_i$.

```{julia}
x = [1, 2, 3, 4, 5]
y = [6, 7, 8, 9, 10]
x'y
```

```{julia}
# the dot function is available from the standard library LinearAlgebra
dot(x, y)
```

Properties of the inner product:

1. $\mathbf{x}' \mathbf{x} = x_1^2 + \cdots + x_n^2 \ge 0$.
2. $\mathbf{x}' \mathbf{x} = 0$ if and only if $\mathbf{x} = \mathbf{0}$.
3. Commutative: $\mathbf{x}' \mathbf{y} = \mathbf{y}' \mathbf{x}$.
4. Associative with scalar multiplication: $(\alpha \mathbf{x})' \mathbf{y} = \alpha (\mathbf{x}' \mathbf{y}) = \mathbf{x}' (\alpha \mathbf{y})$.
5. Distributive with vector addition: $(\mathbf{x} + \mathbf{y})' \mathbf{z} = \mathbf{x}' \mathbf{z} + \mathbf{y}' \mathbf{z}$.

Examples of inner products. **In-class exercises**: express each of the following quantities as an inner product of a vector $\mathbf{x} \in \mathbb{R}^n$ with another vector.

1. Individual component: $x_i = \rule[-0.1cm]{1cm}{0.15mm}$.
2. Differencing: $x_i - x_j = \rule[-0.1cm]{1cm}{0.15mm}$.
3. Sum: $x_1 + \cdots + x_n = \rule[-0.1cm]{1cm}{0.15mm}$.
4. Average: $(x_1 + \cdots + x_n) / n = \rule[-0.1cm]{1cm}{0.15mm}$.

## Computer representation of numbers and vectors

- Real numbers are represented as **floating point numbers** in computers.

- (Double precision) floating point numbers approximate real numbers to an accuracy of 15-16 digits.

- One (double precision) floating point number takes 64 bits (0s and 1s), or 8 bytes.
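We can check these facts directly in Julia: `bitstring` shows the 64 bits of a `Float64`, `sizeof` reports storage in bytes, and `eps` gives the relative machine precision (about $2.2 \times 10^{-16}$, i.e., 15-16 significant digits).

```{julia}
# one Float64 takes 64 bits = 8 bytes
@show bitstring(1.0)
@show sizeof(Float64)
# a vector of 1000 Float64s takes 8 × 1000 bytes
@show sizeof(randn(1000))
# relative accuracy of double precision (about 15-16 significant digits)
@show eps(Float64);
```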
*(Figure: bit layout of a double-precision floating point number.)*

- An $n$-vector requires $8n$ bytes to store.

## Computational complexity of vector operations

- One **floating point operation** (flop) is one basic arithmetic operation in $\mathbb{R}$ or $\mathbb{C}$: $+, -, *, /, \sqrt{\cdot}, \ldots$

- The **flop count** or **operation count** is the total number of flops in an algorithm.

- A (very) crude predictor of the run time of an algorithm is
$$
\text{run time} \approx \frac{\text{flop count}}{\text{computer speed (flops/second)}}.
$$

- **Dominant term**: the highest-order term in the flop count.
$$
\frac 13 n^3 + 100 n^2 + 10n + 5 \approx \frac 13 n^3.
$$

- **Order**: the power in the dominant term.
$$
\frac 13 n^3 + 100 n^2 + 10n + 5 = \text{order } n^3 = O(n^3).
$$

- **In-class exercises**. Assume all vectors are of length $n$. Give the flop counts of the following operations.
    1. Sum the elements of a vector: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.
    2. Vector addition and subtraction: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.
    3. Scalar multiplication: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.
    4. Elementwise multiplication: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.
    5. Inner product: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.
    6. Norm of a vector: $\rule[-0.1cm]{1cm}{0.15mm}$ flops.

- These vector operations are all order $n$ algorithms.

```{julia}
# info about my computer
versioninfo(verbose = true)
```

Assume that one Apple M2 performance core can do 4 double-precision flops per CPU cycle (?) at 2.4 GHz (cycles/second). Then the theoretical throughput of a single performance core on my laptop is
$$
4 \times 2.4 \times 10^9 = 9.6 \times 10^9 \text{ flops/second} = 9.6 \text{ GFLOPS}
$$
in double precision. I estimate my computer takes about
$$
\frac{10^7 - 1}{9.6 \times 10^9} \approx 0.001042 \text{ seconds} = 1.042 \text{ milliseconds} = 1042 \text{ microseconds}
$$
to sum a vector of length $n = 10^7$ using a single performance core.

```{julia}
# the actual run time
n = 10^7
x = randn(n)
sum(x) # compile
@time sum(x);
```

```{julia}
@benchmark sum($x)
```

## Norm, distance, angle

- The **Euclidean norm** or **L2 norm** of a vector $\mathbf{x} \in \mathbb{R}^n$ is
$$
\|\mathbf{x}\| = \|\mathbf{x}\|_2 = (\mathbf{x}'\mathbf{x})^{1/2} = \sqrt{x_1^2 + \cdots + x_n^2}.
$$
The Euclidean/L2 norm captures the Euclidean length of the vector.

```{julia}
#| code-fold: true
x = [2, 1]

plot([0; x[1]], [0; x[2]], arrow = true, color = :blue, legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [((x .+ 0.3)..., L"x=(%$(x[1]), %$(x[2]))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal)
```

- The **L1 norm** of a vector $\mathbf{x} \in \mathbb{R}^n$ is
$$
\|\mathbf{x}\|_1 = |x_1| + \cdots + |x_n|.
$$
Also known as the **Manhattan distance** or **taxicab norm**. The L1 norm is the distance you have to travel from the origin $\mathbf{0}_n$ to the destination $\mathbf{x} = (x_1, \ldots, x_n)$, in a way that resembles how a taxicab drives between city blocks to arrive at its destination.

```{julia}
#| code-fold: true
x = [2, 1]

plot([0 x[1]; x[1] x[1]], [0 0; 0 x[2]], arrow = true, color = :blue, legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [((x .+ 0.3)..., L"x=(%$(x[1]), %$(x[2]))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal)
```

- Properties of the L2 norm.
    1. Positive definiteness: $\|\mathbf{x}\| \ge 0$ for any vector $\mathbf{x}$. $\|\mathbf{x}\| = 0$ if and only if $\mathbf{x}=\mathbf{0}$.
    2. Homogeneity: $\|\alpha \mathbf{x}\| = |\alpha| \|\mathbf{x}\|$ for any scalar $\alpha$ and vector $\mathbf{x}$.
    3. Triangle inequality: $\|\mathbf{x} + \mathbf{y}\| \le \|\mathbf{x}\| + \|\mathbf{y}\|$ for any $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$. Proof: use the Cauchy-Schwarz inequality. TODO in class.
    4. **Cauchy-Schwarz inequality**: $|\mathbf{x}' \mathbf{y}| \le \|\mathbf{x}\| \|\mathbf{y}\|$ for any $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$. The equality holds when (1) $\mathbf{x} = \mathbf{0}$ or $\mathbf{y}=\mathbf{0}$, or (2) $\mathbf{x} \ne \mathbf{0}$, $\mathbf{y} \ne \mathbf{0}$, and $\mathbf{x} = \alpha \mathbf{y}$ for some $\alpha \ne 0$.
    Proof (assume $\mathbf{y} \ne \mathbf{0}$; the case $\mathbf{y} = \mathbf{0}$ is trivial): The function $f(t) = \|\mathbf{x} - t \mathbf{y}\|_2^2 = \|\mathbf{x}\|_2^2 - 2t (\mathbf{x}' \mathbf{y}) + t^2\|\mathbf{y}\|_2^2$ is minimized at $t^\star = (\mathbf{x}'\mathbf{y}) / \|\mathbf{y}\|^2$ with minimal value $0 \le f(t^\star) = \|\mathbf{x}\|^2 - (\mathbf{x}'\mathbf{y})^2 / \|\mathbf{y}\|^2$. Rearranging gives $(\mathbf{x}'\mathbf{y})^2 \le \|\mathbf{x}\|^2 \|\mathbf{y}\|^2$.

    There are at least 5 other proofs of the Cauchy-Schwarz inequality on [Wikipedia](https://en.wikipedia.org/wiki/Cauchy%E2%80%93Schwarz_inequality).

The first three properties are the defining properties of any vector norm.

**Homework 1 (BV 3.5):** Show the three properties for the L1 norm.

**Homework 1 (BV 3.5):** Show the three properties for the L-infinity norm: $\|\mathbf{x}\|_\infty = \max_i |x_i|$.

**In-class exercise:** Which property is violated by the L0 "norm" $\|\mathbf{x}\|_0 = \sum_i \mathbb{1}_{x_i \ne 0} = \text{number of non-zero elements in } \mathbf{x}$?

```{julia}
#| code-fold: true
x = [2, 1]
y = [0.5, 1]

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = :blue, legend = :none,
    xlims = (-3, 3), ylims = (-1, 3),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] + 0.2, L"y=(%$(y[1]), %$(y[2]))'")],
    xticks = -3:1:3, yticks = -1:1:3,
    framestyle = :origin, aspect_ratio = :equal)
plot!([x[1] 0; x[1] + y[1] x[1] + y[1]], [x[2] 0; x[2] + y[2] x[2] + y[2]],
    arrow = true, linestyle = :dash, color = :blue, linealpha = 0.5,
    annotations = [(x[1] + y[1] - 0.2, x[2] + y[2] + 0.2,
        L"x+y=(%$(x[1] + y[1]), %$(x[2] + y[2]))'")])
```

```{julia}
# check the triangle inequality on random vectors
@show x = randn(5)
@show y = randn(5)
@show norm(x + y)
@show norm(x) + norm(y)
@show norm(x + y) ≤ norm(x) + norm(y)
```

```{julia}
# check the Cauchy-Schwarz inequality on random vectors
@show x = randn(5)
@show y = randn(5)
@show abs(x'y)
@show norm(x) * norm(y)
@show abs(x'y) ≤ norm(x) * norm(y)
```

- The **(Euclidean) distance** between vectors $\mathbf{x}$ and $\mathbf{y}$ is defined as $\|\mathbf{x} - \mathbf{y}\|$.

- Properties of distances.
    1. Nonnegativity: $\|\mathbf{x} - \mathbf{y}\| \ge 0$ for all $\mathbf{x}$ and $\mathbf{y}$. And $\|\mathbf{x} - \mathbf{y}\| = 0$ if and only if $\mathbf{x} = \mathbf{y}$.
    2. Triangle inequality: $\|\mathbf{x} - \mathbf{y}\| \le \|\mathbf{x} - \mathbf{z}\| + \|\mathbf{z} - \mathbf{y}\|$. Proof: TODO.

- The **average** of a vector $\mathbf{x}$ is
$$
\operatorname{avg}(\mathbf{x}) = \bar{\mathbf{x}} = \frac{x_1 + \cdots + x_n}{n} = \frac{\mathbf{1}' \mathbf{x}}{n}.
$$

- The **root mean square** (RMS) of a vector is
$$
\operatorname{rms}(\mathbf{x}) = \sqrt{\frac{x_1^2 + \cdots + x_n^2}{n}} = \frac{\|\mathbf{x}\|}{\sqrt n}.
$$

- The **standard deviation** of a vector $\mathbf{x}$ is
$$
\operatorname{std}(\mathbf{x}) = \sqrt{\frac{(x_1 - \bar{\mathbf{x}})^2 + \cdots + (x_n - \bar{\mathbf{x}})^2}{n}} = \operatorname{rms}(\mathbf{x} - \bar{\mathbf{x}} \mathbf{1}) = \frac{\|\mathbf{x} - (\mathbf{1}' \mathbf{x} / n) \mathbf{1}\|}{\sqrt n}.
$$

- Theorem: $\operatorname{avg}(\mathbf{x})^2 + \operatorname{std}(\mathbf{x})^2 = \operatorname{rms}(\mathbf{x})^2$. This result underlies the famous _bias-variance tradeoff_ in statistics. Proof: HW1. Hint: $\operatorname{std}(\mathbf{x})^2 = \frac{\|\mathbf{x} - (\mathbf{1}' \mathbf{x} / n) \mathbf{1}\|^2}{n} = ... = \operatorname{rms}(\mathbf{x})^2 - \operatorname{avg}(\mathbf{x})^2$.
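These summary statistics can be computed directly from the vector formulas above. A quick numerical check against the built-in `mean` and `std` (with `corrected = false`, so that `std` divides by $n$ as in the definition above):

```{julia}
x = randn(10)
n = length(x)
# average as an inner product with the ones vector
@show avg = ones(n)'x / n
@show avg ≈ mean(x)
# RMS from the L2 norm
@show rms = norm(x) / sqrt(n)
# standard deviation is the RMS of the de-meaned vector
@show norm(x .- avg) / sqrt(n) ≈ std(x, corrected = false);
```

The next cell checks the theorem itself numerically.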
```{julia}
x = randn(5)
@show mean(x)^2 + std(x, corrected = false)^2
@show norm(x)^2 / length(x)
# floating point arithmetic is not exact
@show mean(x)^2 + std(x, corrected = false)^2 ≈ norm(x)^2 / length(x)
```

- The **angle** between two nonzero vectors $\mathbf{x}, \mathbf{y}$ is
$$
\theta = \angle (\mathbf{x}, \mathbf{y}) = \operatorname{arccos} \left(\frac{\mathbf{x}'\mathbf{y}}{\|\mathbf{x}\|\|\mathbf{y}\|}\right).
$$
This is the unique value of $\theta \in [0, \pi]$ that satisfies
$$
\cos \theta = \frac{\mathbf{x}'\mathbf{y}}{\|\mathbf{x}\|\|\mathbf{y}\|}.
$$

- The Cauchy-Schwarz inequality guarantees that
$$
-1 \le \cos \theta = \frac{\mathbf{x}'\mathbf{y}}{\|\mathbf{x}\|\|\mathbf{y}\|} \le 1.
$$

- Terminology:

| Angle | Sign of inner product | Terminology |
|--------------------------|---------------------------------------------------------|-----------------------------------------------------------------------------------|
| $\theta = 0$ | $\mathbf{x}'\mathbf{y}=\|\mathbf{x}\|\|\mathbf{y}\|$ | $\mathbf{x}$ and $\mathbf{y}$ are aligned or parallel |
| $0 \le \theta < \pi/2$ | $\mathbf{x}'\mathbf{y} > 0$ | $\mathbf{x}$ and $\mathbf{y}$ make an acute angle |
| $\theta = \pi / 2$ | $\mathbf{x}'\mathbf{y} = 0$ | $\mathbf{x}$ and $\mathbf{y}$ are **orthogonal**, $\mathbf{x} \perp \mathbf{y}$ |
| $\pi/2 < \theta \le \pi$ | $\mathbf{x}'\mathbf{y} < 0$ | $\mathbf{x}$ and $\mathbf{y}$ make an obtuse angle |
| $\theta = \pi$ | $\mathbf{x}'\mathbf{y} = -\|\mathbf{x}\|\|\mathbf{y}\|$ | $\mathbf{x}$ and $\mathbf{y}$ are anti-aligned or opposed |

```{julia}
#| code-fold: true
x = [2, 1]
y = 0.5x

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = [:blue, :purple], legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] + 0.2, L"y=(%$(y[1]), %$(y[2]))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal,
    title = L"$x$ and $y$ are aligned or parallel: $x'y = %$(dot(x, y)) = \|\|x\|\| \|\|y\|\|$")
```

```{julia}
#| code-fold: true
x = [2, 1]
θ = π/4
A = [cos(θ) -sin(θ); sin(θ) cos(θ)]
y = 0.5(A * x)

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = [:blue, :purple], legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] + 0.2, L"y=(%$(round(y[1], digits=1)), %$(round(y[2], digits=1)))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal,
    title = L"$x$ and $y$ make an acute angle: $0 < x'y = %$(round(dot(x, y), digits=1)) < %$(round(norm(x) * norm(y), digits=1)) = \|\|x\|\| \|\|y\|\|$")
```

```{julia}
#| code-fold: true
x = [2, 1]
θ = π/2
A = [cos(θ) -sin(θ); sin(θ) cos(θ)]
y = 0.5(A * x)

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = [:blue, :purple], legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] + 0.2, L"y=(%$(round(y[1], digits=1)), %$(round(y[2], digits=1)))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal,
    title = L"$x$ is orthogonal to $y$ ($x \perp y$): $x'y = %$(round(dot(x, y), digits=1))$")
```

```{julia}
#| code-fold: true
x = [2, 1]
θ = (3/4)π
A = [cos(θ) -sin(θ); sin(θ) cos(θ)]
y = 0.5(A * x)

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = [:blue, :purple], legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] + 0.2, L"y=(%$(round(y[1], digits=1)), %$(round(y[2], digits=1)))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal,
    title = L"$x$ and $y$ make an obtuse angle: $- \|\|x\|\| \|\|y\|\| = -%$(round(norm(x) * norm(y), digits=1)) < %$(round(dot(x, y), digits=1)) = x'y < 0$")
```

```{julia}
#| code-fold: true
x = [2, 1]
θ = π
A = [cos(θ) -sin(θ); sin(θ) cos(θ)]
y = 0.5(A * x)

plot([0 0; x[1] y[1]], [0 0; x[2] y[2]], arrow = true, color = [:blue, :purple], legend = :none,
    xlims = (-3, 3), ylims = (-2, 2),
    annotations = [(x[1] + 0.6, x[2], L"x=(%$(x[1]), %$(x[2]))'"),
        (y[1] - 0.25, y[2] - 0.2, L"y=(%$(round(y[1], digits=1)), %$(round(y[2], digits=1)))'")],
    xticks = -3:1:3, yticks = -2:1:2,
    framestyle = :origin, aspect_ratio = :equal,
    title = L"$x$ and $y$ are anti-aligned or opposed: $x'y = %$(round(dot(x, y), digits=1)) = -\|\|x\|\| \|\|y\|\|$")
```

## Linear independence

- Example: Consider vectors in $\mathbb{R}^3$
$$
\begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}.
$$
The fourth vector $\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}$ is in some sense redundant because it can be expressed as a linear combination of the other 3 vectors:
$$
\begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix} = 1 \cdot \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} + 2 \cdot \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} + 3 \cdot \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}.
$$
Similarly, each one of these 4 vectors can be expressed as a linear combination of the other 3. We say these four vectors are linearly dependent.

- A set of vectors $\mathbf{a}_1, \ldots, \mathbf{a}_k \in \mathbb{R}^n$ are **linearly dependent** if there exist constants $\alpha_1, \ldots, \alpha_k$, not all zero, such that
$$
\alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k = \mathbf{0}.
$$
They are **linearly independent** if they are not linearly dependent. That is, if $\alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k = \mathbf{0}$, then $\alpha_1 = \cdots = \alpha_k = 0$. This is usually how we show that a set of vectors is linearly independent.

- Theorem: Unit vectors $\mathbf{e}_1, \ldots, \mathbf{e}_n \in \mathbb{R}^n$ are linearly independent. Proof: TODO in class.

- Theorem: If $\mathbf{x}$ is a linear combination of linearly independent vectors $\mathbf{a}_1, \ldots, \mathbf{a}_k$, that is, $\mathbf{x} = \alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k$, then the coefficients $\alpha_1, \ldots, \alpha_k$ are unique. Proof: TODO in class. Hint: proof by contradiction.

- **Independence-dimension inequality** or **order-dimension inequality**. If the vectors $\mathbf{a}_1, \ldots, \mathbf{a}_k \in \mathbb{R}^n$ are linearly independent, then $k \le n$. In words, there can be at most $n$ linearly independent vectors in $\mathbb{R}^n$. Equivalently, any set of $n+1$ or more vectors in $\mathbb{R}^n$ must be linearly dependent.

    Proof (optional): We show this by induction on $n$.

    Base case $n = 1$: let $a_1, \ldots, a_k \in \mathbb{R}^1$ be linearly independent. We must have $a_1 \ne 0$, so every element of the collection can be written as a multiple $a_i = (a_i / a_1) a_1$ of the first element $a_1$. If $k \ge 2$, this contradicts linear independence; thus $k = 1 \le n$.

    Induction hypothesis: suppose $n \ge 2$ and the independence-dimension inequality holds in $\mathbb{R}^{n-1}$.
    We partition the vectors $\mathbf{a}_i \in \mathbb{R}^n$ as
    $$
    \mathbf{a}_i = \begin{pmatrix} \mathbf{b}_i \\ \alpha_i \end{pmatrix}, \quad i = 1,\ldots,k,
    $$
    where $\mathbf{b}_i \in \mathbb{R}^{n-1}$ and $\alpha_i \in \mathbb{R}$.

    First suppose $\alpha_1 = \cdots = \alpha_k = 0$. Then the vectors $\mathbf{b}_1, \ldots, \mathbf{b}_k$ are linearly independent: $\sum_{i=1}^k \beta_i \mathbf{b}_i = \mathbf{0}$ if and only if $\sum_{i=1}^k \beta_i \mathbf{a}_i = \mathbf{0}$, which is only possible for $\beta_1 = \cdots = \beta_k = 0$ because the vectors $\mathbf{a}_i$ are linearly independent. The vectors $\mathbf{b}_i$ therefore form a linearly independent collection of $(n-1)$-vectors. By the induction hypothesis we have $k \le n-1$, so $k \le n$.

    Next assume the scalars $\alpha_i$ are not all zero, say $\alpha_j \ne 0$. We define a collection of $k-1$ vectors $\mathbf{c}_i$ of length $n-1$ as follows:
    $$
    \mathbf{c}_i = \mathbf{b}_i - \frac{\alpha_i}{\alpha_j} \mathbf{b}_j, \quad i = 1, \ldots, j-1, \qquad
    \mathbf{c}_i = \mathbf{b}_{i+1} - \frac{\alpha_{i+1}}{\alpha_j} \mathbf{b}_j, \quad i = j, \ldots, k-1.
    $$
    These $k-1$ vectors are linearly independent: if $\sum_{i=1}^{k-1} \beta_i \mathbf{c}_i = \mathbf{0}$, then
    $$
    \sum_{i=1}^{j-1} \beta_i \begin{pmatrix} \mathbf{b}_i \\ \alpha_i \end{pmatrix} + \gamma \begin{pmatrix} \mathbf{b}_j \\ \alpha_j \end{pmatrix} + \sum_{i=j+1}^k \beta_{i-1} \begin{pmatrix} \mathbf{b}_i \\ \alpha_i \end{pmatrix} = \mathbf{0}
    $$
    with $\gamma = - \alpha_j^{-1} \left( \sum_{i=1}^{j-1} \beta_i \alpha_i + \sum_{i=j+1}^k \beta_{i-1} \alpha_i \right)$. Since the vectors $\mathbf{a}_i$ are linearly independent, the coefficients $\beta_i$ and $\gamma$ are all zero. This in turn implies that the vectors $\mathbf{c}_1, \ldots, \mathbf{c}_{k-1}$ are linearly independent. By the induction hypothesis, $k-1 \le n-1$, so $k \le n$.

## Basis

- A set of $n$ linearly independent vectors $\mathbf{a}_1, \ldots, \mathbf{a}_n \in \mathbb{R}^n$ is called a **basis** for $\mathbb{R}^n$.

- Fact: the zero vector $\mathbf{0}_n$ cannot be a basis vector in $\mathbb{R}^n$. Why?

- Theorem: Any vector $\mathbf{x} \in \mathbb{R}^n$ can be expressed as a linear combination of basis vectors, $\mathbf{x} = \alpha_1 \mathbf{a}_1 + \cdots + \alpha_n \mathbf{a}_n$, for some $\alpha_1, \ldots, \alpha_n$, and these coefficients are unique. This is called the expansion of $\mathbf{x}$ in the basis $\mathbf{a}_1, \ldots, \mathbf{a}_n$.

    Proof of existence by contradiction (optional): Suppose $\mathbf{x}$ can NOT be expressed as a linear combination of the basis vectors. Consider any linear combination with $\alpha_1 \mathbf{a}_1 + \cdots + \alpha_n \mathbf{a}_n + \beta \mathbf{x} = \mathbf{0}$. Then $\beta = 0$; otherwise $\mathbf{x} = -\beta^{-1} (\alpha_1 \mathbf{a}_1 + \cdots + \alpha_n \mathbf{a}_n)$, contradicting our assumption. Also $\alpha_1 = \cdots = \alpha_n = 0$ by linear independence of $\mathbf{a}_1, \ldots, \mathbf{a}_n$. Therefore we conclude $\alpha_1 = \cdots = \alpha_n = \beta = 0$, so $\mathbf{a}_1, \ldots, \mathbf{a}_n, \mathbf{x}$ are linearly independent, contradicting the independence-dimension inequality.

    Proof of uniqueness: TODO in class.

- Example: Unit vectors $\mathbf{e}_1, \ldots, \mathbf{e}_n$ form a basis for $\mathbb{R}^n$. The expansion of a vector $\mathbf{x} \in \mathbb{R}^n$ in this basis is
$$
\mathbf{x} = x_1 \mathbf{e}_1 + \cdots + x_n \mathbf{e}_n.
$$

## Orthonormal basis

- A set of vectors $\mathbf{a}_1, \ldots, \mathbf{a}_k$ are **(mutually) orthogonal** if $\mathbf{a}_i \perp \mathbf{a}_j$ for any $i \ne j$.
They are **normalized** if $\|\mathbf{a}_i\|=1$ for all $i$. They are **orthonormal** if they are both orthogonal and normalized. Orthonormality is often expressed compactly as $\mathbf{a}_i'\mathbf{a}_j = \delta_{ij}$, where
$$
\delta_{ij} = \begin{cases}
1 & \text{if } i = j \\
0 & \text{if } i \ne j
\end{cases}
$$
is the Kronecker delta notation.

- Theorem: An orthonormal set of vectors is linearly independent. Proof: TODO in class. Expand $\|\alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k\|^2 = (\alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k)'(\alpha_1 \mathbf{a}_1 + \cdots + \alpha_k \mathbf{a}_k)$.

- By the independence-dimension inequality, we must have $k \le n$. When $k=n$, $\mathbf{a}_1, \ldots, \mathbf{a}_n$ are called an **orthonormal basis**.

- Examples of orthonormal bases:
    1. Unit vectors $\mathbf{e}_1, \ldots, \mathbf{e}_n$ in $\mathbb{R}^n$.
    2. The 3 vectors in $\mathbb{R}^3$:
    $$
    \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}, \quad \frac{1}{\sqrt 2} \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad \frac{1}{\sqrt 2} \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}.
    $$

```{julia}
a1 = [0, 0, 1]
a2 = (1 / sqrt(2)) * [1, 1, 0]
a3 = (1 / sqrt(2)) * [1, -1, 0]

@show norm(a1), norm(a2), norm(a3)
@show a1'a2, a1'a3, a2'a3
```

```{julia}
#| code-fold: true
plot3d([0; a1[1]], [0; a1[2]], [0; a1[3]],
    arrow = true, legend = :none,
    framestyle = :origin, aspect_ratio = :equal,
    xlims = (-1, 1), ylims = (-1, 1), zlims = (-1, 1),
    xticks = -1:1:1, yticks = -1:1:1, zticks = -1:1:1
)
plot3d!([0; a2[1]], [0; a2[2]], [0; a2[3]])
plot3d!([0; a3[1]], [0; a3[2]], [0; a3[3]])
```

- **Orthonormal expansion**. If $\mathbf{a}_1, \ldots, \mathbf{a}_n \in \mathbb{R}^n$ is an orthonormal basis, then for any vector $\mathbf{x} \in \mathbb{R}^n$,
$$
\mathbf{x} = (\mathbf{a}_1'\mathbf{x}) \mathbf{a}_1 + \cdots + (\mathbf{a}_n'\mathbf{x}) \mathbf{a}_n.
$$
Proof: $\mathbf{x} = \mathbf{A} \mathbf{A}' \mathbf{x}$, where $\mathbf{A} = (\mathbf{a}_1 \ \cdots \ \mathbf{a}_n)$.

```{julia}
@show x = randn(3)
@show (a1'x) * a1 + (a2'x) * a2 + (a3'x) * a3
@show x ≈ (a1'x) * a1 + (a2'x) * a2 + (a3'x) * a3
```

## How to orthonormalize a set of vectors? Gram-Schmidt algorithm

- Given vectors $\mathbf{a}_1, \ldots, \mathbf{a}_k \in \mathbb{R}^n$, the G-S algorithm generates a sequence of orthonormal vectors $\mathbf{q}_1, \mathbf{q}_2, \ldots$ (a compact Julia sketch of the full loop follows this list).

- For $i=1,\ldots,k$:
    1. Orthogonalization: $\tilde{\mathbf{q}}_i = \mathbf{a}_i - [(\mathbf{q}_1' \mathbf{a}_i) \mathbf{q}_1 + \cdots + (\mathbf{q}_{i-1}' \mathbf{a}_i) \mathbf{q}_{i-1}]$.
    2. Test for linear independence: if $\tilde{\mathbf{q}}_i = \mathbf{0}$, quit.
    3. Normalization: $\mathbf{q}_i = \tilde{\mathbf{q}}_i / \|\tilde{\mathbf{q}}_i\|$.

- If G-S does not stop early (in step 2), $\mathbf{a}_1, \ldots, \mathbf{a}_k$ are linearly independent.

- If G-S stops early in iteration $i=j$, then $\mathbf{a}_j$ is a linear combination of $\mathbf{a}_1, \ldots, \mathbf{a}_{j-1}$, and $\mathbf{a}_1, \ldots, \mathbf{a}_{j-1}$ are linearly independent.
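Here is a minimal Julia sketch of the full G-S loop (a helper written for these notes, not a library function). The tolerance-based dependence test is an implementation choice: floating point arithmetic rarely produces an exactly zero $\tilde{\mathbf{q}}_i$.

```{julia}
# Gram-Schmidt on a list of n-vectors; returns the orthonormal q's.
# Stops early and returns the q's found so far if (near) linear dependence is detected.
function gram_schmidt(a; tol = 1e-10)
    q = Vector{Vector{Float64}}()
    for ai in a
        # orthogonalization: subtract projections onto the q's found so far
        q̃ = float.(ai)
        for qj in q
            q̃ -= (qj'ai) * qj
        end
        # test for linear independence (tolerance instead of an exact zero test)
        if norm(q̃) ≤ tol * norm(ai)
            return q
        end
        # normalization
        push!(q, q̃ / norm(q̃))
    end
    return q
end

# sanity check on three random 5-vectors
q = gram_schmidt([randn(5) for _ in 1:3])
[qi'qj for qi in q, qj in q]  # should be ≈ the 3×3 identity matrix
```

This is the classical G-S variant, matching the three steps above; the cells below trace the same computation manually for $n = 5$, $k = 3$.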
```{julia}
#| code-fold: true
a1 = [0.5, 2]
a2 = [1.5, 0.5]

p1 = plot([0 0; a1[1] a2[1]], [0 0; a1[2] a2[2]], arrow = true, color = :purple,
    axis = ([], false), legend = :none,
    xlims = (-2, 2), ylims = (-2, 2),
    annotations = [(a1[1] + 0.2, a1[2], L"a_1"), (a2[1] + 0.2, a2[2], L"a_2")],
    framestyle = :origin, aspect_ratio = :equal)
t = 0:0.01:2π
plot!(p1, sin.(t), cos.(t), color = :gray)

# orthogonalizing a1
q̃1 = copy(a1)
p2 = plot([0 0; q̃1[1] a2[1]], [0 0; q̃1[2] a2[2]], arrow = true, color = [:red :purple],
    axis = ([], false), legend = :none,
    xlims = (-2, 2), ylims = (-2, 2),
    annotations = [(q̃1[1] + 0.2, q̃1[2], L"\tilde{q}_1"), (a2[1] + 0.2, a2[2], L"a_2")],
    framestyle = :origin, aspect_ratio = :equal)
t = 0:0.01:2π
plot!(p2, sin.(t), cos.(t), color = :gray)

# normalizing q̃1
q1 = q̃1 / norm(q̃1)
p3 = plot([0 0; q1[1] a2[1]], [0 0; q1[2] a2[2]], arrow = true, color = [:green :purple],
    axis = ([], false), legend = :none,
    xlims = (-2, 2), ylims = (-2, 2),
    annotations = [(q1[1] + 0.2, q1[2], L"q_1"), (a2[1] + 0.2, a2[2], L"a_2")],
    framestyle = :origin, aspect_ratio = :equal)
t = 0:0.01:2π
plot!(p3, sin.(t), cos.(t), color = :gray)

# orthogonalizing a2
q̃2 = a2 - dot(a2, q1) * q1
p4 = plot([0 0 0; q1[1] a2[1] q̃2[1]], [0 0 0; q1[2] a2[2] q̃2[2]], arrow = true,
    color = [:green :purple :red],
    xlims = (-1.5, 1.5), ylims = (-1.5, 1.5),
    axis = ([], false), legend = :none,
    annotations = [
        (q1[1] + 0.2, q1[2], L"q_1"),
        (a2[1] + 0.2, a2[2], L"a_2"),
        (q̃2[1] + 0.2, q̃2[2] - 0.2, L"\tilde{q}_2"),
        ((a2[1] + q̃2[1]) / 2 + 1.2, (a2[2] + q̃2[2]) / 2, L"-(q_1' a_2)q_1")
    ],
    framestyle = :origin, aspect_ratio = :equal)
t = 0:0.01:2π
plot!(p4, sin.(t), cos.(t), color = :gray)
plot!(p4, [a2[1]; q̃2[1]], [a2[2]; q̃2[2]], arrow = true, color = :black)

# normalizing q̃2
q2 = q̃2 / norm(q̃2)
p5 = plot([0 0; q1[1] q2[1]], [0 0; q1[2] q2[2]], arrow = true, color = :green,
    axis = ([], false), legend = :none,
    xlims = (-1.5, 1.5), ylims = (-1.5, 1.5),
    annotations = [(q1[1] + 0.3, q1[2] + 0.1, L"q_1"), (q2[1] + 0.3, q2[2], L"q_2")],
    framestyle = :origin, aspect_ratio = :equal)
t = 0:0.01:2π
plot!(p5, sin.(t), cos.(t), color = :gray)

plot(p1, p2, p3, p4, p5)
```

```{julia}
# n = 5, k = 3
@show a1 = randn(5)
@show a2 = randn(5)
@show a3 = randn(5);
```

```{julia}
# for i = 1
# orthogonalization
@show q̃1 = copy(a1)
# test for linear independence
@show norm(q̃1) ≈ 0
# normalization
@show q1 = q̃1 / norm(q̃1);
```

```{julia}
# for i = 2
# orthogonalization
@show q̃2 = a2 - (q1'a2) * q1
# test for linear independence
@show norm(q̃2) ≈ 0
# normalization
@show q2 = q̃2 / norm(q̃2);
```

```{julia}
# for i = 3
# orthogonalization
@show q̃3 = a3 - (q1'a3) * q1 - (q2'a3) * q2
# test for linear independence
@show norm(q̃3) ≈ 0
# normalization
@show q3 = q̃3 / norm(q̃3);
```

```{julia}
# test for orthonormality of q1, q2, q3
@show norm(q1), norm(q2), norm(q3)
@show q1'q2, q1'q3, q2'q3;
```

Show by induction that $\mathbf{q}_1, \ldots, \mathbf{q}_i$ are orthonormal (optional):

- Assume it's true for $i-1$.

- The orthogonalization step ensures that $\tilde{\mathbf{q}}_i \perp \mathbf{q}_1, \ldots, \tilde{\mathbf{q}}_i \perp \mathbf{q}_{i-1}$. To show this, take the inner product of both sides with $\mathbf{q}_j$, $j < i$:
$$
\mathbf{q}_j' \tilde{\mathbf{q}}_i = \mathbf{q}_j' \mathbf{a}_i - (\mathbf{q}_1' \mathbf{a}_i) (\mathbf{q}_j' \mathbf{q}_1) - \cdots - (\mathbf{q}_{i-1}' \mathbf{a}_i) (\mathbf{q}_j' \mathbf{q}_{i-1}) = \mathbf{q}_j' \mathbf{a}_i - \mathbf{q}_j' \mathbf{a}_i = 0.
$$

- So $\mathbf{q}_1, \ldots, \mathbf{q}_i$ are orthogonal. The normalization step ensures $\mathbf{q}_i$ has unit norm.

Suppose G-S has not terminated by iteration $i$. Then

- $\mathbf{a}_i$ is a combination of $\mathbf{q}_1, \ldots, \mathbf{q}_i$, and
- $\mathbf{q}_i$ is a combination of $\mathbf{a}_1, \ldots, \mathbf{a}_i$.

Computational complexity of the G-S algorithm:

- Step 1 of iteration $i$ requires (1) $i-1$ inner products, $\mathbf{q}_1' \mathbf{a}_i, \ldots, \mathbf{q}_{i-1}' \mathbf{a}_i$, which cost $(i-1)(2n-1)$ flops, and (2) $2n(i-1)$ flops to compute $\tilde{\mathbf{q}}_i$.

- Step 2 of iteration $i$ requires less than $n$ flops.

- Step 3 of iteration $i$ requires about $3n$ flops: $2n$ to compute $\|\tilde{\mathbf{q}}_i\|$ (including the square root) and $n$ to divide by it.

- Assuming no early termination, the total computational cost of the G-S algorithm to orthonormalize a set of $k$ vectors in $\mathbb{R}^n$ is
$$
\sum_{i=1}^k [(4n-1) (i - 1) + 3n] = (4n - 1) \frac{k(k-1)}{2} + 3nk \approx 2nk^2 = O(nk^2),
$$
using $\sum_{i=1}^k (i - 1) = k(k-1)/2$.
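As a rough empirical check of the $O(nk^2)$ order, we can time the `gram_schmidt` sketch from earlier: with $n$ fixed, doubling $k$ should roughly quadruple the run time. Timings are noisy (memory traffic, other processes), so expect the trend rather than exact ratios.

```{julia}
gram_schmidt([randn(10) for _ in 1:2])  # warm up (compile) first
n = 2000
for k in (50, 100, 200)
    a = [randn(n) for _ in 1:k]
    t = @elapsed gram_schmidt(a)
    println("k = ", k, ": ", round(t * 1000, digits = 2), " ms")
end
```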