{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n# Gaussian Mixture Model Sine Curve\n\nThis example demonstrates the behavior of Gaussian mixture models fit on data\nthat was not sampled from a mixture of Gaussian random variables. The dataset\nis formed by 100 points loosely spaced following a noisy sine curve. There is\ntherefore no ground truth value for the number of Gaussian components.\n\nThe first model is a classical Gaussian Mixture Model with 10 components fit\nwith the Expectation-Maximization algorithm.\n\nThe second model is a Bayesian Gaussian Mixture Model with a Dirichlet process\nprior fit with variational inference. The low value of the concentration prior\nmakes the model favor a lower number of active components. This models\n\"decides\" to focus its modeling power on the big picture of the structure of\nthe dataset: groups of points with alternating directions modeled by\nnon-diagonal covariance matrices. Those alternating directions roughly capture\nthe alternating nature of the original sine signal.\n\nThe third model is also a Bayesian Gaussian mixture model with a Dirichlet\nprocess prior but this time the value of the concentration prior is higher\ngiving the model more liberty to model the fine-grained structure of the data.\nThe result is a mixture with a larger number of active components that is\nsimilar to the first model where we arbitrarily decided to fix the number of\ncomponents to 10.\n\nWhich model is the best is a matter of subjective judgment: do we want to\nfavor models that only capture the big picture to summarize and explain most of\nthe structure of the data while ignoring the details or do we prefer models\nthat closely follow the high density regions of the signal?\n\nThe last two panels show how we can sample from the last two models. The\nresulting samples distributions do not look exactly like the original data\ndistribution. 
\n\nComing back to the sampled panels: the difference primarily stems from the\napproximation error made by using a model that assumes that the data was\ngenerated by a finite number of Gaussian components instead of a continuous\nnoisy sine curve.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Authors: The scikit-learn developers\n# SPDX-License-Identifier: BSD-3-Clause\n\nimport itertools\n\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom scipy import linalg\n\nfrom sklearn import mixture\n\ncolor_iter = itertools.cycle([\"navy\", \"c\", \"cornflowerblue\", \"gold\", \"darkorange\"])\n\n\ndef plot_results(X, Y, means, covariances, index, title):\n    splot = plt.subplot(5, 1, 1 + index)\n    for i, (mean, covar, color) in enumerate(zip(means, covariances, color_iter)):\n        # As the DP will not use every component it has access to unless it\n        # needs it, we should not plot the redundant components.\n        if not np.any(Y == i):\n            continue\n        plt.scatter(X[Y == i, 0], X[Y == i, 1], 0.8, color=color)\n\n        # Plot an ellipse to show the Gaussian component: the square roots of\n        # the eigenvalues give the spread along the principal axes, the\n        # eigenvectors give the orientation.\n        v, w = linalg.eigh(covar)\n        v = 2.0 * np.sqrt(2.0) * np.sqrt(v)\n        u = w[0] / linalg.norm(w[0])\n        angle = np.arctan2(u[1], u[0])\n        angle = 180.0 * angle / np.pi  # convert to degrees\n        ell = mpl.patches.Ellipse(mean, v[0], v[1], angle=180.0 + angle, color=color)\n        ell.set_clip_box(splot.bbox)\n        ell.set_alpha(0.5)\n        splot.add_artist(ell)\n\n    plt.xlim(-6.0, 4.0 * np.pi - 6.0)\n    plt.ylim(-5.0, 5.0)\n    plt.title(title)\n    plt.xticks(())\n    plt.yticks(())\n\n\ndef plot_samples(X, Y, n_components, index, title):\n    plt.subplot(5, 1, 4 + index)\n    for i, color in zip(range(n_components), color_iter):\n        # As the DP will not use every component it has access to unless it\n        # needs it, we should not plot the redundant components.\n        if not np.any(Y == i):\n            continue\n        plt.scatter(X[Y == i, 0], X[Y == i, 1], 0.8, color=color)\n\n    plt.xlim(-6.0, 4.0 * np.pi - 6.0)\n    plt.ylim(-5.0, 5.0)\n    plt.title(title)\n    plt.xticks(())\n    plt.yticks(())\n\n\n# Parameters\nn_samples = 100\n\n# Generate a random sample following a noisy sine curve\nnp.random.seed(0)\nX = np.zeros((n_samples, 2))\nstep = 4.0 * np.pi / n_samples\n\nfor i in range(X.shape[0]):\n    x = i * step - 6.0\n    X[i, 0] = x + np.random.normal(0, 0.1)\n    X[i, 1] = 3.0 * (np.sin(x) + np.random.normal(0, 0.2))\n\nplt.figure(figsize=(10, 10))\nplt.subplots_adjust(\n    bottom=0.04, top=0.95, hspace=0.2, wspace=0.05, left=0.03, right=0.97\n)\n\n# Fit a Gaussian mixture with EM using ten components\ngmm = mixture.GaussianMixture(\n    n_components=10, covariance_type=\"full\", max_iter=100\n).fit(X)\nplot_results(\n    X, gmm.predict(X), gmm.means_, gmm.covariances_, 0, \"Expectation-maximization\"\n)\n\n# Fit a Bayesian Gaussian mixture with a Dirichlet process prior; the low\n# concentration prior makes the model favor few active components\ndpgmm = mixture.BayesianGaussianMixture(\n    n_components=10,\n    covariance_type=\"full\",\n    weight_concentration_prior=1e-2,\n    weight_concentration_prior_type=\"dirichlet_process\",\n    mean_precision_prior=1e-2,\n    covariance_prior=1e0 * np.eye(2),\n    init_params=\"random\",\n    max_iter=100,\n    random_state=2,\n).fit(X)\nplot_results(\n    X,\n    dpgmm.predict(X),\n    dpgmm.means_,\n    dpgmm.covariances_,\n    1,\n    \"Bayesian Gaussian mixture model with a Dirichlet process prior \"\n    r\"for $\gamma_0=0.01$\",\n)\n\n# Draw 2000 points from the fitted model: a component is picked according to\n# the learned weights, then a point is drawn from that component's Gaussian\nX_s, y_s = dpgmm.sample(n_samples=2000)\nplot_samples(\n    X_s,\n    y_s,\n    dpgmm.n_components,\n    0,\n    \"Gaussian mixture with a Dirichlet process prior \"\n    r\"for $\gamma_0=0.01$ sampled with $2000$ samples.\",\n)\n\n# Refit with a much higher concentration prior, giving the model more freedom\n# to activate many components\ndpgmm = mixture.BayesianGaussianMixture(\n
    n_components=10,\n    covariance_type=\"full\",\n    weight_concentration_prior=1e2,\n    weight_concentration_prior_type=\"dirichlet_process\",\n    mean_precision_prior=1e-2,\n    covariance_prior=1e0 * np.eye(2),\n    init_params=\"kmeans\",\n    max_iter=100,\n    random_state=2,\n).fit(X)\nplot_results(\n    X,\n    dpgmm.predict(X),\n    dpgmm.means_,\n    dpgmm.covariances_,\n    2,\n    \"Bayesian Gaussian mixture model with a Dirichlet process prior \"\n    r\"for $\gamma_0=100$\",\n)\n\nX_s, y_s = dpgmm.sample(n_samples=2000)\nplot_samples(\n    X_s,\n    y_s,\n    dpgmm.n_components,\n    1,\n    \"Gaussian mixture with a Dirichlet process prior \"\n    r\"for $\gamma_0=100$ sampled with $2000$ samples.\",\n)\n\nplt.show()" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.21" } }, "nbformat": 4, "nbformat_minor": 0 }