{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n# Comparison of Manifold Learning methods\n\nAn illustration of dimensionality reduction on the S-curve dataset\nwith various manifold learning methods.\n\nFor a discussion and comparison of these algorithms, see the\n`manifold module page `\n\nFor a similar example, where the methods are applied to a\nsphere dataset, see `sphx_glr_auto_examples_manifold_plot_manifold_sphere.py`\n\nNote that the purpose of the MDS is to find a low-dimensional\nrepresentation of the data (here 2D) in which the distances respect well\nthe distances in the original high-dimensional space, unlike other\nmanifold-learning algorithms, it does not seeks an isotropic\nrepresentation of the data in the low-dimensional space.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "# Authors: The scikit-learn developers\n# SPDX-License-Identifier: BSD-3-Clause" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Dataset preparation\n\nWe start by generating the S-curve dataset.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "import matplotlib.pyplot as plt\n\n# unused but required import for doing 3d projections with matplotlib < 3.2\nimport mpl_toolkits.mplot3d # noqa: F401\nfrom matplotlib import ticker\n\nfrom sklearn import datasets, manifold\n\nn_samples = 1500\nS_points, S_color = datasets.make_s_curve(n_samples, random_state=0)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's look at the original data. Also define some helping\nfunctions, which we will use further on.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "def plot_3d(points, points_color, title):\n x, y, z = points.T\n\n fig, ax = plt.subplots(\n figsize=(6, 6),\n facecolor=\"white\",\n tight_layout=True,\n subplot_kw={\"projection\": \"3d\"},\n )\n fig.suptitle(title, size=16)\n col = ax.scatter(x, y, z, c=points_color, s=50, alpha=0.8)\n ax.view_init(azim=-60, elev=9)\n ax.xaxis.set_major_locator(ticker.MultipleLocator(1))\n ax.yaxis.set_major_locator(ticker.MultipleLocator(1))\n ax.zaxis.set_major_locator(ticker.MultipleLocator(1))\n\n fig.colorbar(col, ax=ax, orientation=\"horizontal\", shrink=0.6, aspect=60, pad=0.01)\n plt.show()\n\n\ndef plot_2d(points, points_color, title):\n fig, ax = plt.subplots(figsize=(3, 3), facecolor=\"white\", constrained_layout=True)\n fig.suptitle(title, size=16)\n add_2d_scatter(ax, points, points_color)\n plt.show()\n\n\ndef add_2d_scatter(ax, points, points_color, title=None):\n x, y = points.T\n ax.scatter(x, y, c=points_color, s=50, alpha=0.8)\n ax.set_title(title)\n ax.xaxis.set_major_formatter(ticker.NullFormatter())\n ax.yaxis.set_major_formatter(ticker.NullFormatter())\n\n\nplot_3d(S_points, S_color, \"Original S-curve samples\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Define algorithms for the manifold learning\n\nManifold learning is an approach to non-linear dimensionality reduction.\nAlgorithms for this task are based on the idea that the dimensionality of\nmany data sets is only artificially high.\n\nRead more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "n_neighbors = 12 # neighborhood which is used to recover the locally linear structure\nn_components = 2 # number of coordinates for the manifold" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Locally Linear Embeddings\n\nLocally linear embedding (LLE) can be thought of as a series of local\nPrincipal Component Analyses which are globally compared to find the\nbest non-linear embedding.\nRead more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "params = {\n \"n_neighbors\": n_neighbors,\n \"n_components\": n_components,\n \"eigen_solver\": \"auto\",\n \"random_state\": 0,\n}\n\nlle_standard = manifold.LocallyLinearEmbedding(method=\"standard\", **params)\nS_standard = lle_standard.fit_transform(S_points)\n\nlle_ltsa = manifold.LocallyLinearEmbedding(method=\"ltsa\", **params)\nS_ltsa = lle_ltsa.fit_transform(S_points)\n\nlle_hessian = manifold.LocallyLinearEmbedding(method=\"hessian\", **params)\nS_hessian = lle_hessian.fit_transform(S_points)\n\nlle_mod = manifold.LocallyLinearEmbedding(method=\"modified\", **params)\nS_mod = lle_mod.fit_transform(S_points)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "fig, axs = plt.subplots(\n nrows=2, ncols=2, figsize=(7, 7), facecolor=\"white\", constrained_layout=True\n)\nfig.suptitle(\"Locally Linear Embeddings\", size=16)\n\nlle_methods = [\n (\"Standard locally linear embedding\", S_standard),\n (\"Local tangent space alignment\", S_ltsa),\n (\"Hessian eigenmap\", S_hessian),\n (\"Modified locally linear embedding\", S_mod),\n]\nfor ax, method in zip(axs.flat, lle_methods):\n name, points = method\n add_2d_scatter(ax, points, S_color, name)\n\nplt.show()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Isomap Embedding\n\nNon-linear dimensionality reduction through Isometric Mapping.\nIsomap seeks a lower-dimensional embedding which maintains geodesic\ndistances between all points. Read more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "isomap = manifold.Isomap(n_neighbors=n_neighbors, n_components=n_components, p=1)\nS_isomap = isomap.fit_transform(S_points)\n\nplot_2d(S_isomap, S_color, \"Isomap Embedding\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Multidimensional scaling\n\nMultidimensional scaling (MDS) seeks a low-dimensional representation\nof the data in which the distances respect well the distances in the\noriginal high-dimensional space.\nRead more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "md_scaling = manifold.MDS(\n n_components=n_components,\n max_iter=50,\n n_init=4,\n random_state=0,\n normalized_stress=False,\n)\nS_scaling = md_scaling.fit_transform(S_points)\n\nplot_2d(S_scaling, S_color, \"Multidimensional scaling\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Spectral embedding for non-linear dimensionality reduction\n\nThis implementation uses Laplacian Eigenmaps, which finds a low dimensional\nrepresentation of the data using a spectral decomposition of the graph Laplacian.\nRead more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "spectral = manifold.SpectralEmbedding(\n n_components=n_components, n_neighbors=n_neighbors, random_state=42\n)\nS_spectral = spectral.fit_transform(S_points)\n\nplot_2d(S_spectral, S_color, \"Spectral Embedding\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### T-distributed Stochastic Neighbor Embedding\n\nIt converts similarities between data points to joint probabilities and\ntries to minimize the Kullback-Leibler divergence between the joint probabilities\nof the low-dimensional embedding and the high-dimensional data. t-SNE has a cost\nfunction that is not convex, i.e. with different initializations we can get\ndifferent results. Read more in the `User Guide `.\n\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false }, "outputs": [], "source": [ "t_sne = manifold.TSNE(\n n_components=n_components,\n perplexity=30,\n init=\"random\",\n max_iter=250,\n random_state=0,\n)\nS_t_sne = t_sne.fit_transform(S_points)\n\nplot_2d(S_t_sne, S_color, \"T-distributed Stochastic \\n Neighbor Embedding\")" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.21" } }, "nbformat": 4, "nbformat_minor": 0 }