{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Evaluation of Poincare Embeddings" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This notebook demonstrates how well Poincare embeddings perform on the tasks detailed in the [original paper](https://arxiv.org/pdf/1705.08039.pdf) about the embeddings.\n", "\n", "The following two external, open-source implementations are used - \n", "1. [C++](https://github.com/TatsuyaShirakawa/poincare-embedding)\n", "2. [Numpy](https://github.com/nishnik/poincare_embeddings)\n", "\n", "This is the list of tasks - \n", "1. WordNet reconstruction\n", "2. WordNet link prediction\n", "3. Link prediction in collaboration networks (evaluation incomplete)\n", "4. Lexical entailment on HyperLex\n", "\n", "A more detailed explanation of the tasks and the evaluation methodology is present in the individual evaluation subsections." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1. Setup\n", "\n", "The following section performs the following - \n", "1. Imports required python libraries and downloads the wordnet data\n", "2. Clones the repositories containing the C++ and Numpy implementations of the Poincare embeddings\n", "3. Applies patches containing minor changes to the implementations.\n", "4. Compiles the C++ sources to create a binary" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/home/misha/git/gensim\n" ] } ], "source": [ "%cd ../.." ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[33mYou are using pip version 19.0.1, however version 19.1 is available.\r\n", "You should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\r\n" ] } ], "source": [ "# Some libraries need to be installed that are not part of Gensim\n", "! pip install click>=6.7 nltk>=3.2.5 prettytable>=0.7.2 pygtrie>=2.2" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "[nltk_data] Downloading package wordnet to /home/misha/nltk_data...\n", "[nltk_data] Package wordnet is already up-to-date!\n" ] }, { "data": { "text/plain": [ "True" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "import csv\n", "from collections import OrderedDict\n", "from IPython.display import display, HTML\n", "import logging\n", "import os\n", "import pickle\n", "import random\n", "import re\n", "\n", "import click\n", "from gensim.models.poincare import PoincareModel, PoincareRelations, \\\n", " ReconstructionEvaluation, LinkPredictionEvaluation, \\\n", " LexicalEntailmentEvaluation, PoincareKeyedVectors\n", "from gensim.utils import check_output\n", "import nltk\n", "from prettytable import PrettyTable\n", "from smart_open import smart_open\n", "\n", "logging.basicConfig(level=logging.INFO)\n", "nltk.download('wordnet')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Please set the variable `parent_directory` below to change the directory to which the repositories are cloned." 
] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/home/misha/git/gensim/docs/notebooks\n" ] } ], "source": [ "%cd docs/notebooks/" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "current_directory = os.getcwd()" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [], "source": [ "# Change this variable to `False` to not remove and re-download repos for external implementations\n", "force_setup = False\n", "\n", "# The poincare datasets, models and source code for external models are downloaded to this directory\n", "parent_directory = os.path.join(current_directory, 'poincare')\n", "! mkdir -p {parent_directory}" ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/home/misha/git/gensim/docs/notebooks/poincare\n" ] } ], "source": [ "%cd {parent_directory}\n", "\n", "# Clone repos\n", "np_repo_name = 'poincare-np-embedding'\n", "if force_setup and os.path.exists(np_repo_name):\n", " ! rm -rf {np_repo_name}\n", "clone_np_repo = not os.path.exists(np_repo_name)\n", "if clone_np_repo:\n", " ! git clone https://github.com/nishnik/poincare_embeddings.git {np_repo_name}\n", "\n", "cpp_repo_name = 'poincare-cpp-embedding'\n", "if force_setup and os.path.exists(cpp_repo_name):\n", " ! rm -rf {cpp_repo_name}\n", "clone_cpp_repo = not os.path.exists(cpp_repo_name)\n", "if clone_cpp_repo:\n", " ! git clone https://github.com/TatsuyaShirakawa/poincare-embedding.git {cpp_repo_name}\n", "\n", "patches_applied = False" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "# Apply patches\n", "if clone_cpp_repo and not patches_applied:\n", " %cd {cpp_repo_name}\n", " ! git apply ../poincare_burn_in_eps.patch\n", "\n", "if clone_np_repo and not patches_applied:\n", " %cd ../{np_repo_name}\n", " ! 
git apply ../poincare_numpy.patch\n", " \n", "patches_applied = True" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "/home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding\n", "/home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding/work\n", "-- The C compiler identification is GNU 7.4.0\n", "-- The CXX compiler identification is GNU 7.4.0\n", "-- Check for working C compiler: /usr/bin/cc\n", "-- Check for working C compiler: /usr/bin/cc -- works\n", "-- Detecting C compiler ABI info\n", "-- Detecting C compiler ABI info - done\n", "-- Detecting C compile features\n", "-- Detecting C compile features - done\n", "-- Check for working CXX compiler: /usr/bin/c++\n", "-- Check for working CXX compiler: /usr/bin/c++ -- works\n", "-- Detecting CXX compiler ABI info\n", "-- Detecting CXX compiler ABI info - done\n", "-- Detecting CXX compile features\n", "-- Detecting CXX compile features - done\n", "-- Looking for pthread.h\n", "-- Looking for pthread.h - found\n", "-- Looking for pthread_create\n", "-- Looking for pthread_create - not found\n", "-- Check if compiler accepts -pthread\n", "-- Check if compiler accepts -pthread - yes\n", "-- Found Threads: TRUE \n", "-- Configuring done\n", "-- Generating done\n", "-- Build files have been written to: /home/misha/git/gensim/docs/notebooks/poincare/poincare-cpp-embedding/work\n", "\u001b[35m\u001b[1mScanning dependencies of target poincare_embedding\u001b[0m\n", "[ 50%] \u001b[32mBuilding CXX object CMakeFiles/poincare_embedding.dir/src/poincare_embedding.cpp.o\u001b[0m\n", "[100%] \u001b[32m\u001b[1mLinking CXX executable poincare_embedding\u001b[0m\n", "[100%] Built target poincare_embedding\n", "/home/misha/git/gensim/docs/notebooks\n" ] } ], "source": [ "# Compile the code for the external c++ implementation into a binary\n", "%cd {parent_directory}/{cpp_repo_name}\n", "!mkdir -p work\n", "%cd work\n", "!cmake ..\n", "!make\n", "%cd {current_directory}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You might need to install an updated version of `cmake` to be able to compile the source code. Please make sure that the binary `poincare_embedding` has been created before proceeding by verifying the above cell does not raise an error." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "cpp_binary_path = os.path.join(parent_directory, cpp_repo_name, 'work', 'poincare_embedding')\n", "assert(os.path.exists(cpp_binary_path)), 'Binary file doesnt exist at %s' % cpp_binary_path" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 2. Training\n", "\n", "### 2.1 Create the data" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# These directories are auto created in the current directory for storing poincare datasets and models\n", "data_directory = os.path.join(parent_directory, 'data')\n", "models_directory = os.path.join(parent_directory, 'models')\n", "\n", "# Create directories\n", "! mkdir -p {data_directory}\n", "! 
mkdir -p {models_directory}" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "82115 nouns\n", "743241 hypernyms\n" ] } ], "source": [ "# Prepare the WordNet data\n", "# Can also be downloaded directly from -\n", "# https://github.com/jayantj/gensim/raw/wordnet_data/docs/notebooks/poincare/data/wordnet_noun_hypernyms.tsv\n", "\n", "wordnet_file = os.path.join(data_directory, 'wordnet_noun_hypernyms.tsv')\n", "if not os.path.exists(wordnet_file):\n", " ! python {parent_directory}/{cpp_repo_name}/scripts/create_wordnet_noun_hierarchy.py {wordnet_file}" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "--2019-05-10 12:18:20-- http://people.ds.cam.ac.uk/iv250/paper/hyperlex/hyperlex-data.zip\n", "Resolving people.ds.cam.ac.uk (people.ds.cam.ac.uk)... 131.111.3.47\n", "Connecting to people.ds.cam.ac.uk (people.ds.cam.ac.uk)|131.111.3.47|:80... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 183900 (180K) [application/zip]\n", "Saving to: ‘/home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip’\n", "\n", "/home/misha/git/gen 100%[===================>] 179.59K 158KB/s in 1.1s \n", "\n", "2019-05-10 12:18:22 (158 KB/s) - ‘/home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip’ saved [183900/183900]\n", "\n", "Archive: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex-data.zip\n", " creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/\n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/hyperlex-verbs.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/nouns-verbs/hyperlex-nouns.txt \n", " creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/\n", " creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/\n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_training_all_random.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_test_all_random.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/random/hyperlex_dev_all_random.txt \n", " creating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/\n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_dev_all_lexical.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_test_all_lexical.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/splits/lexical/hyperlex_training_all_lexical.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/hyperlex-all.txt \n", " inflating: /home/misha/git/gensim/docs/notebooks/poincare/data/hyperlex/README.txt \n" ] } ], "source": [ "# Prepare the HyperLex data\n", "hyperlex_url = \"http://people.ds.cam.ac.uk/iv250/paper/hyperlex/hyperlex-data.zip\"\n", "! wget {hyperlex_url} -O {data_directory}/hyperlex-data.zip\n", "if os.path.exists(os.path.join(data_directory, 'hyperlex')):\n", " ! rm -r {data_directory}/hyperlex\n", "! 
unzip {data_directory}/hyperlex-data.zip -d {data_directory}/hyperlex/\n", "hyperlex_file = os.path.join(data_directory, 'hyperlex', 'nouns-verbs', 'hyperlex-nouns.txt')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 2.2 Training [C++ embeddings](https://github.com/TatsuyaShirakawa/poincare-embedding)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def train_cpp_model(\n", " binary_path, data_file, output_file, dim, epochs, neg,\n", " num_threads, epsilon, burn_in, seed=0):\n", " \"\"\"Train a poincare embedding using the c++ implementation\n", " \n", " Args:\n", " binary_path (str): Path to the compiled c++ implementation binary\n", " data_file (str): Path to tsv file containing relation pairs\n", " output_file (str): Path to output file containing model\n", " dim (int): Number of dimensions of the trained model\n", " epochs (int): Number of epochs to use\n", " neg (int): Number of negative samples to use\n", " num_threads (int): Number of threads to use for training the model\n", " epsilon (float): Constant used for clipping below a norm of one\n", " burn_in (int): Number of epochs to use for burn-in init (0 means no burn-in)\n", " \n", " Notes: \n", " If `output_file` already exists, skips training\n", " \"\"\"\n", " if os.path.exists(output_file):\n", " print('File %s exists, skipping' % output_file)\n", " return\n", " args = {\n", " 'dim': dim,\n", " 'max_epoch': epochs,\n", " 'neg_size': neg,\n", " 'num_thread': num_threads,\n", " 'epsilon': epsilon,\n", " 'burn_in': burn_in,\n", " 'learning_rate_init': 0.1,\n", " 'learning_rate_final': 0.0001,\n", " }\n", " cmd = [binary_path, data_file, output_file]\n", " for option, value in args.items():\n", " cmd.append(\"--%s\" % option)\n", " cmd.append(str(value))\n", " \n", " return check_output(args=cmd)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "model_sizes = [5, 10, 20, 50, 100, 200]\n", "default_params = {\n", " 'neg': 20,\n", " 'epochs': 50,\n", " 'threads': 8,\n", " 'eps': 1e-6,\n", " 'burn_in': 0,\n", " 'batch_size': 10,\n", " 'reg': 0.0\n", "}\n", "\n", "non_default_params = {\n", " 'neg': [10],\n", " 'epochs': [200],\n", " 'burn_in': [10]\n", "}" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [], "source": [ "def cpp_model_name_from_params(params, prefix):\n", " param_keys = ['burn_in', 'epochs', 'neg', 'eps', 'threads']\n", " name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]\n", " return '%s_%s' % (prefix, '_'.join(name))\n", "\n", "def train_model_with_params(params, train_file, model_sizes, prefix, implementation):\n", " \"\"\"Trains models with given params for multiple model sizes using the given implementation\n", " \n", " Args:\n", " params (dict): parameters to train the model with\n", " train_file (str): Path to tsv file containing relation pairs\n", " model_sizes (list): list of dimension sizes (integer) to train the model with\n", " prefix (str): prefix to use for the saved model filenames\n", " implementation (str): whether to use the numpy or c++ implementation,\n", " allowed values: 'numpy', 'c++'\n", " \n", " Returns:\n", " tuple (model_name, model_files)\n", " model_files is a dict of (size, filename) pairs\n", " Example: ('cpp_model_epochs_50', {5: 'models/cpp_model_epochs_50_dim_5'})\n", " \"\"\"\n", " files = {}\n", " if implementation == 'c++':\n", " model_name = cpp_model_name_from_params(params, prefix)\n", " elif 
implementation == 'numpy':\n", " model_name = np_model_name_from_params(params, prefix)\n", " elif implementation == 'gensim':\n", " model_name = gensim_model_name_from_params(params, prefix)\n", " else:\n", " raise ValueError('Given implementation %s not found' % implementation)\n", " for model_size in model_sizes:\n", " output_file_name = '%s_dim_%d' % (model_name, model_size)\n", " output_file = os.path.join(models_directory, output_file_name)\n", " print('Training model %s of size %d' % (model_name, model_size))\n", " if implementation == 'c++':\n", " out = train_cpp_model(\n", " cpp_binary_path, train_file, output_file, model_size,\n", " params['epochs'], params['neg'], params['threads'],\n", " params['eps'], params['burn_in'], seed=0)\n", " elif implementation == 'numpy':\n", " train_external_numpy_model(\n", " python_script_path, train_file, output_file, model_size,\n", " params['epochs'], params['neg'], seed=0)\n", " elif implementation == 'gensim':\n", " train_gensim_model(\n", " train_file, output_file, model_size, params['epochs'],\n", " params['neg'], params['burn_in'], params['batch_size'], params['reg'], seed=0)\n", " else:\n", " raise ValueError('Given implementation %s not found' % implementation)\n", " files[model_size] = output_file\n", " return (model_name, files)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "model_files = {}" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 5\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 10\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 20\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 50\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 100\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_20_threads_8 of size 200\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 5\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 10\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 20\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 50\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 100\n", "Training model cpp_model_burn_in_0_epochs_50_eps_1e-06_neg_10_threads_8 of size 200\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 5\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 10\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 20\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 50\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 100\n", "Training model cpp_model_burn_in_0_epochs_200_eps_1e-06_neg_20_threads_8 of size 200\n", "Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 5\n", "Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 10\n", "Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 20\n", "Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 50\n", "Training model 
cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 100\n", "Training model cpp_model_burn_in_10_epochs_50_eps_1e-06_neg_20_threads_8 of size 200\n" ] } ], "source": [ "model_files['c++'] = {}\n", "# Train c++ models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'cpp_model', 'c++')\n", "model_files['c++'][model_name] = {}\n", "for dim, filepath in files.items():\n", " model_files['c++'][model_name][dim] = filepath\n", "# Train c++ models with non-default params\n", "for param, values in non_default_params.items():\n", " params = default_params.copy()\n", " for value in values:\n", " params[param] = value\n", " model_name, files = train_model_with_params(params, wordnet_file, model_sizes, 'cpp_model', 'c++')\n", " model_files['c++'][model_name] = {}\n", " for dim, filepath in files.items():\n", " model_files['c++'][model_name][dim] = filepath" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 2.3 Training [numpy embeddings](https://github.com/nishnik/poincare_embeddings) (non-gensim)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "python_script_path = os.path.join(parent_directory, np_repo_name, 'poincare.py')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def np_model_name_from_params(params, prefix):\n", " param_keys = ['neg', 'epochs']\n", " name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]\n", " return '%s_%s' % (prefix, '_'.join(name))\n", "\n", "def train_external_numpy_model(\n", " script_path, data_file, output_file, dim, epochs, neg, seed=0):\n", " \"\"\"Train a poincare embedding using an external numpy implementation\n", " \n", " Args:\n", " script_path (str): Path to the Python training script\n", " data_file (str): Path to tsv file containing relation pairs\n", " output_file (str): Path to output file containing model\n", " dim (int): Number of dimensions of the trained model\n", " epochs (int): Number of epochs to use\n", " neg (int): Number of negative samples to use\n", " \n", " Notes: \n", " If `output_file` already exists, skips training\n", " \"\"\"\n", " if os.path.exists(output_file):\n", " print('File %s exists, skipping' % output_file)\n", " return\n", " args = {\n", " 'input-file': data_file,\n", " 'output-file': output_file,\n", " 'dimensions': dim,\n", " 'epochs': epochs,\n", " 'learning-rate': 0.01,\n", " 'num-negative': neg,\n", " }\n", " cmd = ['python', script_path]\n", " for option, value in args.items():\n", " cmd.append(\"--%s\" % option)\n", " cmd.append(str(value))\n", " \n", " return check_output(args=cmd)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Training model np_model_epochs_50_neg_20 of size 5\n", "Training model np_model_epochs_50_neg_20 of size 10\n", "Training model np_model_epochs_50_neg_20 of size 20\n", "Training model np_model_epochs_50_neg_20 of size 50\n", "Training model np_model_epochs_50_neg_20 of size 100\n", "Training model np_model_epochs_50_neg_20 of size 200\n" ] } ], "source": [ "model_files['numpy'] = {}\n", "# Train models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'np_model', 'numpy')\n", "model_files['numpy'][model_name] = {}\n", "for dim, filepath in files.items():\n", " model_files['numpy'][model_name][dim] = filepath" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ "### 2.4 Training gensim embeddings" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def gensim_model_name_from_params(params, prefix):\n", " param_keys = ['neg', 'epochs', 'burn_in', 'batch_size', 'reg']\n", " name = ['%s_%s' % (key, params[key]) for key in sorted(param_keys)]\n", " return '%s_%s' % (prefix, '_'.join(name))\n", "\n", "def train_gensim_model(\n", " data_file, output_file, dim, epochs, neg, burn_in, batch_size, reg, seed=0):\n", " \"\"\"Train a poincare embedding using gensim implementation\n", " \n", " Args:\n", " data_file (str): Path to tsv file containing relation pairs\n", " output_file (str): Path to output file containing model\n", " dim (int): Number of dimensions of the trained model\n", " epochs (int): Number of epochs to use\n", " neg (int): Number of negative samples to use\n", " burn_in (int): Number of epochs to use for burn-in initialization\n", " batch_size (int): Size of batch to use for training\n", " reg (float): Coefficient used for l2-regularization while training\n", " \n", " Notes: \n", " If `output_file` already exists, skips training\n", " \"\"\"\n", " if os.path.exists(output_file):\n", " print('File %s exists, skipping' % output_file)\n", " return\n", " train_data = PoincareRelations(data_file)\n", " model = PoincareModel(train_data, size=dim, negative=neg, burn_in=burn_in, regularization_coeff=reg)\n", " model.train(epochs=epochs, batch_size=batch_size)\n", " model.save(output_file)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "non_default_params_gensim = [\n", " {'neg': 10,},\n", " {'burn_in': 10,},\n", " {'batch_size': 50,},\n", " {'neg': 10, 'reg': 1, 'burn_in': 10, 'epochs': 200},\n", "]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:loading relations from train data..\n", "WARNING:smart_open.smart_open_lib:this function is deprecated, use smart_open.open instead\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Training model gensim_model_batch_size_10_burn_in_0_epochs_50_neg_20_reg_0.0 of size 5\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:loaded 743241 relations from train data, 82114 nodes\n", "INFO:gensim.models.poincare:training model of size 5 with 1 workers on 743241 relations for 50 epochs and 0 burn-in epochs, using lr=0.10000 burn-in lr=0.01000 negative=20\n", "INFO:gensim.models.poincare:starting training (50 epochs)----------------------------------------\n", "INFO:gensim.models.poincare:training on epoch 1, examples #9990-#10000, loss: 30.71\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4749.76 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #19990-#20000, loss: 30.62\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4909.96 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #29990-#30000, loss: 30.55\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4855.20 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #39990-#40000, loss: 30.49\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4912.49 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #49990-#50000, loss: 30.44\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 
2.05 s, 4887.18 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #59990-#60000, loss: 30.39\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4700.32 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #69990-#70000, loss: 30.35\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4834.01 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #79990-#80000, loss: 30.31\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4806.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #89990-#90000, loss: 30.28\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5037.74 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #99990-#100000, loss: 30.24\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5029.46 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #109990-#110000, loss: 30.22\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5111.02 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #119990-#120000, loss: 30.20\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4915.07 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #129990-#130000, loss: 30.18\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.28 s, 4383.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #139990-#140000, loss: 30.14\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4619.44 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #149990-#150000, loss: 30.12\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4489.14 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #159990-#160000, loss: 30.09\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4831.86 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #169990-#170000, loss: 30.06\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.95 s, 5120.16 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #179990-#180000, loss: 30.06\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.55 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #189990-#190000, loss: 30.03\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.23 s, 4480.38 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #199990-#200000, loss: 29.99\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.16 s, 4630.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #209990-#210000, loss: 29.98\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.43 s, 4106.87 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #219990-#220000, loss: 29.96\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4587.86 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #229990-#230000, loss: 29.93\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4963.33 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #239990-#240000, loss: 29.92\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.13 s, 4701.92 examples / s\n", 
"INFO:gensim.models.poincare:training on epoch 1, examples #249990-#250000, loss: 29.89\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4744.32 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #259990-#260000, loss: 29.88\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.30 s, 4344.27 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #269990-#270000, loss: 29.86\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3860.48 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #279990-#280000, loss: 29.84\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3889.60 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #289990-#290000, loss: 29.81\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.29 s, 4357.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #299990-#300000, loss: 29.78\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4823.16 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #309990-#310000, loss: 29.74\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4990.55 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #319990-#320000, loss: 29.75\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4864.94 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #329990-#330000, loss: 29.71\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4809.07 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #339990-#340000, loss: 29.68\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4787.32 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #349990-#350000, loss: 29.68\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.25 s, 4451.75 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #359990-#360000, loss: 29.65\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4092.97 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #369990-#370000, loss: 29.63\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3867.21 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #379990-#380000, loss: 29.62\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.51 s, 3980.40 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #389990-#390000, loss: 29.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.24 s, 4463.61 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #399990-#400000, loss: 29.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5001.52 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #409990-#410000, loss: 29.55\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.10 s, 4754.73 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #419990-#420000, loss: 29.53\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4533.89 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #429990-#430000, loss: 29.47\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4778.84 examples / s\n", "INFO:gensim.models.poincare:training 
on epoch 1, examples #439990-#440000, loss: 29.46\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.18 s, 4586.84 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #449990-#450000, loss: 29.46\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.12 s, 4713.23 examples / s\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:training on epoch 1, examples #459990-#460000, loss: 29.43\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4918.35 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #469990-#470000, loss: 29.41\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4671.67 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #479990-#480000, loss: 29.41\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4679.77 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #489990-#490000, loss: 29.38\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.20 s, 4545.91 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #499990-#500000, loss: 29.35\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.52 s, 3969.84 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #509990-#510000, loss: 29.30\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.38 s, 4198.87 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #519990-#520000, loss: 29.32\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4092.34 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #529990-#530000, loss: 29.27\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.57 s, 3885.30 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #539990-#540000, loss: 29.26\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.07 s, 4838.04 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #549990-#550000, loss: 29.21\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.88 s, 3469.95 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #559990-#560000, loss: 29.18\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.95 s, 3389.43 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #569990-#570000, loss: 29.17\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.67 s, 3742.41 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #579990-#580000, loss: 29.12\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.98 s, 3351.91 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #589990-#590000, loss: 29.10\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.95 s, 3387.14 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #599990-#600000, loss: 29.07\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.27 s, 3054.74 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #609990-#610000, loss: 29.04\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.82 s, 3551.59 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #619990-#620000, loss: 29.00\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.94 s, 3402.01 examples / s\n", 
"INFO:gensim.models.poincare:training on epoch 1, examples #629990-#630000, loss: 29.00\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.53 s, 3948.00 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #639990-#640000, loss: 28.97\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.26 s, 3068.72 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #649990-#650000, loss: 28.91\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3663.22 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #659990-#660000, loss: 28.94\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.47 s, 2879.20 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #669990-#670000, loss: 28.88\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3662.08 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #679990-#680000, loss: 28.85\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.91 s, 3432.97 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #689990-#690000, loss: 28.84\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.12 s, 3202.25 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #699990-#700000, loss: 28.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.52 s, 2837.12 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #709990-#710000, loss: 28.72\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.91 s, 3437.84 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #719990-#720000, loss: 28.72\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.34 s, 4277.28 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #729990-#730000, loss: 28.68\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.27 s, 3056.02 examples / s\n", "INFO:gensim.models.poincare:training on epoch 1, examples #739990-#740000, loss: 28.65\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.59 s, 3862.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #9990-#10000, loss: 28.36\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.32 s, 3015.39 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #19990-#20000, loss: 28.38\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4098.29 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #29990-#30000, loss: 28.33\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.24 s, 3082.26 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #39990-#40000, loss: 28.24\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.37 s, 4215.99 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #49990-#50000, loss: 28.24\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.26 s, 3065.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #59990-#60000, loss: 28.20\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.47 s, 4056.48 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #69990-#70000, loss: 28.14\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.92 s, 3424.05 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, 
examples #79990-#80000, loss: 28.11\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.10 s, 3225.82 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #89990-#90000, loss: 28.12\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.87 s, 3485.15 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #99990-#100000, loss: 28.07\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.04 s, 3290.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #109990-#110000, loss: 28.01\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.30 s, 4350.37 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #119990-#120000, loss: 27.98\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.93 s, 3410.48 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #129990-#130000, loss: 27.93\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.88 s, 3476.53 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #139990-#140000, loss: 27.88\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.85 s, 3509.82 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #149990-#150000, loss: 27.86\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.89 s, 3455.85 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #159990-#160000, loss: 27.81\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.47 s, 4049.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #169990-#170000, loss: 27.80\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.89 s, 3462.15 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #179990-#180000, loss: 27.76\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.02 s, 3309.25 examples / s\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:training on epoch 2, examples #189990-#190000, loss: 27.72\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.40 s, 4170.28 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #199990-#200000, loss: 27.67\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.29 s, 3041.13 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #209990-#210000, loss: 27.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4856.18 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #219990-#220000, loss: 27.52\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.05 s, 3280.16 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #229990-#230000, loss: 27.55\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.67 s, 3743.50 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #239990-#240000, loss: 27.49\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.74 s, 3646.28 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #249990-#250000, loss: 27.43\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.97 s, 3371.91 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #259990-#260000, loss: 27.42\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.14 s, 4667.36 examples / s\n", 
"INFO:gensim.models.poincare:training on epoch 2, examples #269990-#270000, loss: 27.34\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.97 s, 3368.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #279990-#280000, loss: 27.31\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.79 s, 3580.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #289990-#290000, loss: 27.29\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.61 s, 3830.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #299990-#300000, loss: 27.24\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.17 s, 3150.43 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #309990-#310000, loss: 27.21\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.44 s, 4090.56 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #319990-#320000, loss: 27.17\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.84 s, 3520.51 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #329990-#330000, loss: 27.07\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.06 s, 3264.34 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #339990-#340000, loss: 27.06\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.61 s, 3826.93 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #349990-#350000, loss: 26.97\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.00 s, 3328.54 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #359990-#360000, loss: 26.94\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 3.12 s, 3207.03 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #369990-#370000, loss: 26.87\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.33 s, 4296.87 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #379990-#380000, loss: 26.88\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.11 s, 4729.59 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #389990-#390000, loss: 26.78\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4997.96 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #399990-#400000, loss: 26.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4942.94 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #409990-#410000, loss: 26.69\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4938.90 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #419990-#420000, loss: 26.69\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.06 s, 4861.93 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #429990-#430000, loss: 26.69\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5017.22 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #439990-#440000, loss: 26.55\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5031.36 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #449990-#450000, loss: 26.53\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5088.95 examples / s\n", "INFO:gensim.models.poincare:training 
on epoch 2, examples #459990-#460000, loss: 26.56\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5020.40 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #469990-#470000, loss: 26.41\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5017.54 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #479990-#480000, loss: 26.40\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4909.68 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #489990-#490000, loss: 26.28\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4943.52 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #499990-#500000, loss: 26.20\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4794.14 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #509990-#510000, loss: 26.19\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4875.88 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #519990-#520000, loss: 26.21\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5053.06 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #529990-#530000, loss: 26.08\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5075.26 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #539990-#540000, loss: 26.09\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4992.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #549990-#550000, loss: 26.09\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4949.35 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #559990-#560000, loss: 25.96\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.05 s, 4884.55 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #569990-#570000, loss: 25.94\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4952.91 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #579990-#580000, loss: 25.85\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4944.52 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #589990-#590000, loss: 25.86\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4940.39 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #599990-#600000, loss: 25.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5019.82 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #609990-#610000, loss: 25.73\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4921.34 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #619990-#620000, loss: 25.66\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4970.34 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #629990-#630000, loss: 25.61\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4992.80 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #639990-#640000, loss: 25.55\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4969.25 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #649990-#650000, 
loss: 25.57\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5024.53 examples / s\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:training on epoch 2, examples #659990-#660000, loss: 25.53\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4898.76 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #669990-#670000, loss: 25.38\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5047.43 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #679990-#680000, loss: 25.42\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4961.59 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #689990-#690000, loss: 25.30\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4949.78 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #699990-#700000, loss: 25.27\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4917.41 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #709990-#710000, loss: 25.24\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5010.41 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #719990-#720000, loss: 25.14\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4948.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #729990-#730000, loss: 25.10\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5015.81 examples / s\n", "INFO:gensim.models.poincare:training on epoch 2, examples #739990-#740000, loss: 25.10\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4997.04 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #9990-#10000, loss: 24.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.01 s, 4983.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #19990-#20000, loss: 24.52\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5148.98 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #29990-#30000, loss: 24.51\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.03 s, 4915.44 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #39990-#40000, loss: 24.37\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5037.22 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #49990-#50000, loss: 24.37\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4995.40 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #59990-#60000, loss: 24.29\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5063.77 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #69990-#70000, loss: 24.23\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4906.60 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #79990-#80000, loss: 24.14\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5077.38 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #89990-#90000, loss: 24.08\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5107.00 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #99990-#100000, 
loss: 24.04\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.99 s, 5035.09 examples / s\n", "INFO:gensim.models.poincare:training on epoch 3, examples #739990-#740000, loss: 20.82\n", "INFO:gensim.models.poincare:training on epoch 4, examples #739990-#740000, loss: 17.44\n", "INFO:gensim.models.poincare:training on epoch 5, examples #739990-#740000, loss: 14.70\n", "INFO:gensim.models.poincare:training on epoch 6, examples #739990-#740000, loss: 13.06\n", "INFO:gensim.models.poincare:training on epoch 7, examples #739990-#740000, loss: 11.97\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:training on epoch 8, examples #19990-#20000, loss: 11.96\n",
"INFO:gensim.models.poincare:time taken for 10000 examples: 2.72 s, 3678.58 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #29990-#30000, loss: 11.99\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.90 s, 3446.74 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #39990-#40000, loss: 11.81\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.39 s, 4183.87 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #49990-#50000, loss: 11.84\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.73 s, 3660.45 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #59990-#60000, loss: 11.78\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.87 s, 3487.22 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #69990-#70000, loss: 11.79\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.58 s, 3874.98 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #79990-#80000, loss: 11.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.96 s, 5106.28 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #89990-#90000, loss: 11.50\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.21 s, 4520.77 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #99990-#100000, loss: 11.84\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.69 s, 3712.38 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #109990-#110000, loss: 11.68\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.09 s, 4776.22 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #119990-#120000, loss: 11.71\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5221.46 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #129990-#130000, loss: 11.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5217.92 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #139990-#140000, loss: 11.77\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5322.30 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #149990-#150000, loss: 11.68\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.98 s, 5061.23 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #159990-#160000, loss: 11.73\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5312.49 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #169990-#170000, loss: 11.58\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.85 s, 5396.36 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #179990-#180000, loss: 11.64\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5283.63 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #189990-#190000, loss: 11.76\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5373.89 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #199990-#200000, loss: 11.62\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5286.74 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #209990-#210000, loss: 11.66\n", "INFO:gensim.models.poincare:time taken for 10000 
examples: 1.93 s, 5171.18 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #219990-#220000, loss: 11.57\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5317.64 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #229990-#230000, loss: 11.56\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5383.91 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #239990-#240000, loss: 11.70\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.91 s, 5238.90 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #249990-#250000, loss: 11.43\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5300.40 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #259990-#260000, loss: 11.70\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5288.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #269990-#270000, loss: 11.60\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.97 s, 5083.33 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #279990-#280000, loss: 11.63\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5195.28 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #289990-#290000, loss: 11.42\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 4996.08 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #299990-#300000, loss: 11.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.00 s, 5009.45 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #309990-#310000, loss: 11.63\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.90 s, 5264.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #319990-#320000, loss: 11.56\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5316.93 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #329990-#330000, loss: 11.47\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5167.73 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #339990-#340000, loss: 11.51\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.87 s, 5337.89 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #349990-#350000, loss: 11.54\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5389.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #359990-#360000, loss: 11.49\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5362.69 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #369990-#370000, loss: 11.43\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.04 s, 4896.47 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #379990-#380000, loss: 11.45\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4814.46 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #389990-#390000, loss: 11.48\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.08 s, 4818.25 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #399990-#400000, loss: 11.38\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 2.02 s, 4960.60 examples / 
s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #409990-#410000, loss: 11.45\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.86 s, 5365.93 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #419990-#420000, loss: 11.59\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5206.10 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #429990-#430000, loss: 11.39\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.88 s, 5322.43 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #439990-#440000, loss: 11.45\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.89 s, 5301.80 examples / s\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "INFO:gensim.models.poincare:training on epoch 8, examples #449990-#450000, loss: 11.45\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.91 s, 5242.85 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #459990-#460000, loss: 11.30\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.94 s, 5149.58 examples / s\n", "INFO:gensim.models.poincare:training on epoch 8, examples #469990-#470000, loss: 11.45\n", "INFO:gensim.models.poincare:time taken for 10000 examples: 1.92 s, 5211.54 examples / s\n" ] } ], "source": [ "model_files['gensim'] = {}\n", "# Train models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_file, model_sizes, 'gensim_model', 'gensim')\n", "model_files['gensim'][model_name] = {}\n", "for dim, filepath in files.items():\n", " model_files['gensim'][model_name][dim] = filepath\n", "# Train models with non-default params\n", "for new_params in non_default_params_gensim:\n", " params = default_params.copy()\n", " params.update(new_params)\n", " model_name, files = train_model_with_params(params, wordnet_file, model_sizes, 'gensim_model', 'gensim')\n", " model_files['gensim'][model_name] = {}\n", " for dim, filepath in files.items():\n", " model_files['gensim'][model_name][dim] = filepath" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 3. 
Loading the embeddings" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def transform_cpp_embedding_to_kv(input_file, output_file, encoding='utf8'):\n", "    \"\"\"Given a C++ embedding tsv filepath, converts it to a KeyedVectors-supported file\"\"\"\n", "    with smart_open(input_file, 'rb') as f:\n", "        lines = [line.decode(encoding) for line in f]\n", "    if not len(lines):\n", "        raise ValueError(\"file is empty\")\n", "    first_line = lines[0]\n", "    parts = first_line.rstrip().split(\"\\t\")\n", "    model_size = len(parts) - 1\n", "    vocab_size = len(lines)\n", "    with smart_open(output_file, 'w') as f:\n", "        f.write('%d %d\\n' % (vocab_size, model_size))\n", "        for line in lines:\n", "            f.write(line.replace('\\t', ' '))\n", "\n", "def transform_numpy_embedding_to_kv(input_file, output_file, encoding='utf8'):\n", "    \"\"\"Given a numpy poincare embedding pkl filepath, converts it to a KeyedVectors-supported file\"\"\"\n", "    with open(input_file, 'rb') as f:\n", "        np_embeddings = pickle.load(f)\n", "    random_embedding = np_embeddings[list(np_embeddings.keys())[0]]\n", "    \n", "    model_size = random_embedding.shape[0]\n", "    vocab_size = len(np_embeddings)\n", "    with smart_open(output_file, 'w') as f:\n", "        f.write('%d %d\\n' % (vocab_size, model_size))\n", "        for key, vector in np_embeddings.items():\n", "            vector_string = ' '.join('%.6f' % value for value in vector)\n", "            f.write('%s %s\\n' % (key, vector_string))\n", "\n", "def load_poincare_cpp(input_filename):\n", "    \"\"\"Load embedding trained via C++ Poincare model.\n", "\n", "    Parameters\n", "    ----------\n", "    input_filename : str\n", "        Path to tsv file containing embedding.\n", "\n", "    Returns\n", "    -------\n", "    PoincareKeyedVectors instance.\n", "\n", "    \"\"\"\n", "    keyed_vectors_filename = input_filename + '.kv'\n", "    transform_cpp_embedding_to_kv(input_filename, keyed_vectors_filename)\n", "    embedding = PoincareKeyedVectors.load_word2vec_format(keyed_vectors_filename)\n", "    os.unlink(keyed_vectors_filename)\n", "    return embedding\n", "\n", "def load_poincare_numpy(input_filename):\n", "    \"\"\"Load embedding trained via Python numpy Poincare model.\n", "\n", "    Parameters\n", "    ----------\n", "    input_filename : str\n", "        Path to pkl file containing embedding.\n", "\n", "    Returns\n", "    -------\n", "    PoincareKeyedVectors instance.\n", "\n", "    \"\"\"\n", "    keyed_vectors_filename = input_filename + '.kv'\n", "    transform_numpy_embedding_to_kv(input_filename, keyed_vectors_filename)\n", "    embedding = PoincareKeyedVectors.load_word2vec_format(keyed_vectors_filename)\n", "    os.unlink(keyed_vectors_filename)\n", "    return embedding\n", "\n", "def load_poincare_gensim(input_filename):\n", "    \"\"\"Load embedding trained via Gensim PoincareModel.\n", "\n", "    Parameters\n", "    ----------\n", "    input_filename : str\n", "        Path to model file.\n", "\n", "    Returns\n", "    -------\n", "    PoincareKeyedVectors instance.\n", "\n", "    \"\"\"\n", "    model = PoincareModel.load(input_filename)\n", "    return model.kv\n", "\n", "def load_model(implementation, model_file):\n", "    \"\"\"Convenience function over functions to load models from different implementations.\n", "\n", "    Parameters\n", "    ----------\n", "    implementation : str\n", "        Implementation used to create model file ('c++'/'numpy'/'gensim').\n", "    model_file : str\n", "        Path to model file.\n", "\n", "    Returns\n", "    -------\n", "    PoincareKeyedVectors instance\n", "\n", "    Notes\n", "    -----\n", "    Raises ValueError in case of invalid value for `implementation`\n", "\n", "    \"\"\"\n", "    if implementation == 'c++':\n",
"        return load_poincare_cpp(model_file)\n", "    elif implementation == 'numpy':\n", "        return load_poincare_numpy(model_file)\n", "    elif implementation == 'gensim':\n", "        return load_poincare_gensim(model_file)\n", "    else:\n", "        raise ValueError('Invalid implementation %s' % implementation)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 4. Evaluation" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def display_results(task_name, results):\n", "    \"\"\"Display evaluation results of multiple embeddings on a single task in a tabular format\n", "    \n", "    Args:\n", "        task_name (str): name of the task being evaluated\n", "        results (dict): mapping between embeddings and corresponding results\n", "    \n", "    \"\"\"\n", "    result_table = PrettyTable()\n", "    result_table.field_names = [\"Model Description\", \"Metric\"] + [str(dim) for dim in sorted(model_sizes)]\n", "    for model_name, model_results in results.items():\n", "        metrics = [metric for metric in model_results.keys()]\n", "        dims = sorted([dim for dim in model_results[metrics[0]].keys()])\n", "        description = model_description_from_name(model_name)\n", "        row = [description, '\\n'.join(metrics) + '\\n']\n", "        for dim in dims:\n", "            scores = ['%.2f' % model_results[metric][dim] for metric in metrics]\n", "            row.append('\\n'.join(scores))\n", "        result_table.add_row(row)\n", "    result_table.align = 'r'\n", "    result_html = result_table.get_html_string()\n", "    # Insert a title row: the task name over the first two columns, 'Dimensions' over the rest\n", "    search = \"<table>\"\n", "    insert_at = result_html.index(search) + len(search)\n", "    new_row = \"\"\"\n", "    <tr>\n", "        <th colspan=\"2\" style=\"text-align:center\">%s</th>\n", "        <th colspan=\"%d\" style=\"text-align:center\">Dimensions</th>\n", "    </tr>\n", "    \"\"\" % (task_name, len(model_sizes))\n", "    result_html = result_html[:insert_at] + new_row + result_html[insert_at:]\n", "    display(HTML(result_html))\n", "    \n", "def model_description_from_name(model_name):\n", "    if model_name.startswith('gensim'):\n", "        implementation = 'Gensim'\n", "    elif model_name.startswith('cpp'):\n", "        implementation = 'C++'\n", "    elif model_name.startswith('np'):\n", "        implementation = 'Numpy'\n", "    else:\n", "        raise ValueError('Unsupported implementation for model: %s' % model_name)\n", "    description = []\n", "    for param_key in sorted(default_params.keys()):\n", "        pattern = '%s_([^_]*)_?' % param_key\n", "        match = re.search(pattern, model_name)\n", "        if match:\n", "            description.append(\"%s=%s\" % (param_key, match.groups()[0]))\n", "    return \"%s: %s\" % (implementation, \", \".join(description))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.1 WordNet reconstruction" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For this task, embeddings are learnt using the entire transitive closure of the WordNet noun hypernym hierarchy. Subsequently, for every hypernym pair `(u, v)`, the rank of `v` amongst all nodes that do not have a positive edge with `u` is computed. The final metric `mean_rank` is the average of all these ranks. The `MAP` metric is the mean, over all nodes `u`, of the Average Precision of the ranking of the positive nodes of `u`.\n", "\n", "Note that this task tests the representation capacity of the learnt embeddings, and not their generalization ability."
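] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To make these two metrics concrete, the snippet below computes `mean_rank` and the Average Precision by hand for a single toy node with three positive relations. The ranks used here are made up purely for illustration and are not produced by any of the trained models; the Average Precision is computed in the standard way from the ranks of the positive nodes." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Toy illustration of the reconstruction metrics (made-up ranks, not model output)\n", "# Suppose a node u has 3 positive relations, ranked 1, 3 and 8 amongst the candidate nodes\n", "toy_ranks = [1, 3, 8]\n", "\n", "# mean_rank: the average of the ranks of the positive nodes\n", "toy_mean_rank = sum(toy_ranks) / len(toy_ranks)\n", "\n", "# Average Precision: precision at each positive node's rank, averaged over the positives;\n", "# MAP averages this quantity over all nodes (here there is only a single node)\n", "toy_avg_precision = sum((i + 1) / rank for i, rank in enumerate(sorted(toy_ranks))) / len(toy_ranks)\n", "\n", "print('mean_rank: %.2f, average precision: %.2f' % (toy_mean_rank, toy_avg_precision))"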
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "reconstruction_results = OrderedDict()\n", "metrics = ['mean_rank', 'MAP']" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [], "source": [ "for implementation, models in sorted(model_files.items()):\n", " for model_name, files in models.items():\n", " if model_name in reconstruction_results:\n", " continue\n", " reconstruction_results[model_name] = OrderedDict()\n", " for metric in metrics:\n", " reconstruction_results[model_name][metric] = {}\n", " for model_size, model_file in files.items():\n", " print('Evaluating model %s of size %d' % (model_name, model_size))\n", " embedding = load_model(implementation, model_file)\n", " eval_instance = ReconstructionEvaluation(wordnet_file, embedding)\n", " eval_result = eval_instance.evaluate(max_n=1000)\n", " for metric in metrics:\n", " reconstruction_results[model_name][metric][model_size] = eval_result[metric]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "display_results('WordNet Reconstruction', reconstruction_results)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Results from the paper -\n", "![Reconstruction Results](https://raw.githubusercontent.com/RaRe-Technologies/gensim/poincare_model_keyedvectors/docs/notebooks/poincare/reconstruction_paper.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The figures above illustrate a few things - \n", "1. The gensim implementation does significantly better for all model sizes and hyperparameters than both the other implementations.\n", "2. The results from the original paper have not been achieved by our implementation. Especially for models with lower dimensions, the paper mentions significantly better mean rank and MAP for the reconstruction task.\n", "3. Using burn-in and regularization leads to much better results with low model sizes, however the results do not improve significantly with increasing model size. This might have to do with tuning the regularization coefficient, which the paper does not mention." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.2 WordNet link prediction" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This task is similar to the reconstruction task described above, except that the list of relations is split into a training and testing set, and the mean rank reported is for the edges in the test set.\n", "\n", "Therefore, this tests the ability of the model to predict unseen edges between nodes, i.e. 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### 4.2.1 Preparing data" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def train_test_split(data_file, test_ratio=0.1):\n", "    \"\"\"Creates train and test files from given data file, returns train/test file names\n", "    \n", "    Args:\n", "        data_file (str): path to data file for which train/test split is to be created\n", "        test_ratio (float): fraction of lines to be used for test data\n", "    \n", "    Returns:\n", "        (train_file, test_file): tuple of strings with train file and test file paths\n", "    \"\"\"\n", "    train_filename = data_file + '.train'\n", "    test_filename = data_file + '.test'\n", "    if os.path.exists(train_filename) and os.path.exists(test_filename):\n", "        print('Train and test files already exist, skipping')\n", "        return (train_filename, test_filename)\n", "    root_nodes, leaf_nodes = get_root_and_leaf_nodes(data_file)\n", "    test_line_candidates = []\n", "    line_count = 0\n", "    all_nodes = set()\n", "    with smart_open(data_file, 'rb') as f:\n", "        for i, line in enumerate(f):\n", "            node_1, node_2 = line.split()\n", "            all_nodes.update([node_1, node_2])\n", "            if (\n", "                node_1 not in leaf_nodes\n", "                and node_2 not in leaf_nodes\n", "                and node_1 not in root_nodes\n", "                and node_2 not in root_nodes\n", "                and node_1 != node_2\n", "            ):\n", "                test_line_candidates.append(i)\n", "            line_count += 1\n", "\n", "    num_test_lines = int(test_ratio * line_count)\n", "    if num_test_lines > len(test_line_candidates):\n", "        raise ValueError('Not enough candidate relations for test set')\n", "    print('Choosing %d test lines from %d candidates' % (num_test_lines, len(test_line_candidates)))\n", "    test_line_indices = set(random.sample(test_line_candidates, num_test_lines))\n", "    train_line_indices = set(l for l in range(line_count) if l not in test_line_indices)\n", "    \n", "    train_set_nodes = set()\n", "    with smart_open(data_file, 'rb') as f:\n", "        train_file = smart_open(train_filename, 'wb')\n", "        test_file = smart_open(test_filename, 'wb')\n", "        for i, line in enumerate(f):\n", "            if i in train_line_indices:\n", "                train_set_nodes.update(line.split())\n", "                train_file.write(line)\n", "            elif i in test_line_indices:\n", "                test_file.write(line)\n", "            else:\n", "                raise AssertionError('Line %d not present in either train or test line indices' % i)\n", "    train_file.close()\n", "    test_file.close()\n", "    assert len(train_set_nodes) == len(all_nodes), 'Not all nodes from dataset present in train set relations'\n", "    return (train_filename, test_filename)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def get_root_and_leaf_nodes(data_file):\n", "    \"\"\"Return keys of root and leaf nodes from a file with transitive closure relations\n", "    \n", "    Args:\n", "        data_file (str): file path containing transitive closure relations\n", "    \n", "    Returns:\n", "        (root_nodes, leaf_nodes) - tuple containing keys of root and leaf nodes\n", "    \"\"\"\n", "    root_candidates = set()\n", "    leaf_candidates = set()\n", "    with smart_open(data_file, 'rb') as f:\n", "        for line in f:\n", "            nodes = line.split()\n", "            root_candidates.update(nodes)\n", "            leaf_candidates.update(nodes)\n", "    \n", "    with smart_open(data_file, 'rb') as f:\n", "        for line in f:\n", "            node_1, node_2 = line.split()\n", "            if node_1 == node_2:\n", "                continue\n",
"            leaf_candidates.discard(node_2)\n", "            root_candidates.discard(node_1)\n", "    \n", "    return (root_candidates, leaf_candidates)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "wordnet_train_file, wordnet_test_file = train_test_split(wordnet_file)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### 4.2.2 Training models" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Training models for link prediction\n", "lp_model_files = {}" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "lp_model_files['c++'] = {}\n", "# Train c++ models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'cpp_lp_model', 'c++')\n", "lp_model_files['c++'][model_name] = {}\n", "for dim, filepath in files.items():\n", "    lp_model_files['c++'][model_name][dim] = filepath\n", "# Train c++ models with non-default params\n", "for param, values in non_default_params.items():\n", "    params = default_params.copy()\n", "    for value in values:\n", "        params[param] = value\n", "        model_name, files = train_model_with_params(params, wordnet_train_file, model_sizes, 'cpp_lp_model', 'c++')\n", "        lp_model_files['c++'][model_name] = {}\n", "        for dim, filepath in files.items():\n", "            lp_model_files['c++'][model_name][dim] = filepath" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "lp_model_files['numpy'] = {}\n", "# Train numpy models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'np_lp_model', 'numpy')\n", "lp_model_files['numpy'][model_name] = {}\n", "for dim, filepath in files.items():\n", "    lp_model_files['numpy'][model_name][dim] = filepath" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "lp_model_files['gensim'] = {}\n", "# Train models with default params\n", "model_name, files = train_model_with_params(default_params, wordnet_train_file, model_sizes, 'gensim_lp_model', 'gensim')\n", "lp_model_files['gensim'][model_name] = {}\n", "for dim, filepath in files.items():\n", "    lp_model_files['gensim'][model_name][dim] = filepath\n", "# Train models with non-default params, also on the training split only\n", "for new_params in non_default_params_gensim:\n", "    params = default_params.copy()\n", "    params.update(new_params)\n", "    model_name, files = train_model_with_params(params, wordnet_train_file, model_sizes, 'gensim_lp_model', 'gensim')\n", "    lp_model_files['gensim'][model_name] = {}\n", "    for dim, filepath in files.items():\n", "        lp_model_files['gensim'][model_name][dim] = filepath" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### 4.2.3 Evaluating models" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "lp_results = OrderedDict()\n", "metrics = ['mean_rank', 'MAP']" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "for implementation, models in sorted(lp_model_files.items()):\n", "    for model_name, files in models.items():\n", "        lp_results[model_name] = OrderedDict()\n", "        for metric in metrics:\n", "            lp_results[model_name][metric] = {}\n", "        for model_size, model_file in files.items():\n", "            print('Evaluating model %s of size %d' % (model_name, model_size))\n", "            embedding = load_model(implementation, model_file)\n", "            eval_instance = LinkPredictionEvaluation(wordnet_train_file, wordnet_test_file, embedding)\n",
"            eval_result = eval_instance.evaluate(max_n=1000)\n", "            for metric in metrics:\n", "                lp_results[model_name][metric][model_size] = eval_result[metric]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "display_results('WordNet Link Prediction', lp_results)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Results from the paper -\n", "![Link Prediction Paper](https://raw.githubusercontent.com/RaRe-Technologies/gensim/poincare_model_keyedvectors/docs/notebooks/poincare/link_prediction_paper.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "These results follow trends similar to the reconstruction results. Repeating them here for ease of reading - \n", "1. The gensim implementation does significantly better than both the other implementations for all model sizes and hyperparameters.\n", "2. Our implementation does not achieve the results reported in the original paper. Especially for models with lower dimensions, the paper reports significantly better mean rank and MAP for the link prediction task.\n", "3. Using burn-in and regularization leads to better results with low model sizes; however, the results do not improve significantly with increasing model size.\n", "\n", "The main difference from the reconstruction results is that the mean ranks for link prediction are usually slightly worse than the corresponding reconstruction ranks. This is to be expected, as link prediction is performed on a held-out test set." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.3 HyperLex Lexical Entailment" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The Lexical Entailment task is performed using the HyperLex dataset, a collection of 2163 noun pairs and scores that denote \"To what degree is noun X a type of noun Y\". For example - \n",
"\n", "`girl person 9.85`\n", "\n", "These scores are out of 10.\n", "\n", "The [Spearman's correlation score](https://en.wikipedia.org/wiki/Spearman%27s_rank_correlation_coefficient) is computed for the predicted and actual similarity scores, with the models trained on the entire WordNet noun hierarchy.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "entailment_results = OrderedDict()\n", "eval_instance = LexicalEntailmentEvaluation(hyperlex_file)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [], "source": [ "for implementation, models in sorted(model_files.items()):\n", "    for model_name, files in models.items():\n", "        if model_name in entailment_results:\n", "            continue\n", "        entailment_results[model_name] = OrderedDict()\n", "        entailment_results[model_name]['spearman'] = {}\n", "        for model_size, model_file in files.items():\n", "            print('Evaluating model %s of size %d' % (model_name, model_size))\n", "            embedding = load_model(implementation, model_file)\n", "            entailment_results[model_name]['spearman'][model_size] = eval_instance.evaluate_spearman(embedding)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "display_results('Lexical Entailment (HyperLex)', entailment_results)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Results from the paper (for Poincaré Embeddings, as well as other embeddings from previous papers) - \n", "![LE Results](https://raw.githubusercontent.com/RaRe-Technologies/gensim/poincare_model_keyedvectors/docs/notebooks/poincare/entailment_paper.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Some observations - \n", "1. We achieve a maximum Spearman score of 0.48, fairly close to the Spearman score of 0.512 mentioned in the paper.\n", "2. The best results are obtained with 20 negative examples, a batch size of 10, and no burn-in; however, the differences are too small to draw a meaningful conclusion.\n", "\n", "However, there are a few ambiguities and caveats - \n", "1. The paper does not mention which hyperparameters and model size were used for the above-mentioned result. It is therefore possible that the result was achieved with a significantly lower model size than the one we use, which would imply that our implementation still has some way to go.\n", "2. The same word can have multiple nodes in the WordNet dataset for different senses of the word, and it is unclear in the paper how to decide which node to pick. For the above results, we have gone with the sane default of picking the particular sense that has the maximum similarity score with the target word.\n", "3. Certain words in the HyperLex dataset seem to be absent from the WordNet data - the paper does not mention this issue or how it was handled. Pairs containing missing words have been omitted from the evaluation (182/2163).\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 4.4 Link Prediction on Collaboration Networks\n", "\n", "The paper also describes a variant of the Poincaré model to learn embeddings of nodes in a symmetric graph, unlike the WordNet noun hierarchy, which is directed and asymmetric. The datasets used in the paper for this model are scientific collaboration networks, in which the nodes are researchers and an edge represents that the two researchers have co-authored a paper.\n", "\n", "This variant has not been implemented yet, and is therefore not a part of our experiments. A minimal sketch of the edge probability the paper uses for such symmetric graphs is included below for reference."
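] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For reference, the sketch below shows the edge probability the paper uses for such symmetric graphs - a Fermi-Dirac function of the Poincaré distance between the two node embeddings. This is only an illustrative sketch: the radius `r` and temperature `t` values and the example vectors are arbitrary placeholders, not settings from the paper or from any of the implementations above." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "\n", "def poincare_distance(u, v):\n", "    \"\"\"Poincare distance between two points inside the unit ball.\"\"\"\n", "    euclidean_sq = np.sum((u - v) ** 2)\n", "    u_norm_sq, v_norm_sq = np.sum(u ** 2), np.sum(v ** 2)\n", "    return np.arccosh(1 + 2 * euclidean_sq / ((1 - u_norm_sq) * (1 - v_norm_sq)))\n", "\n", "def symmetric_edge_probability(u, v, r=1.0, t=0.1):\n", "    \"\"\"Fermi-Dirac probability of an undirected edge between nodes embedded at u and v.\n", "\n", "    The radius r and temperature t are placeholder hyperparameters, not tuned values.\n", "    \"\"\"\n", "    return 1.0 / (np.exp((poincare_distance(u, v) - r) / t) + 1.0)\n", "\n", "# Two arbitrary example points in the 2-dimensional Poincare ball\n", "print(symmetric_edge_probability(np.array([0.1, 0.2]), np.array([0.15, 0.25])))"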
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 5. Next Steps" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "1. The model can be investigated further to understand why it doesn't produce results as good as the paper. It is possible that this might be due to training details not present in the paper, or due to us incorrectly interpreting some ambiguous parts of the paper. We have not been able to clarify all such ambiguities in communication with the authors.\n", "2. Optimizing the training process further - with a model size of 50 dimensions and a dataset with ~700k relations and ~80k nodes, the Gensim implementation takes around 45 seconds to complete an epoch (~15k relations per second), whereas the open source C++ implementation takes around 1/6th the time (~95k relations per second).\n", "3. Implementing the variant of the model mentioned in the paper for symmetric graphs and evaluating on the scientific collaboration datasets described earlier in the report." ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.1" } }, "nbformat": 4, "nbformat_minor": 2 }