{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "\n", "\n", "\n", "\n", "# BHSA version mappings\n", "\n", "In this notebook we map the nodes between all the extant versions of the BHSA dataset.\n", "\n", "The resulting mappings can be used for writing version independent programs that process\n", "the BHSA data.\n", "Those programs can only be version independent to a certain extent, because\n", "in general, node mappings between versions cannot be made perfect.\n", "\n", "If one imagines what may change between versions, it seems intractable to make a device that overcomes\n", "differences in the encoding of the text and its syntax.\n", "However, we are dealing with versions of a very stable text that is linguistically annotated by means\n", "of a consistent method, so there is reason to be optimistic.\n", "This notebook shows that this optimism is well-founded.\n", "\n", "In another notebook,\n", "[version Phrases](versionPhrases.ipynb),\n", "we show how one can use the mappings to analyse phrase encodings across versions of the data.\n", "\n", "# Overview\n", "We create the mappings in two distinct stages, each being based on a particular insight, and dealing with\n", "a set of difficult cases.\n", "\n", "* [Slot nodes](#Slot-nodes): we restrict ourselves to the *slot* nodes,\n", " the nodes that correspond to the individual words;\n", "* [Nodes in general](#Nodes-in-general): we extend the slot mapping in a generic way to\n", " a mapping between all nodes.\n", " Those other nodes are the ones that correspond to higher level textual objects, such as phrases, clauses,\n", " sentences.\n", "\n", "This is a big notebook; here are links to some locations in the computation.\n", "\n", "* [start of the computation](#Computing)\n", "* [start of making slot mappings](#Making-slot-mappings)\n", "* [start of expanding them to node mappings](#Extending-to-node-mappings)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Nodes, 
edges, mappings\n", "\n", "In the\n", "[text-fabric data model](https://github.com/Dans-labs/text-fabric/wiki/Data-model),\n", "nodes correspond to the objects in the text and its syntax, and edges correspond to relationships between\n", "those objects.\n", "Normally, these edges are **intra**-dataset: they are between nodes in the same dataset.\n", "\n", "Now, each version of the BHSA in text-fabric is its own dataset.\n", "The mappings between nodes of one version and corresponding nodes in another version are\n", "**inter**-dataset edges.\n", "\n", "Nodes in text-fabric are abstract: they are just numbers,\n", "starting with 1 for the first slot (word),\n", "increasing by one for each slot up to the last slot,\n", "and then just continuing beyond that for the non-slot nodes.\n", "\n", "So an edge is just a mapping between numbers, and it is perfectly possible to have just any mapping\n", "between numbers in a dataset.\n", "\n", "We store mappings as ordinary TF edge features, so you can use the mapping in both ways, by\n", "\n", "```\n", "nodesInVersion4 = Es('omap@3-4').f(nodeInVersion3)\n", "nodesInVersion3 = Es('omap@3-4').t(nodeInVersion4)\n", "```\n", "\n", "respectively.\n", "When one version supersedes another, we store the mapping between the older and newer version\n", "as an edge in the new version, leaving the older version untouched.\n", "\n", "We store the node mapping with a bit more information than the mere correspondence between nodes.\n", "We also add an integer to each correspondence which indicates how problematic that correspondence is.\n", "\n", "If the correspondence is perfect, we do not add any value.\n", "If it is a simple discrepancy, confined to an equal number of slots in both versions, we add the value `0`.\n", "If the discrepancy is more complicated, we add a higher number.\n", "The details of this will follow." 
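, "\n",
"As an illustration of this convention (not part of the notebook's own code), a small helper could translate an `omap` edge value into a verdict; `value` stands for whatever the edge carries for a pair of nodes:\n",
"\n",
"```python\n",
"def mapQuality(value):\n",
"    # no value at all: the correspondence is perfect\n",
"    if value is None:\n",
"        return \"perfect\"\n",
"    # 0: a simple discrepancy, confined to an equal number of slots\n",
"    if value == 0:\n",
"        return \"simple discrepancy\"\n",
"    # anything else: a more complicated discrepancy\n",
"    return \"complicated\"\n",
"```\n"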
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Slot nodes\n", "\n", "The basic idea in creating a slot mapping is to walk through the slots of both versions in parallel,\n", "and upon encountering a difference, to take one of a few prescribed actions that may lead to catching up\n", "slots in one of the two versions.\n", "\n", "The standard behaviour is to stop at each difference encountered, unless the difference conforms\n", "to a \"predefined\" case. When there is no match, the user may add a case to the list of cases.\n", "Sometimes a different, systematic kind of case must be added, and that requires programming.\n", "\n", "This notebook shows the patterns and the very small lists of cases that were needed to do the job for 4\n", "version transitions, each corresponding to 1 year or more of encoding activity.\n", "\n", "## Differences\n", "\n", "When we compare versions, our aim is not to record all differences in general, but to record\n", "the correspondence between the slots of the versions, and exactly where and how this\n", "correspondence is disturbed.\n", "\n", "We use the lexeme concept as an anchor point for the correspondence.\n", "If we compare the two versions, slot by slot, and as long as we encounter the same lexemes,\n", "we have an undisturbed correspondence.\n", "In fact, we relax this a little bit, because the very concept of lexeme might change between versions.\n", "So we reduce the information in lexemes considerably before we compare them, so that we\n", "do not get disturbed by petty changes.\n", "\n", "While being undisturbed, we just create an edge from the slot we are at in the one version\n", "to the slot we are at in the other version,\n", "and we assign no value to such an edge.\n", "\n", "But eventually, we encounter real disturbances.\n", "They manifest themselves in just a few situations:\n", "\n", "1. ![1](diffs/diffs.001.png)\n", "2. ![2](diffs/diffs.002.png)\n", "3. 
![3](diffs/diffs.003.png)\n", "\n", "In full generality, we can say:\n", "$i$ slots in the source version $V$ correspond\n", "to $j$ slots in the target version $W$,\n", "where $i$ and $j$ may be 0, but not at the same time:\n", "\n", "1. ![4](diffs/diffs.004.png)\n", "\n", "If $i$ slots in version $V$, starting at $n$,\n", "get replaced by $j$ slots in version $W$, starting at $m$,\n", "we create edges between all $n, ..., n+i-1$ on the one hand\n", "and all $m, ..., m+j-1$ on the other hand,\n", "and associate them all with the same number $j-i$.\n", "\n", "But so far, it turns out that the only things we have to deal with\n", "are specific instances of 1, 2, and 3 above.\n", "\n", "We have a closer look at those cases.\n", "\n", "### Lexeme change\n", "When a lexeme changes at a particular spot $n, m$,\n", "we have $i=j=1$, leading to exactly one edge $(n, m)$ with value $0$.\n", "\n", "### Slot splitting\n", "When slot $n\\in V$ splits into $m, ..., m+j-1 \\in W$, we create edges from $n$ to each of the $m, ..., m+j-1$,\n", "each carrying the number $j$. The larger $j$ is,\n", "the greater the dissimilarity between node $n\\in V$\n", "and each of the $m, ..., m+j-1 \\in W$.\n", "\n", "### Slot collapse\n", "When slots $n, ..., n+i-1 \\in V$ collapse into $m\\in W$, we create edges from each of the $n, ..., n+i-1$ to $m$,\n", "each carrying the number $-i$. The larger $i$ is,\n", "the greater the dissimilarity between the nodes $n, ..., n+i-1\\in V$\n", "and $m \\in W$.\n", "\n", "### Slot deletion\n", "When slot $n$ is deleted from $V$, we have $i=1, j=0$, leading to zero edges from $n$.\n", "But so far, we have not encountered this case.\n", "\n", "### Slot addition\n", "When slot $m$ is added to $W$, we have $i=0, j=1$, again leading to zero edges to $m$.\n", "But so far, we have not encountered this case." 
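, "\n",
"The general rule can be sketched as a tiny function (illustration only, not used by the notebook): $i$ slots starting at $n$ are replaced by $j$ slots starting at $m$, and every edge gets the value $j-i$:\n",
"\n",
"```python\n",
"def replacementEdges(n, i, m, j):\n",
"    # every slot in n..n+i-1 is linked to every slot in m..m+j-1,\n",
"    # all with the same value j - i\n",
"    return {(a, b): j - i for a in range(n, n + i) for b in range(m, m + j)}\n",
"```\n"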
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Nodes in general\n", "The basic idea we use for the general case is that nodes are linked to slots.\n", "In text-fabric, the standard `oslots` edge feature lists for each non-slot node the slots it is linked to.\n", "\n", "Combining the just created slot mappings between versions and the `oslots` feature,\n", "we can extend the slot mapping into a general node mapping.\n", "\n", "In order to map a node $n$ in version $V$, we look at its slots $s$,\n", "use the already established *slot mapping* to map these to slots $t$ in version $W$,\n", "and collect the nodes $m$ in version $W$ that are linked to those $t$.\n", "They are good candidates for the mapping.\n", "\n", "![5](diffs/diffs.005.png)\n", "\n", "# Refinements\n", "\n", "When we try to match nodes across versions, based on slot containment, we also respect\n", "their `otype`s. So we will not try to match a `clause` to a `phrase`.\n", "We make implicit use of the fact that for most `otype`s, the members contain disjoint slots.\n", "\n", "# Multiple candidates\n", "Of course, things are not always as neat as in the diagram. 
Textual objects may have split, or shifted,\n", "or collapsed.\n", "In general we find 0 or more candidates.\n", "Even if we find exactly one candidate, it does not have to be a perfect match.\n", "A typical situation is this:\n", "\n", "![6](diffs/diffs.006.png)\n", "\n", "We do not find a node $m\\in W$ that occupies the mapped slots exactly.\n", "Instead, we find that the target area is split between two candidates that\n", "also reach outside the target area.\n", "\n", "In such cases, we make edges to all such candidates, but we add a dissimilarity measure.\n", "If $m$ is the collection of slots mapped from $n$, and $m_1$ is a candidate for $n$, meaning $m_1$ has\n", "overlap with $m$, then the *dissimilarity* of $m_1$ is defined as:\n", "\n", "$$|m_1\\cup m| - |m_1\\cap m|$$\n", "\n", "In words: the number of slots in the union of $m_1$ and $m$ minus the number of slots in their intersection.\n", "\n", "In other words: $m_1$ gets a penalty for\n", "\n", "* each slot $s\\in m_1$ that is not in the mapped slots $m$;\n", "* each mapped slot $t\\in m$ that is not in $m_1$.\n", "\n", "If a candidate occupies exactly the mapped slots, the dissimilarity is 0.\n", "If there is only one such candidate of the right type, the case is completely clear, and we\n", "do not add a dissimilarity value to the edge.\n", "\n", "If there are more candidates, all of them will get an edge, and those edges will contain the dissimilarity\n", "value, even if that value is $0$.\n", "\n", "\n", "# Subphrases\n", "The most difficult type to handle in our dataset is the `subphrase`,\n", "because subphrases nest and overlap.\n", "But it turns out that the dissimilarity measure almost always helps out: when looking for candidates\n", "for a mapped subphrase, usually one of them has a dissimilarity of 0.\n", "That's the real counterpart.\n", "\n", "# Reporting\n", "We report the success in establishing the match between non-slot nodes.\n", "We do so per node type, and for each node type we list a few 
statistics,\n", "both in absolute numbers and as a percentage of the total number of nodes of that\n", "type in the source version.\n", "\n", "We count the nodes that fall in each of the following cases.\n", "The list of cases is ordered by decreasing success of the mapping.\n", "\n", "1. **unique, perfect**: there is only one match for the mapping and it is a perfect one in terms\n", " of slots linked to it;\n", "2. **multiple, one perfect**: there are multiple matches, but at least one is perfect; this occurs\n", " typically if nodes of a type are linked to nested and overlapping sequences of slots, such as `subphrase`s;\n", "3. **unique, imperfect**: there is only one match, but it is not perfect; this indicates that some\n", " boundary reorganization has happened between the two versions, and that some slots of the source node\n", " have been cut off in the target node; yet the fact that the source node and the\n", " target node correspond is clear;\n", "4. **multiple, cleanly composed**: in this case the source node corresponds to a bunch of matches that\n", " together cleanly cover the mapped slots of the source node; in other words: the original node\n", " has been split into several parts;\n", "5. **multiple, non-perfect**: all remaining cases where there are matches; these situations can be the\n", " result of more intrusive changes; if it turns out to be a small set, they do require closer inspection;\n", "6. **not mapped**: these are nodes for which no match could be found." 
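, "\n",
"The decision between these cases can be sketched as follows (a hypothetical helper, assuming the candidates are given as a dict from target node to its set of slots):\n",
"\n",
"```python\n",
"def classify(mapped, candidates):\n",
"    # mapped: the set of slots the source node maps to\n",
"    # candidates: dict of target node -> set of its slots\n",
"    if not candidates:\n",
"        return \"not mapped\"\n",
"    perfect = [c for (c, s) in candidates.items() if s == mapped]\n",
"    if len(candidates) == 1:\n",
"        return \"unique, perfect\" if perfect else \"unique, imperfect\"\n",
"    if perfect:\n",
"        return \"multiple, one perfect\"\n",
"    union = set().union(*candidates.values())\n",
"    total = sum(len(s) for s in candidates.values())\n",
"    if union == mapped and total == len(mapped):\n",
"        return \"multiple, cleanly composed\"\n",
"    return \"multiple, non-perfect\"\n",
"```\n"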
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Computing" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "%load_ext autoreload\n", "%autoreload 2" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "import os\n", "import collections\n", "from functools import reduce\n", "from tf.dataset.nodemaps import caption\n", "from tf.fabric import Fabric" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We specify our versions and the subtle differences between them as far as they are relevant." ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "REPO = os.path.expanduser(\"~/github/etcbc/bhsa\")\n", "baseDir = \"{}/tf\".format(REPO)\n", "tempDir = \"{}/_temp\".format(REPO)\n", "SILENT = \"auto\"\n", "\n", "versions = \"\"\"\n", " 3\n", " 4\n", " 4b\n", " 2016\n", " 2017\n", " c\n", "\"\"\".strip().split()\n", "\n", "# work only with selected versions\n", "# remove this if you want to work with all versions\n", "versions = \"\"\"\n", " 2017\n", " 2021\n", "\"\"\".strip().split()\n", "\n", "versionInfo = {\n", " \"\": dict(\n", " OCC=\"g_word\",\n", " LEX=\"lex\",\n", " ),\n", " \"3\": dict(\n", " OCC=\"text_plain\",\n", " LEX=\"lexeme\",\n", " ),\n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Load all versions in one go!" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "lines_to_next_cell": 2 }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "..............................................................................................\n", ". 0.00s Version -> 2017 <- loading ... 
.\n", "..............................................................................................\n", "This is Text-Fabric 10.2.0\n", "Api reference : https://annotation.github.io/text-fabric/tf/cheatsheet.html\n", "\n", "115 features found and 0 ignored\n", "..............................................................................................\n", ". 1.85s Version -> 2021 <- loading ... .\n", "..............................................................................................\n", "This is Text-Fabric 10.2.0\n", "Api reference : https://annotation.github.io/text-fabric/tf/cheatsheet.html\n", "\n", "116 features found and 0 ignored\n", "..............................................................................................\n", ". 3.69s All versions loaded .\n", "..............................................................................................\n" ] } ], "source": [ "TF = {}\n", "api = {}\n", "for v in versions:\n", " for (param, value) in versionInfo.get(v, versionInfo[\"\"]).items():\n", " globals()[param] = value\n", " caption(4, \"Version -> {} <- loading ...\".format(v), silent=SILENT)\n", " TF[v] = Fabric(locations=\"{}/{}\".format(baseDir, v), modules=[\"\"], silent=SILENT)\n", " api[v] = TF[v].load(\"{} {}\".format(OCC, LEX)) # noqa F821\n", "caption(4, \"All versions loaded\", silent=SILENT)" ] }, { "cell_type": "markdown", "metadata": { "lines_to_next_cell": 2 }, "source": [ "We want to switch easily between the APIs for the versions." ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "def activate(v):\n", " for (param, value) in versionInfo.get(v, versionInfo[\"\"]).items():\n", " globals()[param] = value\n", " api[v].makeAvailableIn(globals())\n", " caption(4, \"Active version is now -> {} <-\".format(v), silent=SILENT)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Inspect the amount of slots in all versions." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "..............................................................................................\n", ". 7.49s Active version is now -> 2017 <- .\n", "..............................................................................................\n", "..............................................................................................\n", ". 7.49s Active version is now -> 2021 <- .\n", "..............................................................................................\n" ] } ], "source": [ "nSlots = {}\n", "for v in versions:\n", " activate(v)\n", " nSlots[v] = F.otype.maxSlot\n", " caption(0, \"\\t {} slots\".format(nSlots[v]))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Method\n", "\n", "When we compare two versions, we inspect the lexemes found at corresponding positions in the versions.\n", "We start at the beginning, and when the lexemes do not match, we have a closer look.\n", "\n", "However, in order not to be disturbed by minor discrepancies in the lexemes, we mask the lexemes: we\n", "apply a few transformations to them, such as removing alephs and waws, and finally even turning them into\n", "ordered sets of letters, thereby losing the order and multiplicity of letters.\n", "We also strip the disambiguation marks.\n", "\n", "We maintain a current mapping between the slots of the two versions, and we update it if we encounter\n", "disturbances.\n", "Initially, this map is the identity map.\n", "\n", "What we encounter as remaining differences boils down to the following:\n", "\n", "* a lexeme is split into two lexemes with the same total material, typically involving `H`, `MN`, or `B`\n", "* the lexeme is part of a special case, listed in the `cases` table (which has been found by repeatedly\n", " chasing for the first remaining difference)\n", "* both lexemes differ, but that's it, 
no map updates have to be done.\n", "\n", "The first two types of cases can be solved by splitting a lexeme into `k` parts or combining `k` lexemes into one.\n", "After that the mapping has to be shifted to the right or to the left from a certain point onward.\n", "\n", "The loop then is as follows:\n", "\n", "* find the first slot with a lexeme in the first version that is different from the lexeme at the mapped slot\n", " in the second version\n", "* analyse what is the case:\n", " * if the disturbance is recognized on the basis of existing patterns and cases, update the map and\n", " consider this case solved\n", " * if the disturbance is not recognized, the case is unsolved, and we break out of the loop.\n", " More analysis is needed, and the outcome of that has to be coded as an extra pattern or case.\n", "* if the status is solved, go back to the first step\n", "\n", "We end up with a mapping from the slots of the first version to those of the other version that links\n", "slots with approximately equal lexemes together." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Making slot mappings\n", "## Lexeme masking\n", "We start by defining our masking function, and compile lists of all lexemes and masked lexemes for all versions." 
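, "\n",
"The driver loop described under Method can be sketched in isolation (with hypothetical `findDiff` and `solveCase` callables, just to show the control flow):\n",
"\n",
"```python\n",
"def walkDiffs(findDiff, solveCase, start=1):\n",
"    # findDiff(start): position of the next discrepancy, or None\n",
"    # solveCase(pos): new start position if the case is recognized, else None\n",
"    while True:\n",
"        pos = findDiff(start)\n",
"        if pos is None:\n",
"            return \"solved\"\n",
"        start = solveCase(pos)\n",
"        if start is None:\n",
"            return f\"unsolved at {pos}\"\n",
"```\n"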
] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "masks = [\n", " (lambda lex: lex.rstrip(\"[/=\"), \"strip disambiguation\"),\n", " (lambda lex: lex[0:-2] if lex.endswith(\"JM\") else lex, \"remove JM\"),\n", " (lambda lex: lex[0:-2] if lex.endswith(\"WT\") else lex, \"remove WT\"),\n", " (lambda lex: lex.replace(\"J\", \"\"), \"remove J\"),\n", " (lambda lex: lex.replace(\">\", \"\"), \"remove Alef\"),\n", " (lambda lex: lex.replace(\"W\", \"\"), \"remove W\"),\n", " (lambda lex: lex.replace(\"Z\", \"N\"), \"identify Z and N\"),\n", " (lambda lex: lex.rstrip(\"HT\"), \"strip HT\"),\n", " (\n", " lambda lex: (\"\".join(sorted(set(lex)))) + \"_\" * lex.count(\"_\"),\n", " \"ignore order and multiplicity\",\n", " ),\n", "]\n", "\n", "\n", "def mask(lex, trans=None):\n", " \"\"\"Apply a single masking operation or apply them all.\n", " \n", " Parameters\n", " ----------\n", " lex: string\n", " The text of the lexeme\n", " trans: integer, optional `None`\n", " If given, it is an index in the `masks` list, and the corresponding\n", " mask transformation will be applied to `lex`.\n", " If `None`, all transformations in the `masks` list will be applied in that order.\n", " \n", " Returns\n", " -------\n", " string\n", " The result of transforming `lex`\n", " \"\"\"\n", " if trans is not None:\n", " return masks[trans][0](lex)\n", " for (fun, desc) in masks:\n", " lex = fun(lex)\n", " return lex" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Carry out the lexeme masking for all versions." ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "..............................................................................................\n", ". 
12s Masking lexemes .\n", "..............................................................................................\n", "..............................................................................................\n", ". 12s Active version is now -> 2017 <- .\n", "..............................................................................................\n", "..............................................................................................\n", ". 13s Active version is now -> 2021 <- .\n", "..............................................................................................\n" ] } ], "source": [ "lexemes = {}\n", "\n", "caption(4, \"Masking lexemes\", silent=SILENT)\n", "for v in versions:\n", " activate(v)\n", " lexemes[v] = collections.OrderedDict()\n", " for n in F.otype.s(\"word\"):\n", " lex = Fs(LEX).v(n) # noqa F821\n", " lexemes[v][n] = (lex, mask(lex, trans=0), mask(lex))\n", "caption(0, \"Done\", silent=SILENT)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now for each version `v`, `lexemes[v]` is a mapping from word nodes `n` \n", "to lexeme information of the word at node `n`.\n", "The lexeme information is a tuple with members\n", "\n", "* **full lexeme** the full disambiguated lexeme\n", "* **lexeme** the lexeme without the disambiguation marks\n", "* **masked lexeme** the fully transformed lexeme" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "# Cases and mappings\n", "In `cases` we store special cases that we stumbled upon.\n", "Every time we encountered a disturbance which did not follow a recognized pattern,\n", "we turned it into a case.\n", "The number is the slot number in the first version where the case will be applied.\n", "Cases will only be applied at these exact slot numbers and nowhere else.\n", "\n", "In `mappings` we build a mapping between corresponding nodes across a pair of versions.\n", "At some of those correspondences there are disturbances, there we add a 
measure of the\n", "dissimilarity to the mapped pair.\n", "\n", "Later, we extend those slot mappings to *node* mappings, which are maps between versions where\n", "*all* nodes get mapped, not just slot nodes.\n", "We deliver those node mappings as formal edges in TF.\n", "Then these edges will be added in the second version, so that each newer version knows\n", "how to link to the previous version.\n", "We build the node maps in `edges`.\n", "\n", "We store the dissimilarities in a separate dictionary, `dissimilarity`.\n", "\n", "All these dictionaries are keyed by a 2-tuple of versions." ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "cases = {}\n", "mappings = {}\n", "dissimilarity = {}\n", "edges = {}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Algorithm\n", "\n", "Here is the code that directly implements the method.\n", "Every pair of distinct versions can be mapped.\n", "We store the mappings in a dictionary, keyed by tuples like `(4, 4b)`,\n", "for the mapping from version `4` to `4b`, for instance.\n", "\n", "The loop is in `doDiffs` below." 
] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "def inspect(v, w, start, end):\n", " \"\"\"Helper function for inspecting the situation in a given range of slots.\n", " \n", " Parameters\n", " ----------\n", " v: string\n", " First version\n", " w: string\n", " Second version\n", " start: integer\n", " Slot number (in first version) where we start the inspection.\n", " end: integer\n", " Slot number (in first version) where we end the inspection.\n", " \n", " Returns\n", " -------\n", " None\n", " The situation will be printed as a table with a row for each slot\n", " and columns:\n", " slot number in version 1,\n", " lexeme of that slot in version 1,\n", " lexeme of the corresponding slot in version 2\n", " \"\"\"\n", " mapKey = (v, w)\n", " mapping = mappings[mapKey]\n", " version1Info = versionInfo.get(v, versionInfo[\"\"])\n", " version2Info = versionInfo.get(w, versionInfo[\"\"])\n", " \n", " for slot in range(start, end):\n", " print(\n", " \"{:>6}: {:<8} {:<8}\".format(\n", " slot,\n", " api[v].Fs(version1Info[\"LEX\"]).v(slot),\n", " api[w].Fs(version2Info[\"LEX\"]).v(mapping[slot]),\n", " )\n", " )\n", "\n", "\n", "def inspect2(v, w, slot, k):\n", " \"\"\"Helper function for inspecting the edges in a given range of slots.\n", " \n", " Not used, currently.\n", " \n", " Parameters\n", " ----------\n", " v: string\n", " First version\n", " w: string\n", " Second version\n", " slot: integer\n", " Slot number (in first version) in the center of the inspection\n", " k: integer\n", " Amount of slots left and right from the center where we inspect.\n", " \n", " Returns\n", " -------\n", " None\n", " The situation will be printed as a table with a row for each slot\n", " and columns:\n", " slot number in version 1,\n", " the edge at that slot number, or X if there is no edge\n", " \"\"\"\n", " mapKey = (v, w)\n", " edge = edges[mapKey]\n", " for i in range(slot - k, slot + k + 1):\n", " print(f\"EDGE {i} =>\", 
edge.get(i, \"X\"))\n", "\n", "\n", "def firstDiff(v, w, start):\n", " \"\"\"Find the first discrepancy after a given position.\n", " \n", " First we walk quickly through the slots of the first version,\n", " until we reach the starting position.\n", " \n", " Then we continue walking until the current slot is either\n", " \n", " * a special case\n", " * a discrepancy\n", " \n", " Parameters\n", " ----------\n", " v: string\n", " First version\n", " w: string\n", " Second version\n", " start: integer\n", " start position\n", " \n", " Returns\n", " -------\n", " int or None\n", " If there is no discrepancy, None is returned,\n", " otherwise the position of the first discrepancy.\n", " \"\"\"\n", " mapKey = (v, w)\n", " mapping = mappings[mapKey]\n", " theseCases = cases[mapKey]\n", "\n", " fDiff = None\n", " for (slot, (lex1, bareLex1, maskedLex1)) in lexemes[v].items():\n", " if slot < start:\n", " continue\n", " maskedLex2 = lexemes[w][mapping[slot]][2]\n", " if slot in theseCases or maskedLex1 != maskedLex2:\n", " fDiff = slot\n", " break\n", " return fDiff\n", "\n", "\n", "def printDiff(v, w, slot, k):\n", " \"\"\"Prints the situation around a discrepancy.\n", " \n", " We also show phrase atom boundaries.\n", " We show the bare lexemes in the display, not the masked lexemes.\n", " \n", " Parameters\n", " ----------\n", " v: string\n", " First version\n", " w: string\n", " Second version\n", " slot: integer\n", " position of the discrepancy\n", " k: integer\n", " number of slots around the discrepancy to include in the display\n", " \n", " Returns\n", " -------\n", " None\n", " A plain text display of the situation around the discrepancy is printed.\n", " \"\"\"\n", " \n", " mapKey = (v, w)\n", " mapping = mappings[mapKey]\n", " comps = {}\n", " prevChunkV = None\n", " prevChunkW = None\n", " \n", " # gather the comparison material in comps\n", " # which has as keys the versions and as value a list of display items\n", " \n", " for i in range(slot - k, slot + k + 1):\n", " # determine 
if we are at a phrase atom boundary in version 1\n", " chunkV = None if i not in mapping else api[v].L.u(i, otype=\"phrase_atom\")\n", " boundaryV = prevChunkV is not None and prevChunkV != chunkV\n", " prevChunkV = chunkV\n", " # determine if we are at the actual discrepancy in version 1\n", " currentV = i == slot\n", "\n", " # determine if we are at a phrase atom boundary in version 2\n", " j = mapping.get(i, None)\n", " chunkW = None if j is None else api[w].L.u(j, otype=\"phrase_atom\")\n", " boundaryW = prevChunkW is not None and prevChunkW != chunkW\n", " prevChunkW = chunkW\n", " # determine if we are at the actual discrepancy in version 2\n", " currentW = j == mapping[slot]\n", "\n", " lvTuple = lexemes[v].get(i, None)\n", " lwTuple = None if j is None else lexemes[w].get(j, None)\n", " lv = \"□\" if lvTuple is None else lvTuple[1] # bare lexeme\n", " lw = \"□\" if lwTuple is None else lwTuple[1] # bare lexeme\n", "\n", " comps.setdefault(v, []).append((lv, currentV, boundaryV))\n", " comps.setdefault(w, []).append((lw, currentW, boundaryW))\n", " \n", " # turn the display items into strings and store them in rep\n", " # which is also keyed by the versions\n", " \n", " rep = {}\n", " for version in comps:\n", " rep[version] = printVersion(version, comps[version])\n", "\n", " # compose the display out of the strings per version\n", " # and make a header of sectional information and slot positions\n", " \n", " print(\n", " \"\"\"{} {}:{} ==> slot {} ==> {}\n", " {}\n", " {}\n", "\"\"\".format(\n", " *api[v].T.sectionFromNode(slot),\n", " slot,\n", " mapping[slot],\n", " rep[v],\n", " rep[w],\n", " )\n", " )\n", "\n", "\n", "def printVersion(v, comps):\n", " \"\"\"Generate a string displaying a stretch of lexemes around a position.\n", " \n", " Parameters\n", " ----------\n", " comps: list of tuple\n", " For each slot there is a comp tuple consisting of\n", " \n", " * the bare lexeme\n", " * whether the slot is in the discrepancy position\n", " * whether the 
slot is at a phrase atom boundary\n", " \n", " Returns\n", " -------\n", " string\n", " A sequence of lexemes with boundary characters in between.\n", " \"\"\"\n", " \n", " rep = \"\"\n", " for (lex, isCurrent, boundary) in comps:\n", " rep += \"┫┣\" if boundary else \"╋\"\n", " rep += f\"▶{lex}◀\" if isCurrent else lex\n", " rep += \"╋\"\n", " return rep" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "# `doDiffs`\n", "\n", "This function contains the loop to walk through all differences.\n", "\n", "We walk from discrepancy to discrepancy, and stop if there are no more discrepancies or when we\n", "have reached an artificial upper boundary of discrepancies.\n", "\n", "We try to solve the discrepancies.\n", "If we hit a discrepancy that we cannot solve, we break out of the loop too.\n", "\n", "## `MAX_ITER`\n", "\n", "The artificial limit is `MAX_ITER`.\n", "You determine it experimentally.\n", "Keep it low at first, when you are meeting the initial discrepancies.\n", "When you have dealt with them and discover that you can deal with that number of discrepancies,\n", "increase the limit.\n", "\n", "## Cases\n", "\n", "We will encounter discrepancies, and we will learn how to solve them.\n", "There are some generic ways of solving them, and these we collect in a dictionary of cases.\n", "\n", "The keys of the cases are either slot positions or lexemes.\n", "\n", "When the algorithm walks through the corpus, it will consider slots\n", "whose number or whose lexeme is in the cases as solved.\n", "\n", "The value of a case is a tuple consisting of\n", "\n", "* the name of an *action*\n", "* a parameter\n", "\n", "Here are the actions:\n", "\n", "key | action | parameters | description\n", "--- | --- | --- | ---\n", "slot | `ok` | `None` | the discrepancy is OK, nothing to worry about; we set the dissimilarity to 0, which is worse than `None`\n", "slot | `split` | `n` integer | split the lexeme in version 1 into `n` lexemes in version 2; set the 
dissimilarity to `n`\n", " slot | `collapse` | `n` integer | collapse `n` lexemes in version 1 into one lexeme in version 2; dissimilarity `-n`\n", " lex | `ok` | `alt` string | the discrepancy is OK if version 2 has *alt* instead of *lex*; dissimilarity set to 0\n", " lex | `split` | `n` integer | split *lex* in version 1 into `n` extra slots in version 2; set the dissimilarity to `n`\n", "\n", " If a discrepancy falls through all these special cases, we have a few generic rules that will also be applied:\n", "\n", " * if a lexeme in version 1 contains `_`, we split on it and treat the parts as separate lexemes.\n", " In fact, we perform the action `split` with as parameter the number of `_` characters, which is one less than the number of parts.\n", " * if the lex in version 1 equals the lex in version 2 plus the next lex in version 2, and if the lex in version 2 is `H`,\n", " we split the lex in version 1 into that `H` and the rest.\n", " * if the set of letters in the masked lexeme in version 1 is the union of the sets of letters of the corresponding masked lexeme\n", " in version 2 and of the next masked lexeme in version 2, and if the corresponding lexeme in version 2 is either `B` or `MN`,\n", " we split the lex in version 1 into that `B` or `MN` and the rest.\n", " \n", " Note that these rules are very corpus-dependent, and have been distilled from experience with the BHSA versions involved.\n", " If you are applying this algorithm to other corpora, you can leave out these rules and add your\n", " own, depending on what you encounter."
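, "\n",
    "\n",
    "The three generic rules can be sketched in code as follows. This is a minimal sketch, not part of the original workflow: `genericSplit` is a hypothetical helper name, and `doDiffs` applies the same tests inline.\n",
    "\n",
    "```python\n",
    "def genericSplit(lex1, maskedLex1, lex2, lex2next, maskedLex2, maskedLex2next):\n",
    "    # rule 1: split on `_`: one extra slot per underscore\n",
    "    if \"_\" in lex1:\n",
    "        return (\"split\", lex1.count(\"_\"))\n",
    "    # rule 2: split the article H off the front\n",
    "    if lex1 == lex2 + lex2next and lex2 == \"H\":\n",
    "        return (\"split\", 1)\n",
    "    # rule 3: split the preposition B or MN off, comparing sets of masked letters\n",
    "    if set(maskedLex1) == set(maskedLex2) | set(maskedLex2next) and lex2 in (\"B\", \"MN\"):\n",
    "        return (\"split\", 1)\n",
    "    return None  # not covered by a generic rule: needs an explicit case\n",
    "```"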
] }, { "cell_type": "code", "execution_count": 11, "metadata": { "lines_to_end_of_cell_marker": 2 }, "outputs": [], "source": [ "MAX_ITER = 250\n", "\n", "\n", "def doDiffs(v, w):\n", " mapKey = (v, w)\n", " \n", " thisDissimilarity = {}\n", " dissimilarity[mapKey] = thisDissimilarity\n", " \n", " thisMapping = dict(((n, n) for n in api[v].F.otype.s(\"word\")))\n", " mappings[mapKey] = thisMapping\n", " \n", " theseCases = cases.get(mapKey, {})\n", "\n", " iteration = 0\n", " start = 1\n", "\n", " solved = True\n", "\n", " while True:\n", " # try to find the next difference from where you are now\n", " n = firstDiff(v, w, start)\n", "\n", " if n is None:\n", " print(f\"No more differences.\\nFound {iteration} points of disturbance\")\n", " break\n", "\n", " if iteration > MAX_ITER:\n", " print(\"There might be more disturbances: increase MAX_ITER\")\n", " break\n", "\n", " iteration += 1\n", " \n", " # there is a difference: we have to do work\n", " # we print it as a kind of logging\n", " \n", " printDiff(v, w, n, 5)\n", "\n", " # we try to solve the discrepancy\n", " # first we gather the information about the lexemes at this position in both versions\n", " \n", " (lex1, bareLex1, maskedLex1) = lexemes[v][n]\n", " (lex2, bareLex2, maskedLex2) = lexemes[w][thisMapping[n]]\n", " \n", " # and at the next position\n", " \n", " (lex1next, bareLex1next, maskedLex1next) = lexemes[v][n + 1]\n", " (lex2next, bareLex2next, maskedLex2next) = lexemes[w][thisMapping[n + 1]]\n", "\n", " # the discrepancy is not solved unless we find it in a case or in a rule\n", " solved = None\n", " skip = 0\n", " \n", " # first check the explicit cases\n", " \n", " if n in theseCases:\n", " (action, param) = theseCases[n]\n", " if action == \"collapse\":\n", " plural = \"\" if param == 1 else \"s\"\n", " solved = f\"{action} {param} fewer slot{plural}\"\n", " thisDissimilarity[n] = -param\n", " skip = param\n", " for m in range(api[v].F.otype.maxSlot, n + param, -1):\n", " thisMapping[m]
= thisMapping[m - param]\n", " for m in range(n + 1, n + param + 1):\n", " thisMapping[m] = thisMapping[n]\n", " elif action == \"split\":\n", " plural = \"\" if param == 1 else \"s\"\n", " solved = f\"{action} into {param} extra slot{plural}\"\n", " thisDissimilarity[n] = param\n", " for m in range(n + 1, api[v].F.otype.maxSlot + 1):\n", " thisMapping[m] = thisMapping[m] + param\n", " elif action == \"ok\":\n", " solved = \"incidental variation in lexeme\"\n", " thisDissimilarity[n] = 0\n", " elif lex1 in theseCases:\n", " (action, param) = theseCases[lex1]\n", " if action == \"ok\":\n", " if lex2 == param:\n", " solved = \"systematic variation in lexeme\"\n", " thisDissimilarity[n] = 0\n", " elif action == \"split\":\n", " plural = \"\" if param == 1 else \"s\"\n", " solved = f\"systematic {action} into {param} extra slot{plural}\"\n", " thisDissimilarity[n] = param\n", " for m in range(n + 1, api[v].F.otype.maxSlot + 1):\n", " thisMapping[m] = thisMapping[m] + param\n", " \n", " # then try some more general rules\n", " \n", " elif \"_\" in lex1:\n", " action = \"split\"\n", " param = lex1.count(\"_\")\n", " plural = \"\" if param == 1 else \"s\"\n", " solved = f\"{action} on _ into {param} extra slot{plural}\"\n", " thisDissimilarity[n] = param\n", " for m in range(n + 1, api[v].F.otype.maxSlot + 1):\n", " thisMapping[m] = thisMapping[m] + param\n", " elif lex1 == lex2 + lex2next:\n", " if lex2 == \"H\":\n", " solved = \"split article off\"\n", " thisDissimilarity[n] = 1\n", " for m in range(n + 1, api[v].F.otype.maxSlot + 1):\n", " thisMapping[m] = thisMapping[m] + 1\n", " elif set(maskedLex1) == set(maskedLex2) | set(maskedLex2next):\n", " if lex2 == \"B\" or lex2 == \"MN\":\n", " solved = \"split preposition off\"\n", " thisDissimilarity[n] = 1\n", " for m in range(n + 1, api[v].F.otype.maxSlot + 1):\n", " thisMapping[m] = thisMapping[m] + 1\n", " print(f\"Action: {solved if solved else 'BLOCKED'}\\n\")\n", "\n", " # stop the loop if the discrepancy is not 
solved\n", " # The discrepancy has already been printed to the output,\n", " # so you can see immediately what is happening there\n", " \n", " if not solved:\n", " break\n", "\n", " # if the discrepancy was solved, \n", " # advance to the first position after the discrepancy\n", " # and try to find a new discrepancy in the next iteration\n", " start = n + 1 + skip\n", "\n", " if not solved:\n", " print(f\"Blocking difference in {iteration} iterations\")" ] }, { "cell_type": "markdown", "metadata": { "lines_to_next_cell": 2 }, "source": [ "The mappings themselves are needed elsewhere in Text-Fabric, so let us write them to file.\n", "We write them into the dataset corresponding to the target version.\n", "So the map `3-4` ends up in version `4`." ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "def edgesFromMaps():\n", " edges.clear()\n", " for ((v, w), mp) in sorted(mappings.items()):\n", " caption(4, \"Make edge from slot mapping {} => {}\".format(v, w), silent=SILENT)\n", "\n", " edge = {}\n", " dm = dissimilarity[(v, w)]\n", "\n", " for n in range(1, api[v].F.otype.maxSlot + 1):\n", " m = mp[n]\n", " k = dm.get(n, None)\n", " if k is None:\n", " if n in edge:\n", " if m not in edge[n]:\n", " edge[n][m] = None\n", " else:\n", " edge.setdefault(n, {})[m] = None\n", " else:\n", " if k > 0:\n", " for j in range(m, m + k + 1):\n", " edge.setdefault(n, {})[j] = k\n", " elif k < 0:\n", " for i in range(n, n - k + 1):\n", " edge.setdefault(i, {})[m] = k\n", " else:\n", " edge.setdefault(n, {})[m] = 0\n", " edges[(v, w)] = edge" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Running\n", "\n", "Here we run the mapping between `3` and `4`.\n", "\n", "## 3 => 4\n", "\n", "Here are the special cases for this conversion."
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"3\", \"4\"): {\n", " \"CXH[\": (\"ok\", \"XWH[\"),\n", " \"MQYT/\": (\"split\", 1),\n", " 28730: (\"ok\", None),\n", " 121812: (\"ok\", None),\n", " 174515: (\"ok\", None),\n", " 201089: (\"ok\", None),\n", " 218383: (\"split\", 2),\n", " 221436: (\"ok\", None),\n", " 247730: (\"ok\", None),\n", " 272883: (\"collapse\", 1),\n", " 353611: (\"ok\", None),\n", " },\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "scrolled": true, "tags": [] }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 18:2 ==> slot 7840 ==> 7840\n", " ╋MN╋PTX╋H╋>HL┫┣W┫┣▶CXH◀┫┣>RY┫┣W┫┣>MR┫┣>DNJ┫┣>M╋\n", " ╋MN╋PTX╋H╋>HL┫┣W┫┣▶XWH◀┫┣>RY┫┣W┫┣>MR┫┣>DNJ┫┣>M╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 19:1 ==> slot 8447 ==> 8447\n", " ╋W┫┣QWM┫┣L╋QR>┫┣W┫┣▶CXH◀┫┣>P┫┣>RY┫┣W┫┣>MR┫┣HNH╋\n", " ╋W┫┣QWM┫┣L╋QR>┫┣W┫┣▶XWH◀┫┣>P┫┣>RY┫┣W┫┣>MR┫┣HNH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 21:14 ==> slot 9856 ==> 9856\n", " ╋HLK┫┣W┫┣TR_CB<◀┫┣W┫┣KLH┫┣H╋MJM┫┣MN╋\n", " ╋HLK┫┣W┫┣TR◀╋CB<┫┣W┫┣KLH┫┣H╋MJM╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 21:31 ==> slot 10174 ==> 10175\n", " ╋L╋H╋MQWM╋H╋HW>┫┣▶B>R_CB<◀┫┣KJ┫┣CM┫┣CB<┫┣CNJM┫┣W╋\n", " ╋L╋H╋MQWM╋H╋HW>┫┣▶B>R◀╋CB<┫┣KJ┫┣CM┫┣CB<┫┣CNJM╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 21:32 ==> slot 10183 ==> 10185\n", " ╋CNJM┫┣W┫┣KRT┫┣BRJT┫┣B╋▶B>R_CB<◀┫┣W┫┣QWM┫┣>BJMLK╋W╋PJKL╋\n", " ╋CNJM┫┣W┫┣KRT┫┣BRJT┫┣B╋▶B>R◀╋CB<┫┣W┫┣QWM┫┣>BJMLK╋W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 21:33 ==> slot 10200 ==> 10203\n", " ╋PLCTJ┫┣W┫┣NV<┫┣>CL┫┣B╋▶B>R_CB<◀┫┣W┫┣QR>┫┣CM┫┣B╋CM╋\n", " ╋PLCTJ┫┣W┫┣NV<┫┣>CL┫┣B╋▶B>R◀╋CB<┫┣W┫┣QR>┫┣CM┫┣B╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 22:5 ==> slot 10341 ==> 10345\n", " ╋NL┫┣W┫┣LQX╋\n", " ╋NL┫┣W┫┣LQX╋\n", "\n", "Action: 
systematic variation in lexeme\n", "\n", "Genesis 22:19 ==> slot 10641 ==> 10645\n", " ╋QWM┫┣W┫┣HLK┫┣JXDW┫┣>L╋▶B>R_CB<◀┫┣W┫┣JCB┫┣>BRHM┫┣B╋B>R_CB<╋\n", " ╋QWM┫┣W┫┣HLK┫┣JXDW┫┣>L╋▶B>R◀╋CB<┫┣W┫┣JCB┫┣>BRHM┫┣B╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 22:19 ==> slot 10646 ==> 10651\n", " ╋B>R_CB<┫┣W┫┣JCB┫┣>BRHM┫┣B╋▶B>R_CB<◀┫┣W┫┣HJH┫┣>XR╋H╋DBR╋\n", " ╋B>R┫┣W┫┣JCB┫┣>BRHM┫┣B╋▶B>R◀╋CB<┫┣W┫┣HJH┫┣>XR╋H╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 23:7 ==> slot 10830 ==> 10836\n", " ╋MWT┫┣W┫┣QWM┫┣>BRHM┫┣W┫┣▶CXH◀┫┣L╋RY┫┣L╋\n", " ╋MWT┫┣W┫┣QWM┫┣>BRHM┫┣W┫┣▶XWH◀┫┣L╋RY┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 23:12 ==> slot 10933 ==> 10939\n", " ╋NTN┫┣L┫┣QBR┫┣MWT┫┣W┫┣▶CXH◀┫┣>BRHM┫┣L╋PNH╋BRHM┫┣L╋PNH╋ slot 11604 ==> 11610\n", " ╋W┫┣QDD┫┣H╋>JC┫┣W┫┣▶CXH◀┫┣L╋JHWH┫┣W┫┣>MR┫┣BRK╋\n", " ╋W┫┣QDD┫┣H╋>JC┫┣W┫┣▶XWH◀┫┣L╋JHWH┫┣W┫┣>MR┫┣BRK╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 24:48 ==> slot 12051 ==> 12057\n", " ╋T╋\n", " ╋T╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 24:52 ==> slot 12144 ==> 12150\n", " ╋BRHM┫┣>T╋DBR┫┣W┫┣▶CXH◀┫┣>RY┫┣L╋JHWH┫┣W┫┣JY>╋\n", " ╋BRHM┫┣>T╋DBR┫┣W┫┣▶XWH◀┫┣>RY┫┣L╋JHWH┫┣W┫┣JY>╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 25:20 ==> slot 12724 ==> 12730\n", " ╋BT╋BTW>L┫┣H╋>RMJ┫┣MN╋▶PDN_>RM◀┫┣>XWT╋LBN┫┣H╋>RMJ┫┣L╋\n", " ╋BT╋BTW>L┫┣H╋>RMJ┫┣MN╋▶PDN◀╋>RM┫┣>XWT╋LBN┫┣H╋>RMJ╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 26:23 ==> slot 13405 ==> 13412\n", " ╋>RY┫┣W┫┣R_CB<◀┫┣W┫┣R>H┫┣>L┫┣JHWH┫┣B╋\n", " ╋>RY┫┣W┫┣R◀╋CB<┫┣W┫┣R>H┫┣>L┫┣JHWH╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 26:33 ==> slot 13588 ==> 13596\n", " ╋R_CB<◀┫┣R◀╋CB<┫┣ slot 14101 ==> 14110\n", " ╋W╋TJRWC┫┣M┫┣HWH┫┣GBJR┫┣L╋\n", " ╋W╋TJRWC┫┣M┫┣HWH┫┣GBJR┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 27:29 ==> slot 14109 ==> 14118\n", " ╋HWH┫┣GBJR┫┣L╋>X┫┣W┫┣▶CXH◀┫┣L┫┣BN╋>M┫┣>RR┫┣>RR╋\n", " 
╋HWH┫┣GBJR┫┣L╋>X┫┣W┫┣▶XWH◀┫┣L┫┣BN╋>M┫┣>RR┫┣>RR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 28:2 ==> slot 14510 ==> 14519\n", " ╋MN╋BT╋KNRM◀┫┣BJT╋BTW>L┫┣>B╋>M┫┣W╋\n", " ╋MN╋BT╋KNRM┫┣BJT╋BTW>L┫┣>B╋>M╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 28:5 ==> slot 14568 ==> 14578\n", " ╋JYXQ┫┣>T╋JRM◀┫┣>L╋LBN┫┣BN╋BTW>L┫┣H╋\n", " ╋JYXQ┫┣>T╋JRM┫┣>L╋LBN┫┣BN╋BTW>L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 28:6 ==> slot 14592 ==> 14603\n", " ╋>T╋JT┫┣▶PDN_>RM◀┫┣L╋LQX┫┣L┫┣MN╋CM╋\n", " ╋>T╋JT┫┣▶PDN◀╋>RM┫┣L╋LQX┫┣L┫┣MN╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 28:7 ==> slot 14623 ==> 14635\n", " ╋W╋>L╋>M┫┣W┫┣HLK┫┣▶PDN_>RM◀┫┣W┫┣R>H┫┣L╋>M┫┣W┫┣HLK┫┣▶PDN◀╋>RM┫┣W┫┣R>H┫┣ slot 14659 ==> 14672\n", " ╋>CH┫┣W┫┣JY>┫┣JR_CB<◀┫┣W┫┣HLK┫┣XRN┫┣W┫┣PG<╋\n", " ╋>CH┫┣W┫┣JY>┫┣JR◀╋CB<┫┣W┫┣HLK┫┣XRN┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 31:18 ==> slot 16687 ==> 16701\n", " ╋MQNH╋QNJN┫┣>CR┫┣RKC┫┣B╋▶PDN_>RM◀┫┣L╋BW>┫┣>L╋JYXQ┫┣>B╋\n", " ╋MQNH╋QNJN┫┣>CR┫┣RKC┫┣B╋▶PDN◀╋>RM┫┣L╋BW>┫┣>L╋JYXQ╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 33:3 ==> slot 18117 ==> 18132\n", " ╋HW>┫┣RY┫┣CB<╋P┫┣RY┫┣CB<╋P slot 18175 ==> 18190\n", " ╋CPXH┫┣HNH╋W╋JLD┫┣W┫┣▶CXH◀┫┣W┫┣NGC┫┣GM┫┣L>H╋W╋\n", " ╋CPXH┫┣HNH╋W╋JLD┫┣W┫┣▶XWH◀┫┣W┫┣NGC┫┣GM╋L>H╋W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 33:7 ==> slot 18183 ==> 18198\n", " ╋GM┫┣L>H╋W╋JLD┫┣W┫┣▶CXH◀┫┣W┫┣>XR┫┣NGC┫┣JWSP╋W╋\n", " ╋GM╋L>H╋W╋JLD┫┣W┫┣▶XWH◀┫┣W┫┣>XR┫┣NGC┫┣JWSP╋W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 33:7 ==> slot 18191 ==> 18206\n", " ╋NGC┫┣JWSP╋W╋RXL┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣MJ┫┣L┫┣KL╋\n", " ╋NGC┫┣JWSP╋W╋RXL┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣MJ┫┣L┫┣KL╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 33:18 ==> slot 18397 ==> 18412\n", " ╋>RY╋KN┫┣MN╋▶PDN_>RM◀┫┣W┫┣XNH┫┣>T╋PNH╋H╋\n", " ╋>RY╋KN┫┣MN╋▶PDN◀╋>RM┫┣W┫┣XNH┫┣>T╋PNH╋\n", "\n", "Action: split on _ into 1 extra slot\n", 
"\n", "Genesis 35:9 ==> slot 19216 ==> 19232\n", " ╋J┫┣MN╋▶PDN_>RM◀┫┣W┫┣BRK┫┣>T┫┣W┫┣>MR╋\n", " ╋J┫┣MN╋▶PDN◀╋>RM┫┣W┫┣BRK┫┣>T┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 35:26 ==> slot 19485 ==> 19502\n", " ╋JCR┫┣JLD┫┣L┫┣B╋▶PDN_>RM◀┫┣W┫┣BW>┫┣JL╋JYXQ╋\n", " ╋JCR┫┣JLD┫┣L┫┣B╋▶PDN◀╋>RM┫┣W┫┣BW>┫┣JL╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 37:7 ==> slot 20271 ==> 20289\n", " ╋W┫┣HNH┫┣SBB┫┣>LMH┫┣W┫┣▶CXH◀┫┣L╋>LMH┫┣W┫┣>MR┫┣L╋\n", " ╋W┫┣HNH┫┣SBB┫┣>LMH┫┣W┫┣▶XWH◀┫┣L╋>LMH┫┣W┫┣>MR┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 37:9 ==> slot 20323 ==> 20341\n", " ╋JRX╋W╋>XD╋L╋>B╋\n", " ╋JRX╋W╋>XD╋L╋>B╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 37:10 ==> slot 20355 ==> 20373\n", " ╋W╋>M╋W╋>X┫┣L╋▶CXH◀┫┣L┫┣>RY┫┣W┫┣QN>┫┣B╋\n", " ╋W╋>M╋W╋>X┫┣L╋▶XWH◀┫┣L┫┣>RY┫┣W┫┣QN>┫┣B╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 42:6 ==> slot 23509 ==> 23527\n", " ╋W┫┣BW>┫┣>X╋JWSP┫┣W┫┣▶CXH◀┫┣L┫┣>P┫┣>RY┫┣W┫┣R>H╋\n", " ╋W┫┣BW>┫┣>X╋JWSP┫┣W┫┣▶XWH◀┫┣L┫┣>P┫┣>RY┫┣W┫┣R>H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 43:26 ==> slot 24650 ==> 24668\n", " ╋B╋JD┫┣H╋BJT┫┣W┫┣▶CXH◀┫┣L┫┣>RY┫┣W┫┣C>L┫┣L╋\n", " ╋B╋JD┫┣H╋BJT┫┣W┫┣▶XWH◀┫┣L┫┣>RY┫┣W┫┣C>L┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 43:28 ==> slot 24682 ==> 24700\n", " ╋┫┣H╋\n", " ╋┫┣H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 46:1 ==> slot 25981 ==> 25999\n", " ╋KL┫┣>CR┫┣L┫┣W┫┣BW>┫┣▶B>R_CB<◀┫┣W┫┣ZBX┫┣ZBX┫┣L╋>LHJM╋\n", " ╋KL┫┣>CR┫┣L┫┣W┫┣BW>┫┣▶B>R◀╋CB<┫┣W┫┣ZBX┫┣ZBX┫┣L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 46:5 ==> slot 26042 ==> 26061\n", " ╋R_CB<◀┫┣W┫┣NF>┫┣BN╋JFR>L┫┣>T╋\n", " ╋R◀╋CB<┫┣W┫┣NF>┫┣BN╋JFR>L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Genesis 46:15 ==> slot 26201 ==> 26221\n", " ╋>CR┫┣JLD┫┣L╋JRM◀┫┣W┫┣>T╋DJNH┫┣BT┫┣KL╋\n", " ╋>CR┫┣JLD┫┣L╋JRM┫┣W┫┣>T╋DJNH┫┣BT╋\n", "\n", "Action: split on _ into 1 extra 
slot\n", "\n", "Genesis 47:31 ==> slot 27267 ==> 27288\n", " ╋L┫┣W┫┣CB<┫┣L┫┣W┫┣▶CXH◀┫┣JFR>L┫┣C╋H╋MVH╋\n", " ╋L┫┣W┫┣CB<┫┣L┫┣W┫┣▶XWH◀┫┣JFR>L┫┣C╋H╋MVH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 48:12 ==> slot 27501 ==> 27522\n", " ╋>T┫┣MN╋P┫┣>RY┫┣W┫┣LQX╋\n", " ╋>T┫┣MN╋P┫┣>RY┫┣W┫┣LQX╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 49:8 ==> slot 27858 ==> 27879\n", " ╋>X┫┣JD┫┣B╋JB┫┣▶CXH◀┫┣L┫┣BN╋>B┫┣GWR╋>RJH╋\n", " ╋>X┫┣JD┫┣B╋JB┫┣▶XWH◀┫┣L┫┣BN╋>B┫┣GWR╋>RJH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Genesis 50:26 ==> slot 28730 ==> 28751\n", " ╋CNH┫┣W┫┣XNV┫┣>T┫┣W┫┣▶FJM◀┫┣B╋H╋>RWN┫┣B╋MYRJM╋\n", " ╋CNH┫┣W┫┣XNV┫┣>T┫┣W┫┣▶JFM◀┫┣B╋H╋>RWN┫┣B╋MYRJM╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "Exodus 4:31 ==> slot 30778 ==> 30799\n", " ╋>T╋XR┫┣BW>┫┣MCH╋W╋\n", " ╋>T╋XR┫┣BW>┫┣MCH╋W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 11:8 ==> slot 34587 ==> 34608\n", " ╋KL╋LH┫┣>L┫┣W┫┣▶CXH◀┫┣L┫┣L╋>MR┫┣JY>┫┣>TH╋\n", " ╋KL╋LH┫┣>L┫┣W┫┣▶XWH◀┫┣L┫┣L╋>MR┫┣JY>┫┣>TH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 12:27 ==> slot 35293 ==> 35314\n", " ╋W┫┣QDD┫┣H╋ slot 38561 ==> 38582\n", " ╋MCH┫┣L╋QR>┫┣XTN┫┣W┫┣▶CXH◀┫┣W┫┣NCQ┫┣L┫┣W┫┣C>L╋\n", " ╋MCH┫┣L╋QR>┫┣XTN┫┣W┫┣▶XWH◀┫┣W┫┣NCQ┫┣L┫┣W┫┣C>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 20:5 ==> slot 39627 ==> 39648\n", " ╋TXT╋L╋H╋>RY┫┣L>┫┣▶CXH◀┫┣L┫┣W┫┣L>┫┣RY┫┣L>┫┣▶XWH◀┫┣L┫┣W┫┣L>┫┣ slot 41400 ==> 41421\n", " ╋H╋JBWSJ┫┣W┫┣KXD┫┣L>┫┣▶CXH◀┫┣L╋>LHJM┫┣W┫┣L>┫┣┫┣▶XWH◀┫┣L╋>LHJM┫┣W┫┣L>┫┣ slot 41592 ==> 41613\n", " ╋CB<┫┣MN╋ZQN╋JFR>L┫┣W┫┣▶CXH◀┫┣MN╋RXWQ┫┣W┫┣NGC┫┣MCH╋\n", " ╋CB<┫┣MN╋ZQN╋JFR>L┫┣W┫┣▶XWH◀┫┣MN╋RXWQ┫┣W┫┣NGC┫┣MCH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 32:8 ==> slot 46644 ==> 46665\n", " ╋ slot 47480 ==> 47501\n", " ╋QWM┫┣KL╋H╋JC┫┣PTX╋>HL┫┣W┫┣DBR╋\n", " ╋QWM┫┣KL╋H╋JC┫┣PTX╋>HL┫┣W┫┣DBR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 34:8 ==> slot 47918 ==> 
47939\n", " ╋MCH┫┣W┫┣QDD┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣>M┫┣N>┫┣MY>╋\n", " ╋MCH┫┣W┫┣QDD┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣>M┫┣N>┫┣MY>╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Exodus 34:14 ==> slot 48052 ==> 48073\n", " ╋>T╋>CRH┫┣KRT┫┣KJ┫┣L>┫┣▶CXH◀┫┣L╋>L╋>XR┫┣KJ┫┣JHWH╋\n", " ╋>T╋>CRH┫┣KRT┫┣KJ┫┣L>┫┣▶XWH◀┫┣L╋>L╋>XR┫┣KJ┫┣JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Leviticus 26:1 ==> slot 68138 ==> 68159\n", " ╋L>┫┣NTN┫┣B╋>RY┫┣L╋▶CXH◀┫┣NJ┫┣JHWH┫┣>LHJM╋\n", " ╋L>┫┣NTN┫┣B╋>RY┫┣L╋▶XWH◀┫┣NJ┫┣JHWH┫┣>LHJM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Numbers 22:31 ==> slot 84445 ==> 84466\n", " ╋B╋JD┫┣W┫┣QDD┫┣W┫┣▶CXH◀┫┣L╋>P┫┣W┫┣>MR┫┣>L╋\n", " ╋B╋JD┫┣W┫┣QDD┫┣W┫┣▶XWH◀┫┣L╋>P┫┣W┫┣>MR┫┣>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Numbers 25:2 ==> slot 85620 ==> 85641\n", " ╋W┫┣>KL┫┣H╋LHJM┫┣W┫┣YMD┫┣JFR>L╋\n", " ╋W┫┣>KL┫┣H╋LHJM┫┣W┫┣YMD┫┣JFR>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 4:19 ==> slot 95563 ==> 95584\n", " ╋H╋CMJM┫┣W┫┣NDX┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣CR┫┣XLQ╋\n", " ╋H╋CMJM┫┣W┫┣NDX┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣CR┫┣XLQ╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 5:9 ==> slot 96432 ==> 96453\n", " ╋TXT╋L╋H╋>RY┫┣L>┫┣▶CXH◀┫┣L┫┣W┫┣L>┫┣RY┫┣L>┫┣▶XWH◀┫┣L┫┣W┫┣L>┫┣ slot 98341 ==> 98362\n", " ╋>LHJM╋>XR┫┣W┫┣LHJM╋>XR┫┣W┫┣ slot 99903 ==> 99924\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣XRH┫┣>P╋JHWH╋\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣XRH┫┣>P╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 17:3 ==> slot 102960 ==> 102981\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣L╋H╋CMC╋\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣L╋H╋CMC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 26:10 ==> slot 107354 ==> 107375\n", " ╋L╋PNH╋JHWH┫┣>LHJM┫┣W┫┣▶CXH◀┫┣L╋PNH╋JHWH┫┣>LHJM┫┣W╋\n", " ╋L╋PNH╋JHWH┫┣>LHJM┫┣W┫┣▶XWH◀┫┣L╋PNH╋JHWH┫┣>LHJM┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 29:25 ==> slot 110064 ==> 110085\n", " 
╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣>LHJM┫┣>CR┫┣L>┫┣JD<╋\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣>LHJM┫┣>CR┫┣L>┫┣JD<╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Deuteronomy 30:17 ==> slot 110528 ==> 110549\n", " ╋L>┫┣CM<┫┣W┫┣NDX┫┣W┫┣▶CXH◀┫┣L╋>LHJM╋>XR┫┣W┫┣┫┣CM<┫┣W┫┣NDX┫┣W┫┣▶XWH◀┫┣L╋>LHJM╋>XR┫┣W┫┣ slot 115288 ==> 115309\n", " ╋JHWCW<┫┣>L╋PNH┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣L┫┣MH┫┣>DWN╋\n", " ╋JHWCW<┫┣>L╋PNH┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣L┫┣MH┫┣>DWN╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Joshua 15:22 ==> slot 121812 ==> 121833\n", " ╋W╋JGWR┫┣W┫┣QJNH╋W╋▶DJBWN◀╋W╋ slot 121847 ==> 121868\n", " ╋W╋BJT_PLV┫┣W┫┣XYR_CWR_CB<◀╋W╋BZJWTJH┫┣BR◀╋CB<╋W╋BZJWTJH┫┣B slot 123405 ==> 123427\n", " ╋W┫┣HJH┫┣L┫┣B╋NXLH┫┣▶B>R_CB<◀╋W╋CB<╋W╋MWLDH┫┣W╋\n", " ╋W┫┣HJH┫┣L┫┣B╋NXLH┫┣▶B>R◀╋CB<╋W╋CB<╋W╋MWLDH╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Joshua 19:13 ==> slot 123552 ==> 123575\n", " ╋MN╋CM┫┣┫┣RMWN┫┣H╋\n", " ╋MN╋CM┫┣╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Joshua 19:13 ==> slot 123553 ==> 123577\n", " ╋CM┫┣┫┣RMWN┫┣H┫┣T>R╋\n", " ╋CM┫┣┫┣RMWN┫┣H╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Joshua 23:7 ==> slot 126335 ==> 126360\n", " ╋W┫┣L>┫┣┫┣▶CXH◀┫┣L┫┣KJ╋>M┫┣B╋JHWH╋\n", " ╋W┫┣L>┫┣┫┣▶XWH◀┫┣L┫┣KJ╋>M┫┣B╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Joshua 23:16 ==> slot 126577 ==> 126602\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣XRH┫┣>P╋JHWH╋\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣XRH┫┣>P╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Judges 2:12 ==> slot 128424 ==> 128449\n", " ╋H╋CR┫┣SBJB┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣KT╋JHWH╋\n", " ╋H╋CR┫┣SBJB┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣KT╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Judges 2:17 ==> slot 128518 ==> 128543\n", " ╋ZNH┫┣>XR╋>LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣SWR┫┣MHR┫┣MN╋H╋\n", " ╋ZNH┫┣>XR╋>LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣SWR┫┣MHR┫┣MN╋H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Judges 2:19 ==> slot 128586 ==> 128611\n", " 
╋>XR┫┣L╋┫┣NPL┫┣MN╋MXR┫┣L╋┫┣NPL┫┣MN╋M slot 131789 ==> 131814\n", " ╋XLWM╋W╋>T╋CBR┫┣W┫┣▶CXH◀┫┣W┫┣CWB┫┣>L╋MXNH╋JFR>L╋\n", " ╋XLWM╋W╋>T╋CBR┫┣W┫┣▶XWH◀┫┣W┫┣CWB┫┣>L╋MXNH╋JFR>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Judges 20:1 ==> slot 139866 ==> 139891\n", " ╋L╋MN╋DN╋W╋R_CB<◀╋W╋>RY╋H╋GLL╋\n", " ╋L╋MN╋DN╋W╋R◀╋CB<╋W╋>RY╋H╋GL slot 141555 ==> 141581\n", " ╋ slot 141867 ==> 141893\n", " ╋CKM┫┣B╋H╋BQR┫┣W┫┣▶CXH◀┫┣L╋PNH╋JHWH┫┣W┫┣CWB╋\n", " ╋CKM┫┣B╋H╋BQR┫┣W┫┣▶XWH◀┫┣L╋PNH╋JHWH┫┣W┫┣CWB╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 1:28 ==> slot 142070 ==> 142096\n", " ╋HW>┫┣C>L┫┣L╋JHWH┫┣W┫┣▶CXH◀┫┣CM┫┣L╋JHWH┫┣W┫┣PLL╋\n", " ╋HW>┫┣C>L┫┣L╋JHWH┫┣W┫┣▶XWH◀┫┣CM┫┣L╋JHWH┫┣W┫┣PLL╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 2:36 ==> slot 142808 ==> 142834\n", " ╋JTR┫┣B╋BJT┫┣BW>┫┣L╋▶CXH◀┫┣L┫┣L╋>GWRH╋KSP╋W╋\n", " ╋JTR┫┣B╋BJT┫┣BW>┫┣L╋▶XWH◀┫┣L┫┣L╋>GWRH╋KSP╋W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 3:20 ==> slot 143208 ==> 143234\n", " ╋JFR>L┫┣MN╋DN╋W╋R_CB<◀┫┣KJ┫┣>MN┫┣CMW>L┫┣L╋NBJ>╋\n", " ╋JFR>L┫┣MN╋DN╋W╋R◀╋CB<┫┣KJ┫┣>MN┫┣CMW>L┫┣L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Samuel 8:2 ==> slot 145073 ==> 145100\n", " ╋CM╋MCNH┫┣>BJH┫┣CPV┫┣B╋▶B>R_CB<◀┫┣W┫┣L>┫┣HLK┫┣BN┫┣B╋\n", " ╋CM╋MCNH┫┣>BJH┫┣CPV┫┣B╋▶B>R◀╋CB<┫┣W┫┣L>┫┣HLK┫┣BN╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Samuel 9:21 ==> slot 145980 ==> 146008\n", " ╋C>WL┫┣W┫┣>MR┫┣H┫┣L>┫┣▶BN_JMJNJ◀┫┣>NKJ┫┣MN╋QVN╋CBV╋JFR>L╋\n", " ╋C>WL┫┣W┫┣>MR┫┣H┫┣L>┫┣▶BN◀╋JMJNJ┫┣>NKJ┫┣MN╋QVN╋CBV╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Samuel 15:25 ==> slot 150040 ==> 150069\n", " ╋XV>T┫┣W┫┣CWB┫┣MR┫┣CMW>L╋\n", " ╋XV>T┫┣W┫┣CWB┫┣MR┫┣CMW>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 15:30 ==> slot 150127 ==> 150156\n", " ╋JFR>L┫┣W┫┣CWB┫┣LHJM┫┣W┫┣CWB╋\n", " ╋JFR>L┫┣W┫┣CWB┫┣LHJM┫┣W┫┣CWB╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 15:31 ==> slot 150137 ==> 
150166\n", " ╋CWB┫┣CMW>L┫┣>XR╋C>WL┫┣W┫┣▶CXH◀┫┣C>WL┫┣L╋JHWH┫┣W┫┣>MR╋\n", " ╋CWB┫┣CMW>L┫┣>XR╋C>WL┫┣W┫┣▶XWH◀┫┣C>WL┫┣L╋JHWH┫┣W┫┣>MR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 20:41 ==> slot 154264 ==> 154293\n", " ╋NPL┫┣L╋>P┫┣>RY┫┣W┫┣▶CXH◀┫┣CLC╋PJC╋\n", " ╋NPL┫┣L╋>P┫┣>RY┫┣W┫┣▶XWH◀┫┣CLC╋PJC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 24:9 ==> slot 156141 ==> 156170\n", " ╋QDD┫┣DWD┫┣>P┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣DWD┫┣L╋C>WL╋\n", " ╋QDD┫┣DWD┫┣>P┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣DWD┫┣L╋C>WL╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 25:23 ==> slot 156977 ==> 157006\n", " ╋>P╋DWD┫┣RY┫┣W┫┣NPL┫┣P╋DWD┫┣RY┫┣W┫┣NPL┫┣ slot 157436 ==> 157465\n", " ╋L╋>CH┫┣W┫┣QWM┫┣W┫┣▶CXH◀┫┣>P┫┣>RY┫┣W┫┣>MR┫┣HNH╋\n", " ╋L╋>CH┫┣W┫┣QWM┫┣W┫┣▶XWH◀┫┣>P┫┣>RY┫┣W┫┣>MR┫┣HNH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Samuel 28:14 ==> slot 158789 ==> 158818\n", " ╋W┫┣QDD┫┣>P┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣CMW>L┫┣>L╋C>WL╋\n", " ╋W┫┣QDD┫┣>P┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣CMW>L┫┣>L╋C>WL╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 1:2 ==> slot 160475 ==> 160504\n", " ╋DWD┫┣W┫┣NPL┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣L┫┣DWD┫┣>J╋\n", " ╋DWD┫┣W┫┣NPL┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣L┫┣DWD┫┣>J╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 3:10 ==> slot 161848 ==> 161877\n", " ╋JHWDH┫┣MN╋DN╋W╋R_CB<◀┫┣W┫┣L>┫┣JKL┫┣R◀╋CB<┫┣W┫┣L>┫┣JKL┫┣ slot 165033 ==> 165063\n", " ╋W┫┣NPL┫┣MR┫┣DWD┫┣MPJBCT┫┣W╋\n", " ╋W┫┣NPL┫┣MR┫┣DWD┫┣MPJBCT┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 9:8 ==> slot 165073 ==> 165103\n", " ╋LXM┫┣MR┫┣MH┫┣MR┫┣MH┫┣ slot 166812 ==> 166842\n", " ╋W┫┣BW>┫┣BJT╋JHWH┫┣W┫┣▶CXH◀┫┣W┫┣BW>┫┣>L╋BJT┫┣W╋\n", " ╋W┫┣BW>┫┣BJT╋JHWH┫┣W┫┣▶XWH◀┫┣W┫┣BW>┫┣>L╋BJT┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 14:4 ==> slot 168053 ==> 168083\n", " ╋NPL┫┣P┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣JC<┫┣H╋MLK╋\n", " ╋NPL┫┣P┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣JC<┫┣H╋MLK╋\n", "\n", "Action: systematic 
variation in lexeme\n", "\n", "2_Samuel 14:22 ==> slot 168532 ==> 168562\n", " ╋JW>B┫┣>L╋PNH┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣BRK┫┣>T╋H╋MLK╋\n", " ╋JW>B┫┣>L╋PNH┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣BRK┫┣>T╋H╋MLK╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 14:33 ==> slot 168816 ==> 168846\n", " ╋BW>┫┣>L╋H╋MLK┫┣W┫┣▶CXH◀┫┣L┫┣P┫┣>RY┫┣L╋\n", " ╋BW>┫┣>L╋H╋MLK┫┣W┫┣▶XWH◀┫┣L┫┣P┫┣>RY┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 15:5 ==> slot 168939 ==> 168969\n", " ╋HJH┫┣B╋QRB┫┣>JC┫┣L╋▶CXH◀┫┣L┫┣W┫┣CLX┫┣>T╋JD╋\n", " ╋HJH┫┣B╋QRB┫┣>JC┫┣L╋▶XWH◀┫┣L┫┣W┫┣CLX┫┣>T╋JD╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 15:32 ==> slot 169576 ==> 169606\n", " ╋BW>┫┣C┫┣>CR┫┣▶CXH◀┫┣CM┫┣L╋>LHJM┫┣W┫┣HNH╋\n", " ╋BW>┫┣C┫┣>CR┫┣▶XWH◀┫┣CM┫┣L╋>LHJM┫┣W┫┣HNH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 16:4 ==> slot 169812 ==> 169842\n", " ╋L╋MPJBCT┫┣W┫┣>MR┫┣YJB>┫┣▶CXH◀┫┣MY>┫┣XN┫┣B╋DWN╋\n", " ╋L╋MPJBCT┫┣W┫┣>MR┫┣YJB>┫┣▶XWH◀┫┣MY>┫┣XN┫┣B╋DWN╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 17:11 ==> slot 170486 ==> 170516\n", " ╋JFR>L┫┣MN╋DN╋W╋R_CB<◀┫┣K╋H╋XWL┫┣>CR┫┣L┫┣MN╋DN╋W╋R◀╋CB<┫┣K╋H╋XWL┫┣>CR╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Samuel 18:21 ==> slot 171572 ==> 171603\n", " ╋H╋MLK┫┣>CR┫┣R>H┫┣W┫┣▶CXH◀┫┣KCJ┫┣L╋JW>B┫┣W┫┣RWY╋\n", " ╋H╋MLK┫┣>CR┫┣R>H┫┣W┫┣▶XWH◀┫┣KCJ┫┣L╋JW>B┫┣W┫┣RWY╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 18:28 ==> slot 171749 ==> 171780\n", " ╋>L╋H╋MLK┫┣CLWM┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣L╋>P╋\n", " ╋>L╋H╋MLK┫┣CLWM┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣L╋>P╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Samuel 22:27 ==> slot 174515 ==> 174546\n", " ╋GBWR╋TMJM┫┣TMM┫┣ slot 175416 ==> 175447\n", " ╋JFR>L┫┣MN╋DN╋W╋R_CB<◀┫┣W┫┣PQD┫┣>T╋H╋L┫┣MN╋DN╋W╋R◀╋CB<┫┣W┫┣PQD┫┣>T╋H╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Samuel 24:6 ==> slot 175525 ==> 175557\n", " ╋>L╋>RY╋TXTJM_XDCJ┫┣W┫┣BW>┫┣▶DN_JL╋YJDWN┫┣W╋\n", " 
╋>L╋>RY╋TXTJM_XDCJ┫┣W┫┣BW>┫┣▶DN◀╋JL╋YJDWN╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Samuel 24:7 ==> slot 175547 ==> 175580\n", " ╋W┫┣JY>┫┣>L╋NGB╋JHWDH┫┣▶B>R_CB<◀┫┣W┫┣CWV┫┣B╋KL╋H╋\n", " ╋W┫┣JY>┫┣>L╋NGB╋JHWDH┫┣▶B>R◀╋CB<┫┣W┫┣CWV┫┣B╋KL╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Samuel 24:15 ==> slot 175749 ==> 175783\n", " ╋R_CB<◀┫┣CB<╋>LP╋>JC┫┣W┫┣CLX╋\n", " ╋R◀╋CB<┫┣CB<╋>LP╋>JC┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Samuel 24:20 ==> slot 175878 ==> 175913\n", " ╋┫┣>RWNH┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣>P┫┣>RY╋\n", " ╋┫┣>RWNH┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣>P┫┣>RY╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 1:16 ==> slot 176360 ==> 176395\n", " ╋MLK┫┣W┫┣QDD┫┣BT_CB<┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣W┫┣>MR╋\n", " ╋MLK┫┣W┫┣QDD┫┣BT_CB<┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣W┫┣>MR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 1:23 ==> slot 176502 ==> 176537\n", " ╋L╋PNH╋H╋MLK┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣P╋\n", " ╋L╋PNH╋H╋MLK┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣P╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 1:31 ==> slot 176685 ==> 176720\n", " ╋QDD┫┣BT_CB<┫┣>P┫┣>RY┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣W┫┣>MR╋\n", " ╋QDD┫┣BT_CB<┫┣>P┫┣>RY┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣W┫┣>MR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 1:47 ==> slot 177085 ==> 177120\n", " ╋>T╋KS>┫┣MN╋KS>┫┣W┫┣▶CXH◀┫┣H╋MLK┫┣T╋KS>┫┣MN╋KS>┫┣W┫┣▶XWH◀┫┣H╋MLK┫┣ slot 177213 ==> 177248\n", " ╋H╋MZBX┫┣W┫┣BW>┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣CLMH┫┣W╋\n", " ╋H╋MZBX┫┣W┫┣BW>┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣CLMH┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 2:19 ==> slot 177635 ==> 177670\n", " ╋H╋MLK┫┣L╋QR>┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣JCB┫┣╋\n", " ╋H╋MLK┫┣L╋QR>┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣JCB┫┣╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 5:5 ==> slot 179399 ==> 179434\n", " ╋T>NH┫┣MN╋DN╋W╋R_CB<◀┫┣KL╋JWM╋CLMH┫┣W┫┣HJH╋\n", " ╋T>NH┫┣MN╋DN╋W╋R◀╋CB<┫┣KL╋JWM╋CLMH┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Kings 9:6 ==> slot 183762 ==> 
183798\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣KRT┫┣>T╋JFR>L╋\n", " ╋W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣KRT┫┣>T╋JFR>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 9:9 ==> slot 183852 ==> 183888\n", " ╋XZQ┫┣B╋>LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣ slot 185644 ==> 185680\n", " ╋JFR>L┫┣JCR┫┣LHJM╋YJDNJ┫┣L╋\n", " ╋JFR>L┫┣JCR┫┣LHJM╋YJDNJ┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Kings 16:31 ==> slot 189693 ==> 189729\n", " ╋T╋H╋BT╋H╋B slot 191400 ==> 191436\n", " ╋HLK┫┣>L╋NPC┫┣W┫┣BW>┫┣▶B>R_CB<◀┫┣>CR┫┣L╋JHWDH┫┣W┫┣NWX╋\n", " ╋HLK┫┣>L╋NPC┫┣W┫┣BW>┫┣▶B>R◀╋CB<┫┣>CR┫┣L╋JHWDH┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Kings 22:54 ==> slot 194696 ==> 194733\n", " ╋T╋H╋BT╋JHWH╋\n", " ╋T╋H╋BT╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 2:15 ==> slot 195567 ==> 195604\n", " ╋W┫┣BW>┫┣L╋QR>┫┣W┫┣▶CXH◀┫┣L┫┣>RY┫┣W┫┣>MR┫┣>L╋\n", " ╋W┫┣BW>┫┣L╋QR>┫┣W┫┣▶XWH◀┫┣L┫┣>RY┫┣W┫┣>MR┫┣>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 4:37 ==> slot 197205 ==> 197242\n", " ╋W┫┣NPL┫┣RY┫┣W┫┣NF>┫┣>T╋BN╋\n", " ╋W┫┣NPL┫┣RY┫┣W┫┣NF>┫┣>T╋BN╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 5:18 ==> slot 197853 ==> 197890\n", " ╋BW>┫┣>DWN┫┣BJT╋RMWN┫┣L╋▶CXH◀┫┣CM┫┣W┫┣HW>┫┣C┫┣>DWN┫┣BJT╋RMWN┫┣L╋▶XWH◀┫┣CM┫┣W┫┣HW>┫┣C slot 197861 ==> 197898\n", " ╋HW>┫┣C┫┣C slot 197865 ==> 197902\n", " ╋W┫┣CXH┫┣BJT╋RMWN┫┣B╋▶CXH◀┫┣BJT╋RMWN┫┣SLX┫┣JHWH┫┣L╋\n", " ╋W┫┣XWH┫┣BJT╋RMWN┫┣B╋▶XWH◀┫┣BJT╋RMWN┫┣SLX┫┣JHWH┫┣L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 10:2 ==> slot 201089 ==> 201126\n", " ╋>X>B┫┣L╋>MR┫┣W┫┣┫┣H╋SPR╋H╋ZH╋\n", " ╋>X>B┫┣L╋>MR┫┣W┫┣┫┣H╋SPR╋H╋ZH╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "2_Kings 12:2 ==> slot 202531 ==> 202568\n", " ╋W┫┣CM╋>M┫┣YBJH┫┣MN╋▶B>R_CB<◀┫┣W┫┣C┫┣H┫┣JCR╋\n", " ╋W┫┣CM╋>M┫┣YBJH┫┣MN╋▶B>R◀╋CB<┫┣W┫┣C┫┣H╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Kings 17:16 ==> slot 205873 ==> 205911\n", " 
╋CRH┫┣W┫┣▶CXH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", " ╋CRH┫┣W┫┣▶XWH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 17:35 ==> slot 206336 ==> 206374\n", " ╋JR>┫┣>LHJM╋>XR┫┣W┫┣L>┫┣▶CXH◀┫┣L┫┣W┫┣L>┫┣┫┣>LHJM╋>XR┫┣W┫┣L>┫┣▶XWH◀┫┣L┫┣W┫┣L>┫┣ slot 206366 ==> 206404\n", " ╋NVH┫┣>T┫┣JR>┫┣W┫┣L┫┣▶CXH◀┫┣W┫┣L┫┣ZBX┫┣W┫┣>T╋\n", " ╋NVH┫┣>T┫┣JR>┫┣W┫┣L┫┣▶XWH◀┫┣W┫┣L┫┣ZBX┫┣W┫┣>T╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 18:22 ==> slot 206992 ==> 207030\n", " ╋PNH╋H╋MZBX╋H╋ZH┫┣▶CXH◀┫┣B╋JRWCLM┫┣W┫┣ slot 208105 ==> 208143\n", " ╋B╋NJNWH┫┣W┫┣HJH┫┣HW>┫┣▶CXH◀┫┣BJT╋NSRK┫┣>LHJM┫┣W┫┣>DRMLK╋\n", " ╋B╋NJNWH┫┣W┫┣HJH┫┣HW>┫┣▶XWH◀┫┣BJT╋NSRK┫┣>LHJM┫┣W┫┣>DRMLK╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 21:3 ==> slot 208684 ==> 208722\n", " ╋X>B┫┣MLK╋JFR>L┫┣W┫┣▶CXH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", " ╋X>B┫┣MLK╋JFR>L┫┣W┫┣▶XWH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 21:21 ==> slot 209093 ==> 209131\n", " ╋GLWLJM┫┣>CR┫┣B┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣T╋JHWH╋\n", " ╋GLWLJM┫┣>CR┫┣B┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣T╋JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Kings 23:8 ==> slot 210002 ==> 210040\n", " ╋H╋KHN┫┣MN╋GB<┫┣R_CB<◀┫┣W┫┣NTY┫┣>T╋BMH╋H╋\n", " ╋H╋KHN┫┣MN╋GB<┫┣R◀╋CB<┫┣W┫┣NTY┫┣>T╋BMH╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Isaiah 2:8 ==> slot 212626 ==> 212665\n", " ╋>RY┫┣>LJL┫┣L╋MCR┫┣YB<┫┣W╋\n", " ╋>RY┫┣>LJL┫┣L╋MCR┫┣YB<┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 2:20 ==> slot 212810 ==> 212849\n", " ╋ZHB┫┣>CR┫┣CR┫┣ slot 218383 ==> 218422\n", " ╋W┫┣CB<┫┣L╋JHWH╋YB>┫┣▶MR┫┣L╋>XD┫┣B╋H╋\n", " ╋W┫┣CB<┫┣L╋JHWH╋YB>┫┣▶MR┫┣L╋>XD╋\n", "\n", "Action: split into 2 extra slots\n", "\n", "Isaiah 27:13 ==> slot 220739 ==> 220780\n", " ╋NDX┫┣B╋>RY╋MYRJM┫┣W┫┣▶CXH◀┫┣L╋JHWH┫┣B╋HR╋H╋\n", " ╋NDX┫┣B╋>RY╋MYRJM┫┣W┫┣▶XWH◀┫┣L╋JHWH┫┣B╋HR╋H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 29:9 ==> slot 221436 ==> 221477\n", " ╋H┫┣YB>┫┣┫┣ slot 223853 ==> 
223894\n", " ╋PNH╋H╋MZBX╋H╋ZH┫┣▶CXH◀┫┣W┫┣┫┣>T╋\n", " ╋PNH╋H╋MZBX╋H╋ZH┫┣▶XWH◀┫┣W┫┣┫┣>T╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 37:38 ==> slot 224931 ==> 224972\n", " ╋B╋NJNWH┫┣W┫┣HJH┫┣HW>┫┣▶CXH◀┫┣BJT╋NSRK┫┣>LHJM┫┣W┫┣>DRMLK╋\n", " ╋B╋NJNWH┫┣W┫┣HJH┫┣HW>┫┣▶XWH◀┫┣BJT╋NSRK┫┣>LHJM┫┣W┫┣>DRMLK╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 44:15 ==> slot 227568 ==> 227609\n", " ╋LXM┫┣>P┫┣PL┫┣W┫┣▶CXH◀┫┣P┫┣PL┫┣W┫┣▶XWH◀┫┣ slot 227604 ==> 227645\n", " ╋L╋PSL┫┣SGD┫┣L┫┣W┫┣▶CXH◀┫┣W┫┣PLL┫┣>L┫┣W┫┣>MR╋\n", " ╋L╋PSL┫┣SGD┫┣L┫┣W┫┣▶XWH◀┫┣W┫┣PLL┫┣>L┫┣W┫┣>MR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 45:14 ==> slot 228066 ==> 228107\n", " ╋H╋ZQJM┫┣L┫┣▶CXH◀┫┣>L┫┣PLL┫┣>K┫┣B┫┣>L╋\n", " ╋H╋ZQJM┫┣L┫┣▶XWH◀┫┣>L┫┣PLL┫┣>K┫┣B┫┣>L╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 46:6 ==> slot 228361 ==> 228402\n", " ╋W┫┣L┫┣SGD┫┣>P┫┣▶CXH◀┫┣NF>┫┣L┫┣SGD┫┣>P┫┣▶XWH◀┫┣NF>┫┣ slot 229225 ==> 229266\n", " ╋R>H┫┣W┫┣QWM┫┣FR┫┣W┫┣▶CXH◀┫┣LMCR┫┣>MN┫┣QDWC╋\n", " ╋R>H┫┣W┫┣QWM┫┣FR┫┣W┫┣▶XWH◀┫┣LMCR┫┣>MN┫┣QDWC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 49:23 ==> slot 229494 ==> 229535\n", " ╋W┫┣FRH┫┣MJNQT┫┣>P┫┣>RY┫┣▶CXH◀┫┣L┫┣W┫┣P┫┣>RY┫┣▶XWH◀┫┣L┫┣W┫┣ slot 232790 ==> 232831\n", " ╋>L┫┣CXX┫┣BN╋Y╋\n", " ╋>L┫┣CXX┫┣BN╋Y╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Isaiah 66:23 ==> slot 234914 ==> 234955\n", " ╋CBT┫┣BW>┫┣KL╋BFR┫┣L╋▶CXH◀┫┣L╋PNH┫┣>MR┫┣JHWH┫┣W╋\n", " ╋CBT┫┣BW>┫┣KL╋BFR┫┣L╋▶XWH◀┫┣L╋PNH┫┣>MR┫┣JHWH┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Jeremiah 1:16 ==> slot 235230 ==> 235271\n", " ╋QVR┫┣L╋>LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L╋MTH╋\n", " ╋QVR┫┣L╋>LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L╋MTH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Jeremiah 7:2 ==> slot 238176 ==> 238217\n", " ╋H╋CLH┫┣L╋▶CXH◀┫┣L╋JHWH┫┣KH┫┣>MR┫┣JHWH╋\n", " ╋H╋CLH┫┣L╋▶XWH◀┫┣L╋JHWH┫┣KH┫┣>MR┫┣JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Jeremiah 8:2 ==> slot 238952 ==> 238993\n", " 
╋W┫┣>CR┫┣DRC┫┣W┫┣>CR┫┣▶CXH◀┫┣L┫┣L>┫┣>SP┫┣W┫┣L>╋\n", " ╋W┫┣>CR┫┣DRC┫┣W┫┣>CR┫┣▶XWH◀┫┣L┫┣L>┫┣>SP┫┣W┫┣L>╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Jeremiah 13:10 ==> slot 241282 ==> 241323\n", " ╋>XR┫┣L╋XR┫┣L╋ slot 242826 ==> 242867\n", " ╋>LHJM╋>XR┫┣W┫┣T┫┣LHJM╋>XR┫┣W┫┣T┫┣ slot 245440 ==> 245481\n", " ╋>T╋BRJT╋JHWH┫┣>LHJM┫┣W┫┣▶CXH◀┫┣L╋>LHJM╋>XR┫┣W┫┣T╋BRJT╋JHWH┫┣>LHJM┫┣W┫┣▶XWH◀┫┣L╋>LHJM╋>XR┫┣W┫┣ slot 247079 ==> 247120\n", " ╋>XR┫┣L╋┫┣KT╋\n", " ╋>XR┫┣L╋┫┣KT╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Jeremiah 25:34 ==> slot 247730 ==> 247771\n", " ╋ML>┫┣JWM┫┣L╋VBX┫┣W┫┣▶TPWYH◀┫┣W┫┣NPL┫┣K╋KLJ╋XMDH╋\n", " ╋ML>┫┣JWM┫┣L╋VBX┫┣W┫┣▶PWY◀┫┣W┫┣NPL┫┣K╋KLJ╋XMDH╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "Jeremiah 26:2 ==> slot 247827 ==> 247868\n", " ╋┫┣L╋▶CXH◀┫┣BJT╋JHWH┫┣>T╋KL╋H╋\n", " ╋┫┣L╋▶XWH◀┫┣BJT╋JHWH┫┣>T╋KL╋H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Ezekiel 8:16 ==> slot 267952 ==> 267993\n", " ╋W┫┣PNH┫┣QDM┫┣W┫┣HMH┫┣▶CXH◀┫┣QDM┫┣L╋H╋CMC┫┣W╋\n", " ╋W┫┣PNH┫┣QDM┫┣W┫┣HMH┫┣▶XWH◀┫┣QDM┫┣L╋H╋CMC┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Ezekiel 18:14 ==> slot 272883 ==> 272924\n", " ╋W┫┣R>H┫┣W┫┣L>┫┣╋\n", " ╋W┫┣R>H┫┣W┫┣L>┫┣┫┣>KL╋\n", "\n", "Action: collapse 1 fewer slot\n", "\n", "Ezekiel 46:2 ==> slot 289063 ==> 289103\n", " ╋T╋CLM┫┣W┫┣▶CXH◀┫┣T╋CLM┫┣W┫┣▶XWH◀┫┣ slot 289079 ==> 289119\n", " ╋SGR┫┣RY┫┣PTX╋H╋\n", " ╋SGR┫┣RY┫┣PTX╋H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Ezekiel 46:9 ==> slot 289210 ==> 289250\n", " ╋BW>┫┣DRK╋C┫┣DRK╋C┫┣DRK╋C┫┣DRK╋C slot 296522 ==> 296562\n", " ╋H╋GLGL┫┣L>┫┣BW>┫┣W┫┣▶B>R_CB<◀┫┣L>┫┣┫┣BW>┫┣W┫┣▶B>R◀╋CB<┫┣L>┫┣ slot 297728 ==> 297769\n", " ╋>LHJM┫┣DN┫┣W┫┣XJ┫┣DRK╋▶B>R_CB<◀┫┣W┫┣NPL┫┣W┫┣L>┫┣QWM╋\n", " ╋>LHJM┫┣DN┫┣W┫┣XJ┫┣DRK╋▶B>R◀╋CB<┫┣W┫┣NPL┫┣W┫┣L>╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Micah 5:12 ==> slot 300758 ==> 300800\n", " ╋MYBH┫┣MN╋QRB┫┣W┫┣L>┫┣▶CXH◀┫┣┫┣▶XWH◀┫┣ slot 303109 ==> 303151\n", " ╋H╋KHN┫┣W┫┣>T┫┣H┫┣▶CXH◀┫┣╋\n", 
" ╋H╋KHN┫┣W┫┣>T╋H┫┣▶XWH◀┫┣╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Zephaniah 1:5 ==> slot 303120 ==> 303162\n", " ╋H╋CMJM┫┣W┫┣>T┫┣H┫┣▶CXH◀┫┣H┫┣CB<┫┣L╋JHWH┫┣W╋\n", " ╋H╋CMJM┫┣W┫┣>T╋H┫┣▶XWH◀┫┣H┫┣CB<┫┣L╋JHWH┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Zephaniah 2:11 ==> slot 303598 ==> 303640\n", " ╋KL╋>LHJM╋H╋>RY┫┣W┫┣▶CXH◀┫┣L┫┣>JC┫┣MN╋MQWM┫┣KL╋\n", " ╋KL╋>LHJM╋H╋>RY┫┣W┫┣▶XWH◀┫┣L┫┣>JC┫┣MN╋MQWM┫┣KL╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Zechariah 14:16 ==> slot 309251 ==> 309293\n", " ╋DJ╋CNH┫┣B╋CNH┫┣L╋▶CXH◀┫┣L╋MLK┫┣JHWH╋YB>┫┣W╋\n", " ╋DJ╋CNH┫┣B╋CNH┫┣L╋▶XWH◀┫┣L╋MLK┫┣JHWH╋YB>┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Zechariah 14:17 ==> slot 309276 ==> 309318\n", " ╋H╋>RY┫┣>L╋JRWCLM┫┣L╋▶CXH◀┫┣L╋MLK┫┣JHWH╋YB>┫┣W╋\n", " ╋H╋>RY┫┣>L╋JRWCLM┫┣L╋▶XWH◀┫┣L╋MLK┫┣JHWH╋YB>┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 5:8 ==> slot 311040 ==> 311082\n", " ╋B╋RB╋XSD┫┣BW>┫┣BJT┫┣▶CXH◀┫┣>L╋HJKL╋QDC┫┣B╋JR>H╋\n", " ╋B╋RB╋XSD┫┣BW>┫┣BJT┫┣▶XWH◀┫┣>L╋HJKL╋QDC┫┣B╋JR>H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 7:1 ==> slot 311219 ==> 311261\n", " ╋L╋JHWH┫┣LHJM┫┣B┫┣XSH┫┣JC<╋\n", " ╋L╋JHWH┫┣LHJM┫┣B┫┣XSH╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Psalms 22:28 ==> slot 313781 ==> 313824\n", " ╋JHWH┫┣KL╋>PS╋>RY┫┣W┫┣▶CXH◀┫┣L╋PNH┫┣KL╋MCPXH╋GWJ╋\n", " ╋JHWH┫┣KL╋>PS╋>RY┫┣W┫┣▶XWH◀┫┣L╋PNH┫┣KL╋MCPXH╋GWJ╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 22:30 ==> slot 313799 ==> 313842\n", " ╋B╋H╋GWJ┫┣>KL┫┣W┫┣▶CXH◀┫┣KL╋DCN╋>RY┫┣L╋PNH╋\n", " ╋B╋H╋GWJ┫┣>KL┫┣W┫┣▶XWH◀┫┣KL╋DCN╋>RY┫┣L╋PNH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 29:2 ==> slot 314650 ==> 314693\n", " ╋JHB┫┣L╋JHWH┫┣KBWD╋CM┫┣▶CXH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", " ╋JHB┫┣L╋JHWH┫┣KBWD╋CM┫┣▶XWH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 45:12 ==> slot 317948 ==> 317991\n", " 
╋JPJ┫┣KJ┫┣HW>┫┣>DWN┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣BT╋YR┫┣B╋\n", " ╋JPJ┫┣KJ┫┣HW>┫┣>DWN┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣BT╋YR┫┣B╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 66:4 ==> slot 320924 ==> 320967\n", " ╋L┫┣>JB┫┣KL╋H╋>RY┫┣▶CXH◀┫┣L┫┣W┫┣ZMR┫┣L┫┣ZMR╋\n", " ╋L┫┣>JB┫┣KL╋H╋>RY┫┣▶XWH◀┫┣L┫┣W┫┣ZMR┫┣L┫┣ZMR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 72:11 ==> slot 322344 ==> 322387\n", " ╋W╋SB>┫┣>CKR┫┣QRB┫┣W┫┣▶CXH◀┫┣L┫┣KL╋MLK┫┣KL╋GWJ╋\n", " ╋W╋SB>┫┣>CKR┫┣QRB┫┣W┫┣▶XWH◀┫┣L┫┣KL╋MLK┫┣KL╋GWJ╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 81:10 ==> slot 324624 ==> 324667\n", " ╋B┫┣>L╋ZR┫┣W┫┣L>┫┣▶CXH◀┫┣L╋>L╋NKR┫┣>NKJ┫┣JHWH╋\n", " ╋B┫┣>L╋ZR┫┣W┫┣L>┫┣▶XWH◀┫┣L╋>L╋NKR┫┣>NKJ┫┣JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 86:9 ==> slot 325294 ==> 325337\n", " ╋GWJ┫┣>CR┫┣┫┣W┫┣▶CXH◀┫┣L╋PNH┫┣>DNJ┫┣W┫┣KBD╋\n", " ╋GWJ┫┣>CR┫┣┫┣W┫┣▶XWH◀┫┣L╋PNH┫┣>DNJ┫┣W┫┣KBD╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 95:6 ==> slot 326949 ==> 326992\n", " ╋W┫┣JBCT┫┣JD┫┣JYR┫┣BW>┫┣▶CXH◀┫┣W┫┣KR<┫┣BRK┫┣L╋PNH╋\n", " ╋W┫┣JBCT┫┣JD┫┣JYR┫┣BW>┫┣▶XWH◀┫┣W┫┣KR<┫┣BRK┫┣L╋PNH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 96:9 ==> slot 327100 ==> 327143\n", " ╋MNXH┫┣W┫┣BW>┫┣L╋XYR┫┣▶CXH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", " ╋MNXH┫┣W┫┣BW>┫┣L╋XYR┫┣▶XWH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 97:7 ==> slot 327237 ==> 327280\n", " ╋H┫┣HLL┫┣B╋H╋>LJL┫┣▶CXH◀┫┣L┫┣KL╋>LHJM┫┣CM<┫┣W╋\n", " ╋H┫┣HLL┫┣B╋H╋>LJL┫┣▶XWH◀┫┣L┫┣KL╋>LHJM┫┣CM<┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 99:5 ==> slot 327443 ==> 327486\n", " ╋LHJM┫┣W┫┣▶CXH◀┫┣L╋HDM╋RGL┫┣QDWC┫┣HW>╋\n", " ╋LHJM┫┣W┫┣▶XWH◀┫┣L╋HDM╋RGL┫┣QDWC┫┣HW>╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 99:9 ==> slot 327492 ==> 327535\n", " ╋LHJM┫┣W┫┣▶CXH◀┫┣L╋HR╋QDC┫┣KJ┫┣QDWC╋\n", " ╋LHJM┫┣W┫┣▶XWH◀┫┣L╋HR╋QDC┫┣KJ┫┣QDWC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 106:19 
==> slot 329085 ==> 329128\n", " ╋T╋\n", " ╋T╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 132:7 ==> slot 333437 ==> 333480\n", " ╋FDH╋J┫┣L╋MCKN┫┣▶CXH◀┫┣L╋HDM╋RGL┫┣QWM┫┣JHWH╋\n", " ╋FDH╋J┫┣L╋MCKN┫┣▶XWH◀┫┣L╋HDM╋RGL┫┣QWM┫┣JHWH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Psalms 138:2 ==> slot 334200 ==> 334243\n", " ╋KL╋LB┫┣NGD╋>LHJM┫┣ZMR┫┣▶CXH◀┫┣>L╋HJKL╋QDC┫┣W┫┣JDH╋\n", " ╋KL╋LB┫┣NGD╋>LHJM┫┣ZMR┫┣▶XWH◀┫┣>L╋HJKL╋QDC┫┣W┫┣JDH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Job 1:20 ==> slot 336428 ==> 336471\n", " ╋R>C┫┣W┫┣NPL┫┣>RY┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣┫┣MN╋\n", " ╋R>C┫┣W┫┣NPL┫┣>RY┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣┫┣MN╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Proverbs 24:31 ==> slot 353611 ==> 353654\n", " ╋LB┫┣W┫┣HNH┫┣ slot 356377 ==> 356420\n", " ╋W┫┣NPL┫┣RY┫┣W┫┣>MR┫┣>L┫┣MDW<╋\n", " ╋W┫┣NPL┫┣RY┫┣W┫┣>MR┫┣>L┫┣MDW<╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Esther 3:2 ==> slot 366631 ==> 366674\n", " ╋C slot 366646 ==> 366689\n", " ╋MRDKJ┫┣L>┫┣KR<┫┣W┫┣L>┫┣▶CXH◀┫┣W┫┣>MR┫┣┫┣KR<┫┣W┫┣L>┫┣▶XWH◀┫┣W┫┣>MR┫┣ slot 366702 ==> 366745\n", " ╋KJ┫┣>JN┫┣MRDKJ┫┣KR<╋W╋▶CXH◀┫┣L┫┣W┫┣ML>┫┣HMN┫┣XMH╋\n", " ╋KJ┫┣>JN┫┣MRDKJ┫┣KR<╋W╋▶XWH◀┫┣L┫┣W┫┣ML>┫┣HMN┫┣XMH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Daniel 1:2 ==> slot 370037 ==> 370080\n", " ╋>T╋JHWJQJM┫┣MLK╋JHWDH┫┣W┫┣▶MQYT◀╋KLJ╋BJT╋H╋>LHJM┫┣W╋\n", " ╋>T╋JHWJQJM┫┣MLK╋JHWDH┫┣W┫┣▶MN◀╋QYT╋KLJ╋BJT╋H╋>LHJM╋\n", "\n", "Action: systematic split into 1 extra slot\n", "\n", "Daniel 1:5 ==> slot 370138 ==> 370182\n", " ╋L╋GDL┫┣CNH╋CLC┫┣W┫┣▶MQYT◀┫┣ slot 370329 ==> 370374\n", " ╋W┫┣NSH┫┣JWM╋H┫┣MR>H┫┣VWB╋\n", " ╋W┫┣NSH┫┣JWM╋H┫┣MR>H╋\n", "\n", "Action: systematic split into 1 extra slot\n", "\n", "Daniel 1:18 ==> slot 370390 ==> 370436\n", " ╋XZWN╋W╋XLWM┫┣W┫┣L╋▶MQYT◀╋H╋JWM┫┣>CR┫┣>MR┫┣H╋\n", " ╋XZWN╋W╋XLWM┫┣W┫┣L╋▶MN◀╋QYT╋H╋JWM┫┣>CR┫┣>MR╋\n", "\n", "Action: systematic split into 1 extra slot\n", "\n", "Nehemiah 7:69 ==> slot 386987 ==> 387034\n", " 
╋CB<╋M>H╋W╋C╋H╋>B┫┣NTN┫┣L╋\n", " ╋CB<╋M>H╋W╋C╋H╋>B┫┣NTN╋\n", "\n", "Action: systematic split into 1 extra slot\n", "\n", "Nehemiah 8:6 ==> slot 387279 ==> 387327\n", " ╋MP┫┣>RY┫┣W╋\n", " ╋MP┫┣>RY┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Nehemiah 9:3 ==> slot 387715 ==> 387763\n", " ╋JWM┫┣W┫┣RBJLHJM┫┣W┫┣QWM╋\n", " ╋JWM┫┣W┫┣RBJLHJM┫┣W┫┣QWM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Nehemiah 9:6 ==> slot 387815 ==> 387863\n", " ╋W┫┣YB>╋H╋CMJM┫┣L┫┣▶CXH◀┫┣>TH┫┣HW>┫┣JHWH┫┣H╋>LHJM╋\n", " ╋W┫┣YB>╋H╋CMJM┫┣L┫┣▶XWH◀┫┣>TH┫┣HW>┫┣JHWH┫┣H╋>LHJM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "Nehemiah 11:27 ==> slot 389626 ==> 389674\n", " ╋W┫┣B╋XYR_CWR_CB<◀╋W╋BT┫┣W┫┣B╋YQLG╋\n", " ╋W┫┣B╋XYR_CWR◀╋CB<╋W╋BT┫┣W┫┣B╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Nehemiah 11:30 ==> slot 389660 ==> 389709\n", " ╋W╋BT┫┣W┫┣XNH┫┣MN╋▶B>R_CB<◀┫┣╋HNM┫┣W┫┣BN╋\n", " ╋W╋BT┫┣W┫┣XNH┫┣MN╋▶B>R◀╋CB<┫┣╋HNM┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Chronicles 4:28 ==> slot 393152 ==> 393202\n", " ╋BN╋JHWDH┫┣W┫┣JCB┫┣B╋▶B>R_CB<◀┫┣W┫┣MWLDH╋W╋XYR_CWR◀╋CB<╋W╋MWLDH╋W╋XYR_CW slot 399894 ==> 399945\n", " ╋MNXH┫┣W┫┣BW>┫┣L╋PNH┫┣▶CXH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", " ╋MNXH┫┣W┫┣BW>┫┣L╋PNH┫┣▶XWH◀┫┣L╋JHWH┫┣B╋HDRH╋QDC╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Chronicles 21:2 ==> slot 401751 ==> 401802\n", " ╋HLK┫┣SPR┫┣>T╋JFR>L┫┣MN╋▶B>R_CB<◀┫┣W┫┣╋\n", " ╋HLK┫┣SPR┫┣>T╋JFR>L┫┣MN╋▶B>R◀╋CB<┫┣W┫┣ slot 402212 ==> 402264\n", " ╋JY>┫┣MN╋H╋GRN┫┣W┫┣▶CXH◀┫┣L╋DWD┫┣>P┫┣>RY┫┣W╋\n", " ╋JY>┫┣MN╋H╋GRN┫┣W┫┣▶XWH◀┫┣L╋DWD┫┣>P┫┣>RY┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "1_Chronicles 27:12 ==> slot 405025 ==> 405077\n", " ╋>BJBJ slot 406515 ==> 406568\n", " ╋>LHJM╋>B┫┣W┫┣QDD┫┣W┫┣▶CXH◀┫┣L╋JHWH┫┣W┫┣L╋H╋\n", " ╋>LHJM╋>B┫┣W┫┣QDD┫┣W┫┣▶XWH◀┫┣L╋JHWH┫┣W┫┣L╋H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 7:3 ==> slot 409997 ==> 410050\n", " ╋>RY┫┣RY┫┣ slot 410427 ==> 410480\n", " 
╋W┫┣LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣NTC┫┣MN╋LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣NTC┫┣MN╋ slot 410513 ==> 410566\n", " ╋XZQ┫┣B╋>LHJM╋>XR┫┣W┫┣▶CXH◀┫┣L┫┣W┫┣LHJM╋>XR┫┣W┫┣▶XWH◀┫┣L┫┣W┫┣ slot 415696 ==> 415749\n", " ╋JY>┫┣B╋H╋R_CB<◀┫┣PRJM┫┣W┫┣CWB╋\n", " ╋JY>┫┣B╋H╋R◀╋CB<┫┣PRJM┫┣W╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Chronicles 20:18 ==> slot 416304 ==> 416358\n", " ╋NPL┫┣L╋PNH╋JHWH┫┣L╋▶CXH◀┫┣L╋JHWH┫┣W┫┣QWM┫┣H╋\n", " ╋NPL┫┣L╋PNH╋JHWH┫┣L╋▶XWH◀┫┣L╋JHWH┫┣W┫┣QWM┫┣H╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 24:1 ==> slot 418154 ==> 418208\n", " ╋W┫┣CM╋>M┫┣YBJH┫┣MN╋▶B>R_CB<◀┫┣W┫┣C┫┣H┫┣JCR╋\n", " ╋W┫┣CM╋>M┫┣YBJH┫┣MN╋▶B>R◀╋CB<┫┣W┫┣C┫┣H╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "2_Chronicles 24:17 ==> slot 418544 ==> 418599\n", " ╋JHWJD<┫┣BW>┫┣FR╋JHWDH┫┣W┫┣▶CXH◀┫┣L╋H╋MLK┫┣>Z┫┣CM<╋\n", " ╋JHWJD<┫┣BW>┫┣FR╋JHWDH┫┣W┫┣▶XWH◀┫┣L╋H╋MLK┫┣>Z┫┣CM<╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 25:14 ==> slot 419155 ==> 419210\n", " ╋L╋>LHJM┫┣W┫┣L╋PNH┫┣▶CXH◀┫┣W┫┣L┫┣QVR┫┣W┫┣XRH╋\n", " ╋L╋>LHJM┫┣W┫┣L╋PNH┫┣▶XWH◀┫┣W┫┣L┫┣QVR┫┣W┫┣XRH╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 29:28 ==> slot 421534 ==> 421589\n", " ╋JFR>L┫┣W┫┣KL╋H╋QHL┫┣▶CXH◀┫┣W┫┣H╋CJR┫┣CJR┫┣W╋\n", " ╋JFR>L┫┣W┫┣KL╋H╋QHL┫┣▶XWH◀┫┣W┫┣H╋CJR┫┣CJR┫┣W╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 29:29 ==> slot 421564 ==> 421619\n", " ╋KL┫┣H┫┣MY>┫┣>T┫┣W┫┣▶CXH◀┫┣W┫┣>MR┫┣JXZQJHW┫┣H╋MLK╋\n", " ╋KL┫┣H┫┣MY>┫┣>T┫┣W┫┣▶XWH◀┫┣W┫┣>MR┫┣JXZQJHW┫┣H╋MLK╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 29:30 ==> slot 421595 ==> 421650\n", " ╋L╋FMXH┫┣W┫┣QDD┫┣W┫┣▶CXH◀┫┣W┫┣MR╋\n", " ╋L╋FMXH┫┣W┫┣QDD┫┣W┫┣▶XWH◀┫┣W┫┣MR╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 30:5 ==> slot 421839 ==> 421894\n", " ╋QWL┫┣B╋KL╋JFR>L┫┣MN╋▶B>R_CB<◀┫┣W┫┣╋\n", " ╋QWL┫┣B╋KL╋JFR>L┫┣MN╋▶B>R◀╋CB<┫┣W┫┣ slot 423282 ==> 423338\n", " ╋>MR┫┣L╋PNH╋MZBX╋>XD┫┣▶CXH◀┫┣W┫┣╋\n", " 
╋>MR┫┣L╋PNH╋MZBX╋>XD┫┣▶XWH◀┫┣W┫┣╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "2_Chronicles 33:3 ==> slot 423850 ==> 423906\n", " ╋BCRH┫┣W┫┣▶CXH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", " ╋BCRH┫┣W┫┣▶XWH◀┫┣L╋KL╋YB>╋H╋CMJM╋\n", "\n", "Action: systematic variation in lexeme\n", "\n", "No more differences.\n", "Found 234 points of disturbance\n" ] } ], "source": [ "doDiffs(\"3\", \"4\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Running\n", "\n", "Here we run the mapping between `4` and `4b`.\n", "The points of disturbance will be written into the output cell.\n", "\n", "## 4 => 4b\n", "\n", "Here are the special cases for this conversion." ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"4\", \"4b\"): {\n", " 214730: (\"collapse\", 3),\n", " 260028: (\"split\", 1),\n", " 289948: (\"ok\", None),\n", " 307578: (\"split\", 1),\n", " 323067: (\"ok\", None),\n", " 389774: (\"ok\", None),\n", " 407543: (\"split\", 1),\n", " 408429: (\"split\", 1),\n", " },\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 24:65 ==> slot 12369 ==> 12369\n", " ╋H╋JC╋▶HLZH◀┫┣H┫┣HLK┫┣B╋H╋FDH╋\n", " ╋H╋JC╋▶H◀╋LZH┫┣H┫┣HLK┫┣B╋H╋\n", "\n", "Action: split article off\n", "\n", "Genesis 37:19 ==> slot 20514 ==> 20515\n", " ╋>X┫┣HNH┫┣B┫┣W┫┣X┫┣HNH┫┣B┫┣W┫┣ slot 130846 ==> 130848\n", " ╋W┫┣NWX┫┣>L╋H╋SL<╋▶HLZ◀┫┣W┫┣>T╋H╋MRQ┫┣CPK╋\n", " ╋W┫┣NWX┫┣>L╋H╋SL<╋▶H◀╋LZ┫┣W┫┣>T╋H╋MRQ╋\n", "\n", "Action: split article off\n", "\n", "1_Samuel 14:1 ==> slot 148319 ==> 148322\n", " ╋MYB╋PLCTJ┫┣>CR┫┣MN╋B┫┣L>┫┣NGD╋\n", " ╋MYB╋PLCTJ┫┣>CR┫┣MN╋B┫┣L>╋\n", "\n", "Action: split article off\n", "\n", "1_Samuel 17:26 ==> slot 151331 ==> 151335\n", " ╋>CR┫┣NKH┫┣>T╋H╋PLCTJ╋▶HLZ◀┫┣W┫┣SWR┫┣XRPH┫┣MN╋CR┫┣NKH┫┣>T╋H╋PLCTJ╋▶H◀╋LZ┫┣W┫┣SWR┫┣XRPH┫┣MN╋\n", "\n", "Action: split article off\n", "\n", "1_Samuel 20:19 ==> slot 
153816 ==> 153821\n", " ╋W┫┣JCB┫┣>YL╋H╋>BN┫┣▶H>ZL◀┫┣W┫┣>NJ┫┣CLC╋H╋XY╋\n", " ╋W┫┣JCB┫┣>YL╋H╋>BN╋▶H◀╋>ZL┫┣W┫┣>NJ┫┣CLC╋H╋\n", "\n", "Action: split article off\n", "\n", "2_Kings 4:25 ==> slot 196975 ==> 196981\n", " ╋GJXZJ┫┣N┫┣L╋QR>╋\n", " ╋GJXZJ┫┣N┫┣L╋\n", "\n", "Action: split article off\n", "\n", "2_Kings 23:17 ==> slot 210326 ==> 210333\n", " ╋W┫┣>MR┫┣MH┫┣H╋YJWN╋▶HLZ◀┫┣>CR┫┣>NJ┫┣R>H┫┣W┫┣>MR╋\n", " ╋W┫┣>MR┫┣MH┫┣H╋YJWN╋▶H◀╋LZ┫┣>CR┫┣>NJ┫┣R>H┫┣W╋\n", "\n", "Action: split article off\n", "\n", "Isaiah 8:1 ==> slot 214730 ==> 214738\n", " ╋NWC┫┣L╋▶MHR◀┫┣CLL┫┣XWC┫┣BZ┫┣W┫┣NWC┫┣L╋▶MHR_CLL_XC_BZ◀┫┣W┫┣MN╋\n", "\n", "Action: collapse 3 fewer slots\n", "\n", "Jeremiah 46:20 ==> slot 260028 ==> 260033\n", " ╋NYH┫┣MN╋>JN╋JCB┫┣╋\n", " ╋NYH┫┣MN╋>JN╋JCB┫┣ slot 271124 ==> 271130\n", " ╋RBH┫┣W┫┣GDL┫┣W┫┣BW>┫┣▶B┫┣▶B◀╋ slot 283104 ==> 283111\n", " ╋MR┫┣H╋>RY╋▶HLZW◀┫┣H┫┣CMM┫┣HJH┫┣K╋GN╋\n", " ╋MR┫┣H╋>RY╋▶H◀╋LZW┫┣H┫┣CMM┫┣HJH┫┣K╋\n", "\n", "Action: split article off\n", "\n", "Ezekiel 47:13 ==> slot 289948 ==> 289956\n", " ╋TRWPH┫┣KH┫┣>MR┫┣>DNJ╋JHWH┫┣▶ZH◀┫┣GBWL┫┣>CR┫┣NXL┫┣>T╋H╋\n", " ╋TRWPH┫┣KH┫┣>MR┫┣>DNJ╋JHWH┫┣▶G>◀╋GBWL┫┣>CR┫┣NXL┫┣>T╋H╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "Zechariah 2:8 ==> slot 305480 ==> 305488\n", " ╋RWY┫┣DBR┫┣>L╋H╋NMR┫┣PRZWT┫┣JCB┫┣JRWCLM╋\n", " ╋RWY┫┣DBR┫┣>L╋H╋NMR┫┣PRZWT┫┣JCB╋\n", "\n", "Action: split article off\n", "\n", "Zechariah 9:8 ==> slot 307578 ==> 307587\n", " ╋JBWSJ┫┣W┫┣XNH┫┣L╋BJT┫┣▶MYBH◀┫┣MN╋┫┣MN╋ slot 323067 ==> 323077\n", " ╋M┫┣MN╋MDBR┫┣▶RWM◀┫┣KJ┫┣>LHJM┫┣CPV┫┣ZH┫┣CPL╋\n", " ╋M┫┣MN╋MDBR╋▶HR◀┫┣KJ┫┣>LHJM┫┣CPV┫┣ZH┫┣CPL╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "Daniel 8:16 ==> slot 375529 ==> 375539\n", " ╋W┫┣>MR┫┣GBRJ>L┫┣BJN┫┣L╋▶HLZ◀┫┣>T╋H╋MR>H┫┣W┫┣BW>╋\n", " ╋W┫┣>MR┫┣GBRJ>L┫┣BJN┫┣L╋▶H◀╋LZ┫┣>T╋H╋MR>H┫┣W╋\n", "\n", "Action: split article off\n", "\n", "Nehemiah 12:4 ==> slot 389774 ==> 389785\n", " ╋XVWC┫┣CKNJH╋RXWM╋MRMWT┫┣╋▶GNTWN◀╋>BJH┫┣MJMJN╋M╋▶GNTWJ◀╋>BJH┫┣MJMJN╋M slot 407543 ==> 407554\n", " 
╋>JC╋XKM┫┣JD<┫┣BJNH┫┣L╋▶XWRM_>BJ◀┫┣BN╋>CH┫┣MN╋BT╋DN╋\n", " ╋>JC╋XKM┫┣JD<┫┣BJNH┫┣L╋▶XWRM◀┫┣>B┫┣BN╋>CH┫┣MN╋BT╋\n", "\n", "Action: split into 1 extra slot\n", "\n", "2_Chronicles 4:16 ==> slot 408429 ==> 408441\n", " ╋W┫┣>T╋KL╋KLJ┫┣BJ◀┫┣L╋H╋MLK┫┣CLMH┫┣L╋\n", " ╋W╋>T╋KL╋KLJ┫┣B┫┣L╋H╋MLK┫┣CLMH╋\n", "\n", "Action: split into 1 extra slot\n", "\n", "No more differences.\n", "Found 20 points of disturbance\n" ] } ], "source": [ "doDiffs(\"4\", \"4b\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 4b => 2016\n", "\n", "We need other cases." ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"4b\", \"2016\"): {\n", " 28423: (\"split\", 2),\n", " 28455: (\"split\", 2),\n", " 91193: (\"split\", 1),\n", " 91197: (\"split\", 1),\n", " 122218: (\"split\", 1),\n", " 122247: (\"split\", 1),\n", " 123160: (\"split\", 1),\n", " 184086: (\"split\", 1),\n", " 394186: (\"collapse\", 1),\n", " 395150: (\"ok\", None),\n", " 395190: (\"ok\", None),\n", " 401036: (\"split\", 2),\n", " 404503: (\"ok\", None),\n", " 419138: (\"split\", 2),\n", " },\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "scrolled": true }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 50:10 ==> slot 28423 ==> 28423\n", " ╋KBD╋M>D┫┣W┫┣BW>┫┣VD◀┫┣>CR┫┣B╋D┫┣W┫┣BW>┫┣VD┫┣>CR┫┣B╋ slot 28455 ==> 28457\n", " ╋KNT╋H╋>BL┫┣B╋▶GRN_>VD◀┫┣W┫┣>MR┫┣>BL╋KBD┫┣ZH╋\n", " ╋KNT╋H╋>BL┫┣B╋▶GRN◀╋H╋>VD┫┣W┫┣>MR┫┣>BL╋\n", "\n", "Action: split into 2 extra slots\n", "\n", "Numbers 33:45 ==> slot 91193 ==> 91197\n", " ╋MN╋ slot 91197 ==> 91202\n", " ╋B╋DJBWN_GD┫┣W┫┣NS<┫┣MN╋▶DJBWN_GD◀┫┣W┫┣XNH┫┣B╋ slot 122218 ==> 122224\n", " ╋GBWL╋H╋JPLVJ┫┣ slot 122247 ==> 122254\n", " ╋GBWL╋NXLH┫┣MZRX┫┣DR┫┣┫┣H╋GBWL┫┣H╋\n", " ╋GBWL╋NXLH┫┣MZRX┫┣DR┫┣┫┣H╋GBWL╋\n", "\n", "Action: split into 1 extra slot\n", "\n", "Joshua 18:13 ==> slot 123160 ==> 123168\n", " 
╋HR┫┣>CR┫┣MN╋NGB┫┣L╋▶BJT_XRWN_TXTWN◀┫┣W┫┣T>R┫┣H╋GBWL┫┣W╋\n", " ╋HR┫┣>CR┫┣MN╋NGB┫┣L╋▶BJT_XWRWN◀╋TXTWN┫┣W┫┣T>R┫┣H╋GBWL╋\n", "\n", "Action: split into 1 extra slot\n", "\n", "1_Kings 9:17 ==> slot 184086 ==> 184095\n", " ╋CLMH┫┣>T╋GZR╋W╋>T╋▶BJT_XRWN_TXTWN◀┫┣W┫┣>T╋BT╋\n", " ╋CLMH┫┣>T╋GZR╋W╋>T╋▶BJT_XWRWN◀╋TXTWN┫┣W┫┣>T╋B slot 394186 ==> 394196\n", " ╋W┫┣BN╋CMW>L┫┣H╋BKR╋▶W◀╋CNJ╋W╋>BJH┫┣BN╋MRRJ╋\n", " ╋W┫┣BN╋CMW>L┫┣H╋BKR┫┣▶WCNJ◀┫┣W┫┣>BJH┫┣BN╋MRRJ┫┣MXLJ╋\n", "\n", "Action: collapse 1 fewer slot\n", "\n", "1_Chronicles 7:12 ==> slot 395150 ==> 395159\n", " ╋YB>┫┣L╋H╋MLXMH┫┣W┫┣▶CPM◀╋W╋XPJM┫┣BN╋┫┣L╋H╋MLXMH┫┣W┫┣▶CPJM◀╋W╋XPJM┫┣BN╋ slot 395190 ==> 395199\n", " ╋>CH┫┣L╋XPJM╋W╋L╋▶CPM◀┫┣W┫┣CM╋>XWT┫┣MCH┫┣L╋XPJM╋W╋L╋▶CPJM◀┫┣W┫┣CM╋>XWT┫┣M slot 401036 ==> 401045\n", " ╋YRWJH┫┣NKH┫┣>T╋>DWM┫┣B╋▶GJ>_MLX◀┫┣CMNH╋LP┫┣W┫┣FJM╋\n", " ╋YRWJH┫┣NKH┫┣>T╋>DWM┫┣B╋▶GJ>◀╋H╋MLX┫┣CMNH╋LP╋\n", "\n", "Action: split into 2 extra slots\n", "\n", "1_Chronicles 26:16 ==> slot 404503 ==> 404514\n", " ╋BN┫┣BJT╋H╋>SP┫┣L╋▶CPM◀╋W╋L╋XSH┫┣L╋H╋\n", " ╋BN┫┣BJT╋H╋>SP┫┣L╋▶CPJM◀╋W╋L╋XSH┫┣L╋H╋\n", "\n", "Action: incidental variation in lexeme\n", "\n", "2_Chronicles 25:11 ==> slot 419138 ==> 419149\n", " ╋NHG┫┣>T╋_MLX◀┫┣W┫┣NKH┫┣>T╋BN╋FT╋◀╋H╋MLX┫┣W┫┣NKH┫┣>T╋\n", "\n", "Action: split into 2 extra slots\n", "\n", "No more differences.\n", "Found 14 points of disturbance\n" ] } ], "source": [ "doDiffs(\"4b\", \"2016\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 2016 => 2017\n", "\n", "We need other cases." 
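,
"\n",
"\n",
"Each entry in `cases` maps a slot number in the source version to an\n",
"`(action, amount)` instruction. As an illustration only (the real machinery\n",
"lives in `doDiffs` and its helpers, defined earlier in this notebook), a\n",
"hypothetical helper `buildSlotMap` could interpret such instructions like this:\n",
"\n",
"```python\n",
"def buildSlotMap(cases, nSlots):\n",
"    # hypothetical sketch: walk the slots of the source version once,\n",
"    # assigning each slot its counterpart(s) in the target version\n",
"    mapping = {}\n",
"    target = 1\n",
"    skip = 0  # how many upcoming slots still belong to a collapse\n",
"    for slot in range(1, nSlots + 1):\n",
"        if skip:\n",
"            mapping[slot] = [target - 1]  # same target as the previous slot\n",
"            skip -= 1\n",
"            continue\n",
"        (action, amount) = cases.get(slot, (\"ok\", None))\n",
"        if action == \"split\":\n",
"            # one source slot becomes 1 + amount target slots\n",
"            mapping[slot] = list(range(target, target + amount + 1))\n",
"            target += amount + 1\n",
"        elif action == \"collapse\":\n",
"            # this slot and the next `amount` slots share one target slot\n",
"            mapping[slot] = [target]\n",
"            target += 1\n",
"            skip = amount\n",
"        else:  # \"ok\": plain 1-to-1 mapping\n",
"            mapping[slot] = [target]\n",
"            target += 1\n",
"    return mapping\n",
"```\n",
"\n",
"With `{3: (\"split\", 1), 6: (\"collapse\", 2)}`, for instance, slot 3 maps to two\n",
"target slots and slots 6, 7 and 8 all map to a single one, which is the\n",
"bookkeeping that the `split` and `collapse` actions above describe.\n"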
] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"2016\", \"2017\"): {\n", " 16562: (\"split\", 1),\n", " 392485: (\"split\", 2),\n", " },\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 31:11 ==> slot 16562 ==> 16562\n", " ╋>MR┫┣>L┫┣ML>K╋H╋>LHJM┫┣▶B◀╋XLWM┫┣JMR┫┣HNH╋\n", " ╋>MR┫┣>L┫┣ML>K╋H╋>LHJM┫┣▶B◀╋H╋XLWM┫┣JMR╋\n", "\n", "Action: split into 1 extra slot\n", "\n", "1_Chronicles 2:52 ==> slot 392485 ==> 392486\n", " ╋L╋CWBL┫┣>B╋QRJT_JH┫┣▶XYJ_HMNXWT◀┫┣W┫┣MCPXH╋QRJT_JB╋QRJT_JH┫┣▶XYJ◀╋H╋MNWXH┫┣W┫┣MCPXH╋QRJT_J 2021\n", "\n", "No changes expected right now." ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"2017\", \"2021\"): {},\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 24:10 ==> slot 11325 ==> 11325\n", " ╋W┫┣QWM┫┣W┫┣HLK┫┣>L╋▶>RM_NHRJM◀┫┣>L╋L╋▶>RM◀╋NHR┫┣>L╋ slot 105981 ==> 105982\n", " ╋BLRM_NHRJM◀┫┣L╋QLL┫┣W┫┣L>┫┣>BH╋\n", " ╋BLRM◀╋NHR┫┣L╋QLL┫┣W┫┣L>╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Judges 3:8 ==> slot 128871 ==> 128873\n", " ╋MKR┫┣B╋JD╋KWCN_RCRM_NHRJM◀┫┣W┫┣L┫┣>T╋\n", " ╋MKR┫┣B╋JD╋KWCN_RCRM◀╋NHR┫┣W┫┣L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Psalms 60:2 ==> slot 320252 ==> 320255\n", " ╋L╋LMD┫┣B╋NYH┫┣>T╋▶>RM_NHRJM◀╋W╋>T╋>RM╋YWB>┫┣W╋\n", " ╋L╋LMD┫┣B╋NYH┫┣>T╋▶>RM◀╋NHR╋W╋>T╋>RM╋YWB>╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Chronicles 19:6 ==> slot 401289 ==> 401293\n", " ╋KSP┫┣L╋FKR┫┣L┫┣MN╋▶>RM_NHRJM◀╋W╋MN╋>RM_MRM◀╋NHR╋W╋MN╋>RM╋M slot 401292 ==> 401297\n", " ╋L┫┣MN╋>RM_NHRJM╋W╋MN╋▶>RM_M┫┣RKB╋W╋\n", " ╋L┫┣MN╋>RM╋W╋MN╋▶>RM◀╋M┫┣RKB╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "No more differences.\n", "Found 6 points of 
disturbance\n" ] } ], "source": [ "doDiffs(\"2017\", \"2021\")" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "# c => 2021\n", "\n", "No changes expected right now." ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "cases.update(\n", " {\n", " (\"c\", \"2021\"): {},\n", " }\n", ")" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Genesis 24:10 ==> slot 11325 ==> 11325\n", " ╋W┫┣QWM┫┣W┫┣HLK┫┣>L╋▶>RM_NHRJM◀┫┣>L╋L╋▶>RM◀╋NHR┫┣>L╋ slot 105981 ==> 105982\n", " ╋BLRM_NHRJM◀┫┣L╋QLL┫┣W┫┣L>┫┣>BH╋\n", " ╋BLRM◀╋NHR┫┣L╋QLL┫┣W┫┣L>╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Judges 3:8 ==> slot 128871 ==> 128873\n", " ╋MKR┫┣B╋JD╋KWCN_RCRM_NHRJM◀┫┣W┫┣L┫┣>T╋\n", " ╋MKR┫┣B╋JD╋KWCN_RCRM◀╋NHR┫┣W┫┣L╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "Psalms 60:2 ==> slot 320252 ==> 320255\n", " ╋L╋LMD┫┣B╋NYH┫┣>T╋▶>RM_NHRJM◀╋W╋>T╋>RM╋YWB>┫┣W╋\n", " ╋L╋LMD┫┣B╋NYH┫┣>T╋▶>RM◀╋NHR╋W╋>T╋>RM╋YWB>╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "1_Chronicles 19:6 ==> slot 401289 ==> 401293\n", " ╋KSP┫┣L╋FKR┫┣L┫┣MN╋▶>RM_NHRJM◀╋W╋MN╋>RM_MRM◀╋NHR╋W╋MN╋>RM╋M slot 401292 ==> 401297\n", " ╋L┫┣MN╋>RM_NHRJM╋W╋MN╋▶>RM_M┫┣RKB╋W╋\n", " ╋L┫┣MN╋>RM╋W╋MN╋▶>RM◀╋M┫┣RKB╋\n", "\n", "Action: split on _ into 1 extra slot\n", "\n", "No more differences.\n", "Found 6 points of disturbance\n" ] } ], "source": [ "doDiffs(\"c\", \"2021\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Clearly, the only difference between versions `c` and `2021` is\n", "that some composite words in `c` have been split in version `2021`." ] }, { "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "..............................................................................................\n", ". 
47s Make edge from slot mapping 2017 => 2021 .\n", "..............................................................................................\n" ] } ], "source": [ "edgesFromMaps()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Extending to node mappings" ] }, { "cell_type": "code", "execution_count": 16, "metadata": {}, "outputs": [], "source": [ "nodeMapping = {}\n", "diagnosis = {}" ] }, { "cell_type": "code", "execution_count": 17, "metadata": {}, "outputs": [], "source": [ "statLabels = collections.OrderedDict(\n", " b=\"unique, perfect\",\n", " d=\"multiple, one perfect\",\n", " c=\"unique, imperfect\",\n", " f=\"multiple, cleanly composed\",\n", " e=\"multiple, non-perfect\",\n", " a=\"not mapped\",\n", ")" ] }, { "cell_type": "code", "execution_count": 18, "metadata": {}, "outputs": [], "source": [ "def makeNodeMapping(nodeType, v, w, force=False):\n", " caption(2, \"Mapping {} nodes {} ==> {}\".format(nodeType, v, w), silent=SILENT)\n", " mapKey = (v, w)\n", " edge = edges[mapKey]\n", "\n", " if not force and mapKey in nodeMapping and nodeType in nodeMapping[mapKey]:\n", " mapping = nodeMapping[mapKey][nodeType]\n", " diag = diagnosis[mapKey][nodeType]\n", "\n", " else:\n", " mapping = {}\n", " diag = {}\n", " caption(\n", " 0, \"Extending slot mapping {} ==> {} for {} nodes\".format(*mapKey, nodeType),\n", " silent=SILENT\n", " )\n", " for n in api[v].F.otype.s(nodeType):\n", " slots = api[v].E.oslots.s(n)\n", " mappedSlotsTuple = reduce(\n", " lambda x, y: x + y,\n", " [tuple(edge.get(s, ())) for s in slots],\n", " (),\n", " )\n", " mappedSlots = set(mappedSlotsTuple)\n", " mappedNodes = reduce(\n", " set.union,\n", " [set(api[w].L.u(s, nodeType)) for s in mappedSlots],\n", " set(),\n", " )\n", " result = {}\n", " nMs = len(mappedNodes)\n", " if nMs == 0:\n", " diag[n] = \"a\"\n", "\n", " elif nMs >= 1:\n", " theseMSlots = {}\n", " for m in mappedNodes:\n", " mSlots = set(api[w].E.oslots.s(m))\n", " dis = len(mappedSlots | mSlots) - 
len(mappedSlots & mSlots)\n", " result[m] = dis\n", " theseMSlots[m] = mSlots\n", " mapping[n] = result\n", "\n", " # we await further case analysis before we put these counterparts of n into the edge\n", "\n", " if nMs == 1:\n", " m = list(mappedNodes)[0]\n", " dis = result[m]\n", " if dis == 0:\n", " diag[n] = \"b\"\n", " edge[n] = {\n", " m: None\n", " } # this is the most frequent case, hence an optimization: no dis value.\n", " # all other cases require the dis value to be passed on, even if 0\n", " else:\n", " diag[n] = \"c\"\n", " edge[n] = {m: dis}\n", " else:\n", " edge[n] = result\n", " dis = min(result.values())\n", " if dis == 0:\n", " diag[n] = \"d\"\n", " else:\n", " allMSlots = reduce(\n", " set.union,\n", " [set(theseMSlots[m]) for m in mappedNodes],\n", " set(),\n", " )\n", " composed = allMSlots == mappedSlots and sum(\n", " result.values()\n", " ) == len(mappedSlots) * (len(mappedNodes) - 1)\n", "\n", " if composed:\n", " diag[n] = \"f\"\n", " else:\n", " diag[n] = \"e\"\n", "\n", " diagnosis.setdefault(mapKey, {})[nodeType] = diag\n", " nodeMapping.setdefault(mapKey, {})[nodeType] = mapping\n", " caption(0, \"\\tDone\", silent=SILENT)" ] }, { "cell_type": "code", "execution_count": 19, "metadata": {}, "outputs": [], "source": [ "def exploreNodeMapping(nodeType, v, w, force=False):\n", " caption(4, \"Statistics for {} ==> {} ({})\".format(v, w, nodeType), silent=SILENT)\n", " mapKey = (v, w)\n", " diag = diagnosis[mapKey][nodeType]\n", " total = len(diag)\n", " if total == 0:\n", " return\n", "\n", " reasons = collections.Counter()\n", "\n", " for (n, dia) in diag.items():\n", " reasons[dia] += 1\n", "\n", " caption(0, "\t{:<30} : {:6.2f}% {:>7}x".format(\"TOTAL\", 100, total), silent=SILENT)\n", " for stat in statLabels:\n", " statLabel = statLabels[stat]\n", " amount = reasons[stat]\n", " if amount == 0:\n", " continue\n", " perc = 100 * amount / total\n", " caption(0, "\t{:<30} : {:6.2f}% {:>7}x".format(statLabel, perc, amount), 
silent=SILENT)" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "lines_to_next_cell": 2 }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "**********************************************************************************************\n", "* *\n", "* 53s Mapping book nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 02s Statistics for 2017 ==> 2021 (book) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 02s Mapping chapter nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 04s Statistics for 2017 ==> 2021 (chapter) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 04s Mapping lex nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 
1m 11s Statistics for 2017 ==> 2021 (lex) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 11s Mapping verse nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 12s Statistics for 2017 ==> 2021 (verse) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 12s Mapping half_verse nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 13s Statistics for 2017 ==> 2021 (half_verse) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 13s Mapping sentence nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 
1m 14s Statistics for 2017 ==> 2021 (sentence) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 15s Mapping sentence_atom nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 16s Statistics for 2017 ==> 2021 (sentence_atom) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 16s Mapping clause nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 17s Statistics for 2017 ==> 2021 (clause) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 17s Mapping clause_atom nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 
1m 18s Statistics for 2017 ==> 2021 (clause_atom) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 18s Mapping phrase nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 20s Statistics for 2017 ==> 2021 (phrase) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 20s Mapping phrase_atom nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 1m 22s Statistics for 2017 ==> 2021 (phrase_atom) .\n", "..............................................................................................\n", "\n", "**********************************************************************************************\n", "* *\n", "* 1m 22s Mapping subphrase nodes 2017 ==> 2021 *\n", "* *\n", "**********************************************************************************************\n", "\n", "..............................................................................................\n", ". 
1m 22s Statistics for 2017 ==> 2021 (subphrase) .\n", "..............................................................................................\n" ] } ], "source": [ "for (i, v) in enumerate(versions):\n", " if i == 0:\n", " continue\n", " prev = versions[i - 1]\n", " ntypes = api[v].F.otype.all\n", " # skip the last node type (the slot type): slot mappings have been made separately above\n", " for ntype in ntypes[0:-1]:\n", " makeNodeMapping(ntype, prev, v, force=False)\n", " exploreNodeMapping(ntype, prev, v)" ] }, { "cell_type": "markdown", "metadata": { "lines_to_next_cell": 2 }, "source": [ "# Writing mappings as TF edges" ] }, { "cell_type": "code", "execution_count": 21, "metadata": {}, "outputs": [], "source": [ "def writeMaps():\n", " # save each mapping as a TF edge feature in the dataset of its target version\n", " for ((v1, v2), edge) in sorted(edges.items()):\n", " fName = \"omap@{}-{}\".format(v1, v2)\n", " caption(4, \"Write edge as TF feature {}\".format(fName), silent=SILENT)\n", "\n", " edgeFeatures = {fName: edge}\n", " metaData = {\n", " fName: {\n", " \"description\": \"⚠️ Maps the nodes of version {} to {}\".format(\n", " v1, v2\n", " ),\n", " \"encoder\": \"Dirk Roorda by a semi-automatic method\",\n", " \"see\": \"https://github.com/ETCBC/bhsa/blob/master/programs/versionMappings.ipynb\",\n", " \"valueType\": \"int\",\n", " \"edgeValues\": True,\n", " }\n", " }\n", " activate(v2)\n", " TF.save(\n", " nodeFeatures={},\n", " edgeFeatures=edgeFeatures,\n", " metaData=metaData,\n", " silent=SILENT,\n", " )" ] }, { "cell_type": "code", "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "..............................................................................................\n", ". 2m 46s Write mappings as TF edges .\n", "..............................................................................................\n", "..............................................................................................\n", ". 
2m 46s Write edge as TF feature omap@2017-2021 .\n", "..............................................................................................\n", "..............................................................................................\n", ". 2m 46s Active version is now -> 2021 <- .\n", "..............................................................................................\n", " 0.00s Exporting 0 node and 1 edge and 0 config features to ~/github/etcbc/bhsa/tf/2021:\n", " | 1.76s T omap@2017-2021 to ~/github/etcbc/bhsa/tf/2021\n", " 1.76s Exported 0 node features and 1 edge features and 0 config features to ~/github/etcbc/bhsa/tf/2021\n" ] } ], "source": [ "caption(4, \"Write mappings as TF edges\", silent=SILENT)\n", "for (v1, v2) in sorted(mappings.keys()):\n", " caption(0, \"\\t {:>4} ==> {:<4}\".format(v1, v2), silent=SILENT)\n", "\n", "writeMaps()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "jupytext": { "encoding": "# -*- coding: utf-8 -*-" }, "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.4" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "state": {}, "version_major": 2, "version_minor": 0 } } }, "nbformat": 4, "nbformat_minor": 4 }