{ "cells": [ { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": false }, "outputs": [], "source": [ "#hide\n", "#default_exp cli\n", "from nbdev.showdoc import show_doc" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Command line functions\n", "\n", "> Console commands added by the nbdev library" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "from nbdev.imports import *\n", "from nbdev.export import *\n", "from nbdev.sync import *\n", "from nbdev.merge import *\n", "from nbdev.export2html import *\n", "from nbdev.test import *\n", "from fastcore.script import call_parse,Param,bool_arg\n", "from subprocess import check_output,STDOUT" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`nbdev` comes with the following commands. To use any of them, you must be in one of the subfolders of your project: they will search for the `settings.ini` recursively in the parent directory but need to access it to be able to work. Their names all begin with nbdev so you can easily get a list with tab completion.\n", "- `nbdev_build_docs` builds the documentation from the notebooks\n", "- `nbdev_build_lib` builds the library from the notebooks\n", "- `nbdev_bump_version` increments version in `settings.py` by one\n", "- `nbdev_clean_nbs` removes all superfluous metadata form the notebooks, to avoid merge conflicts\n", "- `nbdev_detach` exports cell attachments to `dest` and updates references\n", "- `nbdev_diff_nbs` gives you the diff between the notebooks and the exported library\n", "- `nbdev_fix_merge` will fix merge conflicts in a notebook file\n", "- `nbdev_install_git_hooks` installs the git hooks that use the last two command automatically on each commit/merge\n", "- `nbdev_nb2md` converts a notebook to a markdown file\n", "- `nbdev_new` creates a new nbdev project\n", "- `nbdev_read_nbs` reads all notebooks to make sure none are broken\n", "- `nbdev_test_nbs` runs tests in notebooks\n", "- `nbdev_trust_nbs` trusts all notebooks (so that the HTML content is shown)\n", "- `nbdev_update_lib` propagates any change in the library back to the notebooks" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Navigating from notebooks to script and back" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_build_lib(fname:Param(\"A notebook name or glob to convert\", str)=None):\n", " \"Export notebooks matching `fname` to python modules\"\n", " write_tmpls()\n", " notebook2script(fname=fname)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the whole library is built from the notebooks in the `lib_folder` set in your `settings.ini`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_update_lib(fname:Param(\"A notebook name or glob to convert\", str)=None):\n", " \"Propagates any change in the modules matching `fname` to the notebooks that created them\"\n", " script2notebook(fname=fname)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the whole library is treated. Note that this tool is only designed for small changes such as typo or small bug fixes. You can't add new cells in notebook from the library." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_diff_nbs():\n", " \"Prints the diff between an export of the library in notebooks and the actual modules\"\n", " diff_nb_script()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Extracting tests" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "def _test_one(fname, flags=None, verbose=True):\n", " print(f\"testing {fname}\")\n", " start = time.time()\n", " try:\n", " test_nb(fname, flags=flags)\n", " return True,time.time()-start\n", " except Exception as e:\n", " if \"ZMQError\" in str(e): _test_one(item, flags=flags, verbose=verbose)\n", " if verbose: print(f'Error in {fname}:\\n{e}')\n", " return False,time.time()-start" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_test_nbs(fname:Param(\"A notebook name or glob to convert\", str)=None,\n", " flags:Param(\"Space separated list of flags\", str)=None,\n", " n_workers:Param(\"Number of workers to use\", int)=None,\n", " verbose:Param(\"Print errors along the way\", bool)=True,\n", " timing:Param(\"Timing each notebook to see the ones are slow\", bool)=False,\n", " pause:Param(\"Pause time (in secs) between notebooks to avoid race conditions\", float)=0.5):\n", " \"Test in parallel the notebooks matching `fname`, passing along `flags`\"\n", " if flags is not None: flags = flags.split(' ')\n", " if fname is None:\n", " files = [f for f in Config().nbs_path.glob('*.ipynb') if not f.name.startswith('_')]\n", " else: files = glob.glob(fname)\n", " files = [Path(f).absolute() for f in sorted(files)]\n", " if n_workers is None: n_workers = 0 if len(files)==1 else min(num_cpus(), 8)\n", " # make sure we are inside the notebook folder of the project\n", " os.chdir(Config().nbs_path)\n", " results = parallel(_test_one, files, flags=flags, verbose=verbose, n_workers=n_workers, pause=pause)\n", " passed,times = [r[0] for r in results],[r[1] for r in results]\n", " if all(passed): print(\"All tests are passing!\")\n", " else:\n", " msg = \"The following notebooks failed:\\n\"\n", " raise Exception(msg + '\\n'.join([f.name for p,f in zip(passed,files) if not p]))\n", " if timing:\n", " for i,t in sorted(enumerate(times), key=lambda o:o[1], reverse=True):\n", " print(f\"Notebook {files[i].name} took {int(t)} seconds\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the whole library is tested from the notebooks in the `lib_folder` set in your `settings.ini`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Building documentation" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The following functions complete the ones in `export2html` to fully build the documentation of your library." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "_re_index = re.compile(r'^(?:\\d*_|)index\\.ipynb$')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "def make_readme():\n", " \"Convert the index notebook to README.md\"\n", " index_fn = None\n", " for f in Config().nbs_path.glob('*.ipynb'):\n", " if _re_index.match(f.name): index_fn = f\n", " assert index_fn is not None, \"Could not locate index notebook\"\n", " print(f\"converting {index_fn} to README.md\")\n", " convert_md(index_fn, Config().config_file.parent, jekyll=False)\n", " n = Config().config_file.parent/index_fn.with_suffix('.md').name\n", " shutil.move(n, Config().config_file.parent/'README.md')\n", " if Path(Config().config_file.parent/'PRE_README.md').is_file():\n", " with open(Config().config_file.parent/'README.md', 'r') as f: readme = f.read()\n", " with open(Config().config_file.parent/'PRE_README.md', 'r') as f: pre_readme = f.read()\n", " with open(Config().config_file.parent/'README.md', 'w') as f: f.write(f'{pre_readme}\\n{readme}')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_build_docs(fname:Param(\"A notebook name or glob to convert\", str)=None,\n", " force_all:Param(\"Rebuild even notebooks that haven't changed\", bool_arg)=False,\n", " mk_readme:Param(\"Also convert the index notebook to README\", bool_arg)=True,\n", " n_workers:Param(\"Number of workers to use\", int)=None,\n", " pause:Param(\"Pause time (in secs) between notebooks to avoid race conditions\", float)=0.5):\n", " \"Build the documentation by converting notebooks mathing `fname` to html\"\n", " notebook2html(fname=fname, force_all=force_all, n_workers=n_workers, pause=pause)\n", " if fname is None: make_sidebar()\n", " if mk_readme: make_readme()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the whole documentation is build from the notebooks in the `lib_folder` set in your `settings.ini`, only converting the ones that have been modified since the their corresponding html was last touched unless you pass `force_all=True`. The index is also converted to make the README file, unless you pass along `mk_readme=False`." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_nb2md(fname:Param(\"A notebook file name to convert\", str),\n", " dest:Param(\"The destination folder\", str)='.',\n", " img_path:Param(\"Folder to export images to\")=\"\",\n", " jekyll:Param(\"To use jekyll metadata for your markdown file or not\", bool_arg)=False,):\n", " \"Convert the notebook in `fname` to a markdown file\"\n", " nb_detach_cells(fname, dest=img_path)\n", " convert_md(fname, dest, jekyll=jekyll, img_path=img_path)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_detach(path_nb:Param(\"Path to notebook\"),\n", " dest:Param(\"Destination folder\", str)=\"\",\n", " use_img:Param(\"Convert markdown images to img tags\", bool_arg)=False):\n", " \"Export cell attachments to `dest` and update references\"\n", " nb_detach_cells(path_nb, dest=dest, use_img=use_img)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Other utils" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_read_nbs(fname:Param(\"A notebook name or glob to convert\", str)=None):\n", " \"Check all notebooks matching `fname` can be opened\"\n", " files = Config().nbs_path.glob('**/*.ipynb') if fname is None else glob.glob(fname)\n", " for nb in files:\n", " try: _ = read_nb(nb)\n", " except Exception as e:\n", " print(f\"{nb} is corrupted and can't be opened.\")\n", " raise e" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the all the notebooks in `lib_folder` are checked." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_trust_nbs(fname:Param(\"A notebook name or glob to convert\", str)=None,\n", " force_all:Param(\"Trust even notebooks that haven't changed\", bool)=False):\n", " \"Trust noteboks matching `fname`\"\n", " check_fname = Config().nbs_path/\".last_checked\"\n", " last_checked = os.path.getmtime(check_fname) if check_fname.exists() else None\n", " files = Config().nbs_path.glob('**/*.ipynb') if fname is None else glob.glob(fname)\n", " for fn in files:\n", " if last_checked and not force_all:\n", " last_changed = os.path.getmtime(fn)\n", " if last_changed < last_checked: continue\n", " nb = read_nb(fn)\n", " if not NotebookNotary().check_signature(nb): NotebookNotary().sign(nb)\n", " check_fname.touch(exist_ok=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By default (`fname` left to `None`), the all the notebooks in `lib_folder` are trusted. To speed things up, only the ones touched since the last time this command was run are trusted unless you pass along `force_all=True`." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_fix_merge(fname:Param(\"A notebook filename to fix\", str),\n", " fast:Param(\"Fast fix: automatically fix the merge conflicts in outputs or metadata\", bool)=True,\n", " trust_us:Param(\"Use local outputs/metadata when fast mergning\", bool)=True):\n", " \"Fix merge conflicts in notebook `fname`\"\n", " fix_conflicts(fname, fast=fast, trust_us=trust_us)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "When you have merge conflicts after a `git pull`, the notebook file will be broken and won't open in jupyter notebook anymore. This command fixes this by changing the notebook to a proper json file again and add markdown cells to signal the conflict, you just have to open that notebook again and look for `>>>>>>>` to see those conflicts and manually fix them. The old broken file is copied with a `.ipynb.bak` extension, so is still accessible in case the merge wasn't sucessful.\n", "\n", "Moreover, if `fast=True`, conflicts in outputs and metadata will automatically be fixed by using the local version if `trust_us=True`, the remote one if `trust_us=False`. With this option, it's very likely you won't have anything to do, unless there is a real conflict." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "def bump_version(version, part=2):\n", " version = version.split('.')\n", " version[part] = str(int(version[part]) + 1)\n", " for i in range(part+1, 3): version[i] = '0'\n", " return '.'.join(version)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "test_eq(bump_version('0.1.1' ), '0.1.2')\n", "test_eq(bump_version('0.1.1', 1), '0.2.0')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_bump_version(part:Param(\"Part of version to bump\", int)=2):\n", " \"Increment version in `settings.py` by one\"\n", " cfg = Config()\n", " print(f'Old version: {cfg.version}')\n", " cfg.d['version'] = bump_version(Config().version, part)\n", " cfg.save()\n", " update_version()\n", " print(f'New version: {cfg.version}')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Git hooks" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "import subprocess" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_install_git_hooks():\n", " \"Install git hooks to clean/trust notebooks automatically\"\n", " try: path = Config().config_file.parent\n", " except: path = Path.cwd()\n", " hook_path = path/'.git'/'hooks'\n", " fn = hook_path/'post-merge'\n", " hook_path.mkdir(parents=True, exist_ok=True)\n", " #Trust notebooks after merge\n", " with open(fn, 'w') as f:\n", " f.write(\"\"\"#!/bin/bash\n", "echo \"Trusting notebooks\"\n", "nbdev_trust_nbs\n", "\"\"\"\n", " )\n", " os.chmod(fn, os.stat(fn).st_mode | stat.S_IEXEC)\n", " #Clean notebooks on commit/diff\n", " with open(path/'.gitconfig', 'w') as f:\n", " f.write(\"\"\"# Generated by nbdev_install_git_hooks\n", "#\n", "# If you need to disable this instrumentation do:\n", "#\n", "# git config --local --unset include.path\n", "#\n", "# To restore the filter\n", "#\n", "# git config --local include.path .gitconfig\n", "#\n", "# If you see notebooks not stripped, checked the 
filters are applied in .gitattributes\n", "#\n", "[filter \"clean-nbs\"]\n", " clean = nbdev_clean_nbs --read_input_stream True\n", " smudge = cat\n", " required = true\n", "[diff \"ipynb\"]\n", " textconv = nbdev_clean_nbs --disp True --fname\n", "\"\"\")\n", " cmd = \"git config --local include.path ../.gitconfig\"\n", " print(f\"Executing: {cmd}\")\n", " result = subprocess.run(cmd.split(), shell=False, check=False, stderr=subprocess.PIPE)\n", " if result.returncode == 0:\n", " print(\"Success: hooks are installed and repo's .gitconfig is now trusted\")\n", " else:\n", " print(\"Failed to trust repo's .gitconfig\")\n", " if result.stderr: print(f\"Error: {result.stderr.decode('utf-8')}\")\n", " try: nb_path = Config().nbs_path\n", " except: nb_path = Path.cwd()\n", " with open(nb_path/'.gitattributes', 'w') as f:\n", " f.write(\"\"\"**/*.ipynb filter=clean-nbs\n", "**/*.ipynb diff=ipynb\n", "\"\"\"\n", " )" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This command installs git hooks to make sure notebooks are cleaned before you commit them to GitHub and automatically trusted at each merge. To be more specific, it creates:\n", "- an executable `.git/hooks/post-merge` file that contains the command `nbdev_trust_nbs`\n", "- a `.gitconfig` file that uses `nbdev_clean_nbs` as a filter/diff on all notebook files inside your notebooks folder, and a `.gitattributes` file generated in that folder (copy this file to other folders where you might have notebooks you want cleaned as well)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Starting a new project" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "_template_git_repo = \"https://github.com/fastai/nbdev_template.git\"" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#export\n", "@call_parse\n", "def nbdev_new(name: Param(\"A directory to create the project in\", str),\n", " template_git_repo: Param(\"URL of the template repo\", str)=_template_git_repo):\n", " \"Create a new nbdev project with a given name.\"\n", "\n", " path = Path(f\"./{name}\").absolute()\n", "\n", " if path.is_dir():\n", " print(f\"Directory {path} already exists. Aborting.\")\n", " return\n", "\n", " print(f\"Creating a new nbdev project {name}.\")\n", "\n", " def rmtree_onerror(func, path, exc_info):\n", " \"Use with `shutil.rmtree` when you need to delete files/folders that might be read-only.\"\n", " os.chmod(path, stat.S_IWRITE)\n", " func(path)\n", "\n", " try:\n", " subprocess.run(['git', 'clone', f'{template_git_repo}', f'{path}'], check=True, timeout=5000)\n", " # Note: on windows, .git is created with a read-only flag\n", " shutil.rmtree(path/\".git\", onerror=rmtree_onerror)\n", " subprocess.run(\"git init\".split(), cwd=path, check=True)\n", " subprocess.run(\"git add .\".split(), cwd=path, check=True)\n", " subprocess.run(\"git commit -am \\\"Initial\\\"\".split(), cwd=path, check=True)\n", "\n", " print(f\"Created a new repo for project {name}. Please edit settings.ini and run nbdev_build_lib to get started.\")\n", " except Exception as e:\n", " print(\"An error occurred while copying the nbdev project template:\")\n", " print(e)\n", " if os.path.isdir(path):\n", " try:\n", " shutil.rmtree(path, onerror=rmtree_onerror)\n", " except Exception as e2:\n", " print(f\"An error occurred while cleaning up. 
Failed to delete {path}:\")\n", " print(e2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`nbdev_new` is a command line tool that creates a new nbdev project based on the [nbdev_template repo](https://github.com/fastai/nbdev_template). You can use a custom template by passing in `template_git_repo`. It'll initialize a new git repository and commit the new project.\n", "\n", "After you run `nbdev_new`, please edit `settings.ini` and run `nbdev_build_lib`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Add CSS for \"collapse\" components\n", "\n", "If you want to use collapsible cells in your HTML docs, you need to style the `details` tag in `customstyles.css`. `_add_collapse_css` will do this for you, if the `details` tag is not already styled." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "#exporti\n", "_details_description_css = \"\"\"\\n\n", "/*Added by nbdev add_collapse_css*/\n", "details.description[open] summary::after {\n", " content: attr(data-open);\n", "}\n", "\n", "details.description:not([open]) summary::after {\n", " content: attr(data-close);\n", "}\n", "\n", "details.description summary {\n", " text-align: right;\n", " font-size: 15px;\n", " color: #337ab7;\n", " cursor: pointer;\n", "}\n", "\n", "details + div.output_wrapper {\n", " /* show/hide code */\n", " margin-top: 25px;\n", "}\n", "\n", "div.input + details {\n", " /* show/hide output */\n", " margin-top: -25px;\n", "}\n", "/*End of section added by nbdev add_collapse_css*/\"\"\"\n", "\n", "def _add_collapse_css(doc_path=None):\n", " \"Update customstyles.css so that collapse components can be used in HTML pages\"\n", " fn = (Path(doc_path) if doc_path else Config().doc_path/'css')/'customstyles.css'\n", " with open(fn) as f:\n", " if 'details.description' in f.read():\n", " print('details.description already styled in customstyles.css, no changes made')\n", " else:\n", " with open(fn, 'a') as f: f.write(_details_description_css)\n", " print('details.description styles added to customstyles.css')" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "details.description styles added to customstyles.css\n", "details.description already styled in customstyles.css, no changes made\n" ] } ], "source": [ "#hide\n", "with open('/tmp/customstyles.css', 'w') as f:\n", " f.write('/*test file*/')\n", "_add_collapse_css('/tmp') # details.description styles added ...\n", "with open('/tmp/customstyles.css') as f:\n", " test_eq(''.join(['/*test file*/', _details_description_css]), f.read())\n", "with open('/tmp/customstyles.css', 'a') as f:\n", " f.write('\\nmore things added after')\n", "_add_collapse_css('/tmp') # details.description already styled ...\n", "with open('/tmp/customstyles.css') as f:\n", " test_eq(''.join(['/*test file*/', _details_description_css, '\\nmore things added after']), f.read())" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Export -" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Converted 00_export.ipynb.\n", "Converted 01_sync.ipynb.\n", "Converted 02_showdoc.ipynb.\n", "Converted 03_export2html.ipynb.\n", "Converted 04_test.ipynb.\n", "Converted 05_merge.ipynb.\n", "Converted 06_cli.ipynb.\n", "Converted 07_clean.ipynb.\n", "Converted 99_search.ipynb.\n", "Converted index.ipynb.\n", "Converted tutorial.ipynb.\n" ] } ], "source": [ 
"#hide\n", "from nbdev.export import notebook2script\n", "notebook2script()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "jupytext": { "split_at_heading": true }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" } }, "nbformat": 4, "nbformat_minor": 4 }