{ "cells": [ { "cell_type": "markdown", "id": "a024bb09", "metadata": {}, "source": [ "### Watch thie video below to see a walkthrough of the Direct Lake Migration process\n", "[](https://www.youtube.com/watch?v=gGIxMrTVyyI?t=495)\n" ] }, { "cell_type": "markdown", "id": "5c27dfd1-4fe0-4a97-92e6-ddf78889aa93", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ "### Install the latest .whl package\n", "\n", "Check [here](https://pypi.org/project/semantic-link-labs/) to see the latest version." ] }, { "cell_type": "code", "execution_count": null, "id": "d5cae9db-cef9-48a8-a351-9c5fcc99645c", "metadata": { "jupyter": { "outputs_hidden": true, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [], "source": [ "%pip install semantic-link-labs" ] }, { "cell_type": "markdown", "id": "969a29bf", "metadata": {}, "source": [ "### Import the library and set initial parameters" ] }, { "cell_type": "code", "execution_count": null, "id": "29c923f8", "metadata": {}, "outputs": [], "source": [ "import sempy_labs as labs\n", "from sempy_labs import migration, directlake\n", "import sempy_labs.report as rep\n", "\n", "dataset_name = '' #Enter the import/DQ semantic model name\n", "workspace_name = None #Enter the workspace of the import/DQ semantic model. It set to none it will use the current workspace.\n", "new_dataset_name = '' #Enter the new Direct Lake semantic model name\n", "new_dataset_workspace_name = None #Enter the workspace where the Direct Lake model will be created. If set to None it will use the current workspace.\n", "lakehouse_name = None #Enter the lakehouse to be used for the Direct Lake model. If set to None it will use the lakehouse attached to the notebook.\n", "lakehouse_workspace_name = None #Enter the lakehouse workspace. If set to None it will use the new_dataset_workspace_name." ] }, { "cell_type": "markdown", "id": "5a3fe6e8-b8aa-4447-812b-7931831e07fe", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ "### Create the [Power Query Template](https://learn.microsoft.com/power-query/power-query-template) file\n", "\n", "This encapsulates all of the semantic model's Power Query logic into a single file." ] }, { "cell_type": "code", "execution_count": null, "id": "cde43b47-4ecc-46ae-9125-9674819c7eab", "metadata": { "jupyter": { "outputs_hidden": false, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [], "source": [ "migration.create_pqt_file(dataset = dataset_name, workspace = workspace_name)" ] }, { "cell_type": "markdown", "id": "bf945d07-544c-4934-b7a6-cfdb90ca725e", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ "### Import the Power Query Template to Dataflows Gen2\n", "\n", "- Open the [OneLake file explorer](https://www.microsoft.com/download/details.aspx?id=105222) and sync your files (right click -> Sync from OneLake)\n", "\n", "- Navigate to your lakehouse. From this window, create a new Dataflows Gen2 and import the Power Query Template file from OneLake (OneLake -> Workspace -> Lakehouse -> Files...), and publish the Dataflows Gen2.\n", "\n", "