Bulk Copy Semantic Model Objects and Properties Between Models Using Semantic Link Labs

The new version of Semantic Link Labs is out (v0.9.11), and as always, Michael Kovalsky has added many new features that make working with semantic models in Fabric much easier. One new method is copy_object(), which, as the name suggests, makes copying semantic model objects from one semantic model to another a breeze. Previously, you could do this with TOM, but it now has a dedicated function, so you can skip the boilerplate. You can also use Tabular Editor (either manual copy/paste or a C# script). I have done many migrations and have always used a C# script I developed; in this case, we use Python. You can also copy between Import <-> Direct Lake models, as long as the objects/properties are supported.
copy_object():
The method is easy to use:
#%pip install semantic-link-labs --q
from sempy_labs.tom import connect_semantic_model

with connect_semantic_model(dataset="SourceDataset", workspace="SourceWorkspace") as tom:
    # copy the Sales table from the source dataset to the target dataset
    table = tom.model.Tables["Sales"]
    tom.copy_object(
        object=table,
        target_dataset="TargetDataset",
        target_workspace="TargetWorkspace"
    )
Bulk Copy Objects:
For demo purposes, I created two models, model_1 and model_2. model_1 has relationships, calculated columns, calculated tables and measures. model_2 has the two import tables without any relationships, calculated tables/columns or measures. I want to copy from model_1 to model_2. I will define the functions:
from sempy_labs.tom import connect_semantic_model
import sempy.fabric as fabric
#if workspaces are None, current workspace is used
def copy_relationships(
    source_dataset: str,
    target_dataset: str,
    source_workspace: str = None,
    target_workspace: str = None
):
    """
    Copy all relationships from the source semantic model to the target semantic model.
    """
    with connect_semantic_model(dataset=source_dataset, workspace=source_workspace) as src_tom:
        for rel in src_tom.model.Relationships:
            print(f"Copying relationship: {rel.Name}")
            src_tom.copy_object(
                object=rel,
                target_dataset=target_dataset,
                target_workspace=target_workspace
            )
    print("All relationships have been copied.")
def copy_calculated_columns(
    source_dataset: str,
    target_dataset: str,
    source_workspace: str = None,
    target_workspace: str = None
):
    """
    Copy all calculated columns from the source semantic model to the target semantic model.
    Uses sempy.fabric.list_columns to reliably identify calculated columns.
    """
    calc_cols_df = fabric.list_columns(source_dataset, workspace=source_workspace)[['Table Name', 'Column Name', 'Type']]
    calculated_columns = set(
        tuple(x) for x in calc_cols_df.query('Type == "Calculated"')[['Table Name', 'Column Name']].values
    )
    with connect_semantic_model(dataset=source_dataset, workspace=source_workspace) as src_tom:
        for table in src_tom.model.Tables:
            for column in table.Columns:
                if (table.Name, column.Name) in calculated_columns:
                    print(f"Copying calculated column: {column.Name} from table: {table.Name}")
                    src_tom.copy_object(
                        object=column,
                        target_dataset=target_dataset,
                        target_workspace=target_workspace
                    )
    print("All calculated columns have been copied.")
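As an aside, the lookup used above — building a set of (table, column) tuples from the DataFrame, then testing membership inside the TOM loop — can be sketched in plain pandas. The DataFrame below only mimics the shape of the fabric.list_columns output; the data is illustrative, not from a real model:

```python
import pandas as pd

# Mimics the shape of fabric.list_columns output (illustrative data only)
cols_df = pd.DataFrame({
    "Table Name":  ["Sales", "Sales", "Date"],
    "Column Name": ["Amount", "Margin", "Year"],
    "Type":        ["Data", "Calculated", "Calculated"],
})

# Same pattern as copy_calculated_columns: a set of (table, column) keys,
# so membership tests during the TOM loop are O(1)
calculated_columns = set(
    tuple(x) for x in cols_df.query('Type == "Calculated"')[["Table Name", "Column Name"]].values
)

print(calculated_columns)  # {('Sales', 'Margin'), ('Date', 'Year')}
```

Filtering with a precomputed key set avoids re-scanning the DataFrame for every column in every table.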
def copy_calculated_tables(
    source_dataset: str,
    target_dataset: str,
    source_workspace: str = None,
    target_workspace: str = None
):
    """
    Copy all calculated tables from the source semantic model to the target semantic model.
    """
    calculated_tables = set(
        fabric.list_tables(source_dataset, workspace=source_workspace)
        .query('Type == "Calculated Table"')['Name']
    )
    with connect_semantic_model(dataset=source_dataset, workspace=source_workspace) as src_tom:
        for table in src_tom.model.Tables:
            if table.Name in calculated_tables:
                print(f"Copying calculated table: {table.Name}")
                src_tom.copy_object(
                    object=table,
                    target_dataset=target_dataset,
                    target_workspace=target_workspace
                )
    print("All calculated tables have been copied.")
def copy_all_measures(
    source_dataset: str,
    target_dataset: str,
    source_workspace: str = None,
    target_workspace: str = None
):
    """
    Copy all measures from every table in the source semantic model to the target semantic model.
    """
    with connect_semantic_model(dataset=source_dataset, workspace=source_workspace) as src_tom:
        for table in src_tom.model.Tables:
            for measure in table.Measures:
                print(f"Copying measure '{measure.Name}' from table '{table.Name}'...")
                try:
                    src_tom.copy_object(
                        object=measure,
                        target_dataset=target_dataset,
                        target_workspace=target_workspace
                    )
                except Exception as e:
                    print(f"Error copying '{measure.Name}': {e}. Check it manually.")
                    continue
    print("All measures have been copied.")
copy_relationships(
    source_dataset="model_1",
    target_dataset="model_2",
    source_workspace="a79cbb27-3cc-d64bf25ca405",
    target_workspace="a79cbb27-3bf64bf25ca405"
)
copy_calculated_tables(
    source_dataset="model_1",
    target_dataset="model_2",
    source_workspace="a79cbb27-3cc-d64bf25ca405",
    target_workspace="a79cbb27-3bf64bf25ca405"
)
copy_calculated_columns(
    source_dataset="model_1",
    target_dataset="model_2",
    source_workspace="a79cbb27-3cc-d64bf25ca405",
    target_workspace="a79cbb27-3bf64bf25ca405"
)
copy_all_measures(
    source_dataset="model_1",
    target_dataset="model_2",
    source_workspace="a79cbb27-3cc-d64bf25ca405",
    target_workspace="a79cbb27-3bf64bf25ca405"
)
Before:
After:
When you copy tables, make sure to update the connection and refresh the target semantic model so the changes take effect and all objects are copied correctly. When you copy an object, all of its properties are copied as well (e.g., formatting, annotations). You can copy any TOM object, such as partitions, hierarchies, perspectives, and more.
You need XMLA Write enabled in the tenant and capacity settings, and Contributor+ access (or build permissions) to both semantic models. This works in any Premium/Fabric workspace.
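To sanity-check that everything arrived, you can diff object names between the two models — for example, measure names pulled from each side (e.g., via fabric.list_measures). The helper below is a minimal sketch that works on any two name lists; the sample data is hypothetical:

```python
def missing_objects(source_names, target_names):
    """Return object names present in the source model but absent from the target."""
    return sorted(set(source_names) - set(target_names))

# Hypothetical example: measure names from model_1 vs model_2
source_measures = ["Total Sales", "Margin %", "YoY Growth"]
target_measures = ["Total Sales", "Margin %"]

print(missing_objects(source_measures, target_measures))  # ['YoY Growth']
```

An empty result means every source object has a same-named counterpart in the target; it does not validate expressions or properties, only names.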
Download notebook from here.
Written by Sandeep Pawar