Jihwan Kim

Tracking a Power BI Project (Improve Semantic Model Size) Using Semantic Link in Fabric

Recently, I started exploring the basics of Semantic Link, and it opened up new possibilities for me to track the progress of one of my projects—specifically, one aimed at reducing the size of a Power BI semantic model in a production workspace.

Let me share what I’ve learned and how I applied it to track the project’s progress.




Understanding the Problem

In this project, my goal was to reduce the size of a semantic model, for instance, from 1 GB to 500 MB. While reducing the size is a clear objective, tracking the progress effectively over time is crucial to ensuring that the changes I implement are successful and measurable. The question I posed to myself was: how can I trace the size difference at regular intervals and document the changes in an automated and structured way?




The Role of Semantic Link

Semantic Link proved to be a valuable tool for this challenge. Using Fabric Notebook and the Semantic Link library, I was able to:

  • Extract detailed semantic model information similar to what VertiPaq Analyzer provides.

  • Save the information into a Delta table in Fabric Lakehouse with timestamps to capture snapshots of the dataset size over time.




Key Considerations

1. Importance of Tracing the Project

Tracking progress is essential for several reasons:

  • Validation of changes: Without consistent tracking, it’s challenging to confirm if optimizations are effective.

  • Documentation: Keeping a history of dataset changes helps in understanding the impact of each action.

  • Accountability: Progress reports provide transparency to stakeholders and ensure the project stays on course.


2. Automating Table Creation for Proper Tracing

Automation is critical to ensure data collection is consistent and reliable. By automating the creation of tables that log dataset size and other relevant metrics daily, I was able to eliminate manual steps, reduce errors, and maintain a robust audit trail.
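To make the idea of an append-only audit trail concrete, here is a minimal sketch of what each daily run logically does: append one timestamped row per measurement. This is a local pandas illustration only—the actual project writes to a Delta table in Fabric Lakehouse, and the column names (Dataset, Total Size MB, RunId, Timestamp) are hypothetical stand-ins for the real schema.

```python
from datetime import datetime, timezone
import pandas as pd

def log_snapshot(history: pd.DataFrame, dataset: str, size_mb: float) -> pd.DataFrame:
    """Append one timestamped size measurement, mimicking the daily Delta-table append."""
    row = pd.DataFrame(
        [{"Dataset": dataset,
          "Total Size MB": size_mb,
          "RunId": len(history) + 1,
          "Timestamp": datetime.now(timezone.utc)}]
    )
    return pd.concat([history, row], ignore_index=True)

# Simulate two daily runs: before and after an optimization pass.
history = pd.DataFrame(columns=["Dataset", "Total Size MB", "RunId", "Timestamp"])
history = log_snapshot(history, "DatasetName", 1024.0)  # day 1: ~1 GB
history = log_snapshot(history, "DatasetName", 512.0)   # day 2: after optimization
print(history[["RunId", "Total Size MB"]])
```

Because each run only appends and never overwrites, every measurement survives, which is exactly what makes the history auditable.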


3. Leveraging Fabric Notebook for Automation

Fabric Notebook was instrumental in this process. It allowed me to run Semantic Link code regularly, capture the necessary data, and export it to a structured format in Fabric Lakehouse. Here’s how it was done in my sample:





Writing the Fabric Notebook

  1. Install the Required Library: Use the %pip install semantic-link-labs command to ensure that the Semantic Link library is available in your notebook environment.

  2. Import the Library: Import the sempy_labs module to access its functions.

  3. Run the VertiPaq Analyzer Function: Use the vertipaq_analyzer function to analyze the dataset, specifying the dataset name and workspace. The export="table" parameter ensures that the results are stored as a structured table.

  4. Review the Output in Fabric Lakehouse: The result is saved in a Delta table with timestamps, providing a historical record of dataset sizes and other information.



%pip install semantic-link-labs
import sempy_labs as labs

# Analyze the semantic model and store the results as Delta tables
# in the Lakehouse attached to the notebook.
labs.vertipaq_analyzer(dataset="DatasetName", workspace="WorkspaceName", export="table")

  5. In the Lakehouse, examining one of the tables (the vertipaqanalyzer_model table) reveals the following structure and details.




  6. Running the same Fabric notebook a second time appends the new information, adding a row with an updated timestamp.



  7. In the notebook settings, I configured a scheduled run to automatically collect daily data.
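Once several timestamped rows have accumulated, measuring progress is just a comparison between the earliest and latest snapshots. The sketch below shows that calculation on made-up data; the column names are hypothetical, since the actual schema of the vertipaqanalyzer_model table may differ.

```python
import pandas as pd

# Hypothetical snapshot rows standing in for the Delta table contents.
snapshots = pd.DataFrame({
    "Timestamp": pd.to_datetime(["2024-01-01", "2024-01-08", "2024-01-15"]),
    "Total Size (MB)": [1024.0, 768.0, 512.0],
})

# Sort chronologically, then compare the first and most recent measurements.
snapshots = snapshots.sort_values("Timestamp")
first = snapshots["Total Size (MB)"].iloc[0]
latest = snapshots["Total Size (MB)"].iloc[-1]
reduction_pct = (first - latest) / first * 100

print(f"Reduced from {first:.0f} MB to {latest:.0f} MB ({reduction_pct:.1f}% smaller)")
```

In the real setup, the same query could run against the Lakehouse table and feed a small progress report for stakeholders.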





Summary: What I’ve Learned

Through this process, I discovered how the vertipaq_analyzer function in Semantic Link can be a powerful tool for tracking and optimizing semantic model size reduction projects. By automating data collection and storing it in structured Delta tables within Fabric Lakehouse, I gained a consistent, scalable way to measure progress over time.

This experience not only enhanced my ability to manage projects efficiently but also deepened my understanding of Semantic Link's capabilities. I encourage you to explore these tools to simplify and improve your data-driven workflows, making insights more actionable and accessible.




I hope this makes authoring a Fabric Notebook with Semantic Link more enjoyable.



