Jihwan Kim

Unlocking Insights from User Activity: Using "sempy_labs.admin.list_activity_events( )" in Fabric Notebooks


In this post, I'd like to share how I started to learn, write, and use the "sempy_labs.admin.list_activity_events" function in a Fabric Notebook.

This feature has been a game-changer for analyzing users' activity in the Fabric environment, and I will walk through how I implemented it.


Background

My goal was to create a monitoring_power_bi_usage report and build a semantic model that tracks and traces user activities. These insights are vital for improving reports and optimizing user experiences.

While I relied on the Data Platform team for the source data, I realized the need for direct access to streamline the process.

My solution? Creating a Lakehouse (or Warehouse) as the foundation for my Power BI report and semantic model.


What Can Help?

The Semantic Link Labs library provided the perfect tools for achieving this. It simplifies the process of accessing and analyzing user activity data, empowering me to take control of my reporting pipeline.


How I Did It

I leveraged the "sempy_labs.admin.list_activity_events" function to gather user activity data. Using Fabric Notebooks, I was able to:

  1. Automate the retrieval of daily user activity.

  2. Store the result of the function in a Lakehouse or Warehouse in Fabric.

  3. Analyze key metrics such as activities, view counts, usage patterns, and improvement opportunities.



Here’s the link to the documentation for sempy_labs.admin.list_activity_events:



Before writing the code in the Fabric Notebook, I followed Michael Kovalsky's guide to install and import the required library.

%pip install semantic-link-labs
import sempy_labs as labs
from sempy_labs import migration, directlake, admin
from sempy_labs import lakehouse as lake
from sempy_labs import report as rep
from sempy_labs.tom import connect_semantic_model
from sempy_labs.report import ReportWrapper
from sempy_labs import ConnectWarehouse
from sempy_labs import ConnectLakehouse


Then I wrote the following in the Fabric notebook.

df = labs.admin.list_activity_events(start_time='2024-12-26T00:00:00', end_time='2024-12-26T23:59:59')

display(df)

The results can be stored in the Lakehouse, and I scheduled the notebook to run daily, appending new data each day.
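Because a fresh pull is appended every day, overlapping time windows can produce duplicate rows. Here is a minimal pandas sketch of a dedup step before writing; the "Id" column as a unique event identifier is my assumption about the activity-event schema, and the actual write to the Lakehouse would go through the notebook's Spark session.

```python
import pandas as pd

# Two daily pulls whose time windows overlap; "Id" as a unique
# event identifier is an assumption, not a documented guarantee.
yesterday = pd.DataFrame({"Id": [1, 2, 3], "Activity": ["View", "Export", "View"]})
today = pd.DataFrame({"Id": [3, 4], "Activity": ["View", "Share"]})

# Keep the latest copy of each event before appending to the table.
combined = (
    pd.concat([yesterday, today], ignore_index=True)
    .drop_duplicates(subset="Id", keep="last")
    .reset_index(drop=True)
)
print(len(combined))  # 4 unique events
```

Deduplicating in the notebook keeps the Lakehouse table clean even if a scheduled run is retried or the windows are widened for safety.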



One of the key insights I discovered was the importance of the activity column in the analysis. For example, tracking how often visualizations in a specific workspace were exported by certain users allowed me to investigate the reasons behind frequent exports. From there, I could dig into the details, identify improvement points for both the report and the semantic model, and make the reports more efficient and effective for end users.
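The export analysis above boils down to a filter and a groupby. A small sketch on a toy DataFrame; the column names "Activity" and "User Id" and the value "ExportReport" are my assumptions about the activity-event output, so check them against your own results.

```python
import pandas as pd

# Toy stand-in for the result of list_activity_events.
df = pd.DataFrame({
    "User Id": ["a@x.com", "a@x.com", "b@x.com", "a@x.com"],
    "Activity": ["ExportReport", "ViewReport", "ExportReport", "ExportReport"],
})

# Who exports most often? Filter to exports, then count per user.
exports = (
    df[df["Activity"] == "ExportReport"]
    .groupby("User Id")
    .size()
    .sort_values(ascending=False)
)
print(exports)
```

A heavy exporter at the top of this list is a good starting point for a conversation: perhaps the report is missing a table or a format the user needs, which is exactly the kind of improvement point this analysis surfaces.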

Another exciting capability was scheduling the notebook to run daily. This automation ensured I always had the latest insights without manual intervention, saving time and effort.
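For the scheduled run, the hard-coded dates in the earlier example need to become "yesterday" relative to each run. A small helper I would use for that; the function name is my own, and only the timestamp format mirrors the earlier call.

```python
from datetime import date, timedelta

def daily_window(run_date: date) -> tuple[str, str]:
    """Return (start_time, end_time) strings covering the full day
    before run_date, in the ISO-8601 format used in the example above."""
    day = run_date - timedelta(days=1)
    return f"{day.isoformat()}T00:00:00", f"{day.isoformat()}T23:59:59"

# In the scheduled notebook you would pass date.today() instead.
start_time, end_time = daily_window(date(2024, 12, 27))
print(start_time, end_time)  # 2024-12-26T00:00:00 2024-12-26T23:59:59
```

The scheduled notebook can then call labs.admin.list_activity_events(start_time=start_time, end_time=end_time) without any manual date editing.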


Additional Considerations

During my testing in my personal Fabric environment, where I am the sole user, I observed that the user ID was visible, and it was clearly linked to myself. While this works seamlessly in an individual setting, it raises an important question: How does this behavior align with GDPR compliance in a corporate Fabric environment?

For organizations operating under GDPR (General Data Protection Regulation), protecting personal data is paramount. GDPR requires companies to handle user information responsibly, ensuring that personally identifiable information (PII), such as user IDs, is either anonymized or adequately secured. Companies must implement strict policies and technological safeguards to comply with these regulations.

In the context of Fabric, organizations may need to:

  1. Anonymize User Data: Replace user IDs with pseudonymized identifiers to protect privacy while enabling analysis.

  2. Restrict Access: Implement role-based access controls to ensure that only authorized personnel can view sensitive user information.

  3. Audit and Monitor Usage: Continuously audit how user data is accessed and ensure activities align with GDPR requirements.

  4. Data Retention Policies: Define clear data retention policies to delete or archive user data as per legal obligations.
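For point 1, one common approach is a salted hash: each user ID maps to a stable pseudonym, so per-user usage patterns remain analyzable without storing the real identity. A minimal sketch, assuming you apply it to the user ID column before writing to the Lakehouse; the function name and salt are hypothetical.

```python
import hashlib

def pseudonymize(user_id: str, salt: str) -> str:
    """Map a user ID to a stable pseudonym via a salted SHA-256 hash.
    The same ID always yields the same pseudonym, so activity can still
    be aggregated per user; the salt must be kept secret, otherwise the
    mapping could be recovered by hashing candidate IDs."""
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()[:16]

# Example: replace the raw ID before storing the row.
print(pseudonymize("jihwan@example.com", salt="my-secret-salt"))
```

Note that under GDPR pseudonymized data is still personal data; this reduces exposure but does not remove the need for access controls and retention policies.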


I’m curious whether the behavior I observed in my personal environment—visibility of user IDs—would translate to a corporate environment where GDPR is strictly enforced. For instance, would the Fabric implementation automatically mask or anonymize user IDs to comply with GDPR? Or would the organization rely on custom implementations within Fabric to enforce compliance?

This is an area worth exploring further to ensure that the tools and configurations align with privacy regulations while still enabling valuable insights from user activity data.



Summary

Through this learning process, I discovered how Semantic Link Labs and "sempy_labs.admin.list_activity_events( )" can unlock actionable insights into Power BI user activity. The most exciting part is the ability to take proactive steps to enhance reports based on real usage patterns. The ability to automate this process makes it not only efficient but also highly scalable.


I hope this post makes exploring the possibilities of Semantic Link Labs in Fabric more fun and encourages you to create smarter, data-driven solutions.
