
Direct Lake vs. Import vs. Direct Lake + Import: Building Three Storage Patterns That Scale

  • Jihwan Kim
  • 3 days ago
  • 3 min read

Introduction

In this post, I'd like to share how I learned to create three distinct storage patterns for Power BI semantic models—Import‑only, Direct Lake‑only, and the new Hybrid approach that mixes both. The capability was introduced in the May 2025 Power BI Desktop release (Feature summary).

Using a small 70K‑row FactResellerSales table on a Fabric trial capacity, this post walks through how to build each pattern and what happens when each one refreshes.



1 – Setting the Stage: One Model, Three Possibilities

| Pattern | Where the Data Lives | When to Choose It |
| --- | --- | --- |
| Import‑only | All tables copied into VertiPaq. | Small‑to‑medium data, heavy calculated columns or row‑level security, predictable daily refresh window. |
| Direct Lake‑only | All tables stay in OneLake; the model reads Parquet directly. | Large datasets, need for near‑real‑time data exposure. |
| Hybrid | Fact tables remain in Direct Lake; smaller, volatile dimensions are Imported. | Snappy dimension slicers with minimal fact refresh load. |
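
The decision table above can be sketched as a toy rule of thumb. This is an illustrative Python helper, not anything Power BI ships; the three inputs and the 10‑million‑row cutoff are my own assumptions, not official guidance.

```python
def choose_storage_pattern(row_count: int,
                           needs_near_real_time: bool,
                           volatile_dimensions: bool) -> str:
    """Toy decision helper mirroring the table above (illustrative only).

    The inputs and the 10M-row cutoff are assumptions for this sketch,
    not official Microsoft guidance.
    """
    if needs_near_real_time and volatile_dimensions:
        return "Hybrid"            # fact in Direct Lake, dims Imported
    if needs_near_real_time or row_count > 10_000_000:
        return "Direct Lake-only"  # large data or near-real-time exposure
    return "Import-only"           # small-to-medium data, full DAX freedom
```

For a 70K‑row fact table with no real‑time requirement, this sketch lands on Import‑only, which is also the first pattern built in this post.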

Throughout the blog post, I use these six tables:

  • FactResellerSales (70 K rows)

  • DimDate, DimProduct, DimEmployee, DimReseller, DimSalesTerritory




2 – Step by Step: Building Each Model

2.1 Import‑Only Model

  1. Connect → Home ▶︎ OneLake catalog ▶︎ Lakehouses ▶︎ choose my Lakehouse ▶︎ Connect to SQL endpoint ▶︎ Connect.

  2. Select tables ▶︎ Transform Data ▶︎ Import ▶︎ OK.

  3. Transform in Power Query Editor ▶︎ Close & Apply.

  4. Create the data model, add measures, build visuals, and Publish to the workspace.


Sales: = 
SUM(factresellersales[SalesAmount])

Avg. sales by reseller: = 
AVERAGEX (
    SUMMARIZE ( factresellersales, dimreseller[ResellerName] ),
    [Sales:]
)

Reseller count: = 
COUNTROWS ( SUMMARIZE ( factresellersales, dimreseller[ResellerName] ) )
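
To make the three measures concrete, here is a hedged pandas sketch of what they compute, using an invented toy stand‑in for FactResellerSales already joined to DimReseller (the rows are illustrative data, not the AdventureWorks values):

```python
import pandas as pd

# Invented stand-in for FactResellerSales joined to DimReseller
fact = pd.DataFrame({
    "ResellerName": ["A", "A", "B", "B", "B", "C"],
    "SalesAmount":  [100, 200, 50, 50, 100, 300],
})

# Sales: = SUM ( factresellersales[SalesAmount] )
sales = fact["SalesAmount"].sum()

# Avg. sales by reseller: = AVERAGEX over the per-reseller sales totals
per_reseller = fact.groupby("ResellerName")["SalesAmount"].sum()
avg_sales_by_reseller = per_reseller.mean()

# Reseller count: = COUNTROWS ( SUMMARIZE ( ... ) )
reseller_count = per_reseller.shape[0]

print(sales, round(avg_sales_by_reseller, 2), reseller_count)  # 800 266.67 3
```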





  5. Check final model size with DAX Studio (see screenshot).




Tip: Import‑only unlocks every DAX feature without restriction—calculated tables, columns, and complex RLS all run inside VertiPaq.







2.2 Direct Lake‑Only Model

  1. Connect → Home ▶︎ OneLake catalog ▶︎ Lakehouses ▶︎ pick the Lakehouse ▶︎ Connect to OneLake ▶︎ Connect.

  2. Name the semantic model, select tables, and OK.

  3. Create relationships and measures.

  4. The model auto‑saves in the workspace.

Refresh reality: A Direct Lake model doesn’t move data. A quick metadata call—sometimes called a re‑frame—points the model to the newest Parquet files and finishes in milliseconds (Direct Lake overview).
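
Because a re‑frame is simply a refresh call against a Direct Lake model, it can also be triggered programmatically. The sketch below only builds the pieces of the Power BI REST "Refresh Dataset In Group" request; the IDs and token are placeholders, and authentication and actually sending the request are deliberately left out:

```python
def build_reframe_request(group_id: str, dataset_id: str, token: str) -> dict:
    """Assemble the parts of the POST that asks the service to refresh
    (and, for Direct Lake, re-frame) a semantic model.

    Sending the request is left to the caller; group_id, dataset_id,
    and token are placeholders for this sketch.
    """
    return {
        "method": "POST",
        "url": (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
                f"/datasets/{dataset_id}/refreshes"),
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        # Empty body = default refresh; on a Direct Lake model this is
        # a metadata-only re-frame, not a data copy.
        "json": {},
    }
```

You could pass the returned pieces to any HTTP client (e.g. `requests.request(**req)`); the token would come from Microsoft Entra ID.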



2.3 Hybrid Model (Fact in Direct Lake, Dims in Import)

  1. Create a Direct Lake model that contains only the fact table.

  2. The semantic model (Direct_Lake_and_Import) is created in the workspace.

  3. Open Tabular Editor 2, connect to the Import‑only model, and copy the dimension tables.

  4. Open Tabular Editor 2, connect to the Direct_Lake_and_Import model, and paste those dimensions into the model.

  5. Refresh the semantic model in the service; until the gateway connection is configured, the refresh fails with an error.

  6. Fix the gateway connection (Semantic model settings ▶︎ Gateway and cloud connections ▶︎ map SQL Server ▶︎ Create connection).

  7. When the first refresh succeeds (in my case, it took 15 minutes on trial capacity), open the model, define relationships and measures, then build visuals.
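
Since the first refresh of a hybrid model can take a while (and can fail on the gateway step), it is handy to check refresh outcomes programmatically as well. This small sketch reads a payload shaped like the Power BI REST "Get Refresh History In Group" response; the example payload is hand‑written for illustration, not captured from a real tenant:

```python
def latest_refresh_status(history: dict) -> str:
    """Return the status of the most recent entry in a refresh-history
    payload shaped like {"value": [{"status": ...}, ...]}.

    Assumes the newest entry comes first; returns 'NoRefreshes' for an
    empty history.
    """
    entries = history.get("value", [])
    return entries[0]["status"] if entries else "NoRefreshes"

# Hand-written example payload (illustrative only)
example = {"value": [{"refreshType": "ViaApi", "status": "Completed"}]}
print(latest_refresh_status(example))  # Completed
```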



Edit the semantic model


Create relationships and measures


Create visualizations in Power BI Desktop





I hope this helps you have fun choosing the right storage pattern for your next semantic model. Whether you import everything, ride the Direct Lake wave, or blend the two, good semantic models and reports come from putting each table in the right storage mode and scheduling refreshes to match how that mode works.




