Last week I shared how you can use OneLake File Explorer to sync your Lakehouse tables to your local machine. It's a convenient way to get your Parquet and Delta Lake files off the cloud and onto disk — but what do you actually do with them once they're there? In this post, I'll walk you through how to interact with your locally synced OneLake files using Python. We'll cover four practical approaches, with real code you can drop straight into a notebook.

Where are your files?

When OneLake File Explorer syncs your files, they land in a path that looks something like this:

C:\Users\<you>\OneLake - <workspace name>\<lakehouse name>.Lakehouse\Tables\<table name>

Keep that path in mind — you'll be passing it into every example below. Delta Lake tables are stored as folders containing multiple Parquet files plus a _delta_log/ directory, so make sure you're pointing at the table's root folder, not an individual file.

Readin...
If you've spent any time working with Microsoft Fabric, you know that navigating to the web portal every time you need to inspect, upload, or tweak a file gets old fast. OneLake File Explorer is Microsoft's answer to that friction — a lightweight Windows application that mounts your entire Fabric data estate directly in Windows File Explorer, the same way OneDrive handles your documents.

One... what?

OneLake is the unified data lake underpinning every Microsoft Fabric tenant. Unlike traditional architectures where teams maintain separate data lakes per domain or business unit, every Fabric tenant gets exactly one OneLake — one place where Lakehouses, Warehouses, KQL databases, and other Fabric items store their data. There's no need to copy data between engines; Spark, SQL, and Power BI all read from the same underlying storage.

The organizational hierarchy is straightforward: Tenant → Workspaces → Items (Lakehouses, Warehouses, etc.) → Files/Tables. This maps neat...