Microsoft Fabric Lakehouse Connector

Guide to connecting MS Fabric Lakehouse

The Microsoft Fabric Lakehouse Connector allows Savant to write data into Microsoft Fabric Lakehouses using OneLake APIs and Azure Storage permissions. This connector is designed for data ingestion and pipeline outputs, enabling Savant workflows to deliver processed data directly into Fabric Lakehouse tables or files.

This connector supports Microsoft Entra ID (Azure AD) OAuth authentication through a registered Azure application.

Note:

This connector supports write operations only. To read data from Microsoft Fabric, use the Microsoft Fabric Connector.

Features

The Microsoft Fabric Lakehouse Connector enables you to:

  • Write data from Savant pipelines into Fabric Lakehouse tables or files

  • Authenticate securely using Microsoft Entra ID OAuth

  • Deliver processed data directly into OneLake storage

  • Automate Lakehouse ingestion through Savant workflows

  • Integrate Fabric Lakehouse into ETL pipelines and data automation workflows

Requirements

Before configuring the connector, ensure the following prerequisites are completed.

Savant accesses your Lakehouse from the following IP addresses:

  • 35.188.163.165

  • 34.122.126.216

If your environment restricts inbound connections, ensure these IP addresses are added to your network allowlist or firewall rules.

Authentication Method

The connector supports:

  • Microsoft Entra ID (Azure AD) OAuth

This requires registering an Azure Application (Service Principal) that Savant will use to access the Fabric Lakehouse.

Register a Savant Application in Azure

Follow these steps to create the required application in Azure.

Step 1 — Create an App Registration

  1. Go to Azure Portal

  2. Navigate to Microsoft Entra ID → App registrations

  3. Click + New registration

Configure the following:

  • Name: Fabric-Lakehouse-Connector

  • Supported account types: Accounts in this organizational directory only

  • Redirect URI: see below

Redirect URI

Use the appropriate URI depending on your Savant environment.

After the app is registered, save the following values:

  • Application (Client) ID

  • Directory (Tenant) ID

These will be required when configuring the connector in Savant.

Step 2 — Create a Client Secret

  1. Go to Certificates & secrets

  2. Click + New client secret

  3. Enter a description

  4. Select an expiration period

  5. Click Add

Copy the generated Client Secret value immediately, as it will not be visible again.

Step 3 — Configure API Permissions

Next, configure the required API permissions.

  1. Navigate to Manage → API permissions

  2. Click + Add a permission

  3. Select APIs my organization uses

  4. Choose Azure Storage

  5. Select Delegated permissions

  6. Add the following permissions:

user_impersonation
Lakehouse.Execute.All
Lakehouse.ReadWrite.All
Workspace.ReadWrite.All
Code.AccessStorage.All
Code.AccessFabric.All
Code.AccessAzureDataLake.All
Code.AccessAzureKeyvault.All
Code.AccessAzureDataExplorer.All
offline_access
openid

Once the permissions are added:

  1. Click Grant admin consent

  2. Confirm the consent request.

Important:

Lakehouse uses Azure Storage API permissions, not Azure SQL Database permissions, because Lakehouse data is accessed via OneLake (ADLS Gen2).
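For illustration, the sketch below shows how a registered app's credentials are exchanged for an Azure Storage token at the Microsoft identity platform v2.0 token endpoint. This is a minimal sketch, not Savant's actual implementation: `https://storage.azure.com/.default` is the app-only (client-credentials) form of the delegated `user_impersonation` permission configured above, and the placeholder IDs are hypothetical.

```python
from urllib.parse import urlencode

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the Microsoft identity platform v2.0 token request
    (client-credentials grant) for the Azure Storage resource."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # App-only equivalent of the delegated Azure Storage permission:
        "scope": "https://storage.azure.com/.default",
    })
    return url, body

# Hypothetical placeholder values for illustration only:
url, body = build_token_request("<tenant-id>", "<client-id>", "<client-secret>")
print(url)
# https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token

# POSTing `body` to `url` as application/x-www-form-urlencoded returns a
# JSON response containing `access_token`.
```

Flows that act on behalf of a signed-in user (as the delegated permissions and `offline_access`/`openid` scopes above suggest) use the authorization-code grant instead, but the token endpoint and resource scope are the same.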

Step 4 — Grant Workspace Permissions

The service principal must be added to the Fabric workspace.

  1. Open your Fabric Workspace

  2. Click Manage Access (top right)

  3. Click Add people or groups

  4. Add the App Registration (Service Principal)

Assign one of the following roles:

  • Admin: full control, including workspace management

  • Member: full CRUD access to data

  • Contributor: read, write, and delete data

Contributor is the minimum role required for write operations.

Click Add.

Step 5 — Grant Lakehouse-Level Permissions

Additional permissions must be granted on the Lakehouse item.

  1. Open the Lakehouse in your Fabric workspace

  2. Click Manage Access or go to Settings → Permissions

  3. Add the Service Principal

  4. Grant the following permissions:

  • Read all Apache Spark and subscribe to events

  • Read all SQL endpoint data

  • Build reports on the default semantic model

  • Execute Apache Spark jobs on lakehouse

Step 6 — Find the Workspace ID and Lakehouse ID

The Workspace ID and Lakehouse ID can be obtained from the Fabric portal URL.

URL Format

https://app.fabric.microsoft.com/groups/<workspace-id>/lakehouses/<lakehouse-id>

Example

https://app.fabric.microsoft.com/groups/a1b2c3d4-e5f6-7890-abcd-ef1234567890/lakehouses/f9e8d7c6-b5a4-3210-fedc-ba0987654321

From the URL:

  • Workspace ID: the GUID after /groups/

  • Lakehouse ID: the GUID after /lakehouses/

Example values:

  • Workspace ID: a1b2c3d4-e5f6-7890-abcd-ef1234567890

  • Lakehouse ID: f9e8d7c6-b5a4-3210-fedc-ba0987654321
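The extraction described in this step can be scripted. A minimal sketch, assuming the portal URL follows the /groups/<guid>/lakehouses/<guid> pattern described above:

```python
import re

def parse_fabric_url(url: str):
    """Extract the Workspace ID (GUID after /groups/) and Lakehouse ID
    (GUID after /lakehouses/) from a Fabric portal URL."""
    guid = r"[0-9a-fA-F-]{36}"
    ws = re.search(rf"/groups/({guid})", url)
    lh = re.search(rf"/lakehouses/({guid})", url)
    if not (ws and lh):
        raise ValueError("URL does not match /groups/<guid>/lakehouses/<guid>")
    return ws.group(1), lh.group(1)

url = ("https://app.fabric.microsoft.com/groups/"
       "a1b2c3d4-e5f6-7890-abcd-ef1234567890/lakehouses/"
       "f9e8d7c6-b5a4-3210-fedc-ba0987654321")
workspace_id, lakehouse_id = parse_fabric_url(url)
print(workspace_id)  # a1b2c3d4-e5f6-7890-abcd-ef1234567890
print(lakehouse_id)  # f9e8d7c6-b5a4-3210-fedc-ba0987654321
```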

Step 7 — Configure the Connector in Savant

When creating the Microsoft Fabric Lakehouse connection in Savant, provide the following information.

  • Application (Client) ID: from the Azure App Registration

  • Directory (Tenant) ID: from the Azure App Registration

  • Client Secret: from Certificates & secrets

  • Workspace ID: from the Fabric portal URL

  • Lakehouse ID: from the Fabric portal URL

Important Notes

  • The Microsoft Fabric Lakehouse Connector is a write-only connector (destination connector).

  • All data reads from Fabric should use the Microsoft Fabric Connector.

  • Ensure the service principal has Contributor or higher role in the Fabric workspace.

  • Lakehouse access is performed through OneLake APIs using Azure Storage permissions.
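Because Lakehouse access goes through OneLake's ADLS Gen2-compatible endpoint, destinations can be addressed with standard DFS-style paths. The sketch below shows how the Workspace ID and Lakehouse ID from Step 6 map onto OneLake paths, assuming GUID-based addressing (with GUIDs, the `.Lakehouse` item-type suffix used with display names is not needed); the file path is hypothetical:

```python
def onelake_paths(workspace_id: str, lakehouse_id: str, file_path: str):
    """Build the HTTPS (DFS) and abfss forms of a OneLake file path.
    OneLake exposes an ADLS Gen2-compatible endpoint at
    onelake.dfs.fabric.microsoft.com; workspace and item GUIDs can be
    used in place of display names."""
    https = (f"https://onelake.dfs.fabric.microsoft.com/"
             f"{workspace_id}/{lakehouse_id}/Files/{file_path}")
    abfss = (f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/"
             f"{lakehouse_id}/Files/{file_path}")
    return https, abfss

# Example GUIDs from Step 6; the file name is hypothetical:
ws = "a1b2c3d4-e5f6-7890-abcd-ef1234567890"
lh = "f9e8d7c6-b5a4-3210-fedc-ba0987654321"
https_url, abfss_url = onelake_paths(ws, lh, "staging/output.parquet")
print(https_url)
print(abfss_url)
```

Tables live under the item's `Tables/` folder rather than `Files/`; the same path structure applies.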

Troubleshooting

  • Verify that the Application (Client) ID, Directory (Tenant) ID, and Client Secret are entered correctly and that the client secret has not expired.

  • Ensure that Admin Consent has been granted for all required API permissions in the Azure App Registration.

  • Confirm that the service principal is added to the Fabric workspace with at least the Contributor role.

  • Verify that the required permissions are granted directly on the Lakehouse item in the Fabric workspace.

  • Ensure the Workspace ID and Lakehouse ID are copied correctly from the Fabric portal URL.

  • Confirm that the Savant IP addresses (35.188.163.165 and 34.122.126.216) are allowlisted in your firewall or network configuration.

  • Verify that the target Lakehouse destination configuration is correct and that the service principal has write permissions.

  • Ensure that the Azure Storage API permissions are configured correctly since Lakehouse access uses OneLake (ADLS Gen2).
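Several of these checks can be run before attempting a connection. A minimal pre-flight sketch that validates the GUID-shaped fields and flags obviously missing values; the field names are hypothetical, not Savant's actual configuration keys:

```python
import re

# Strict 8-4-4-4-12 hexadecimal GUID pattern:
GUID = re.compile(r"^[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}$")

def preflight(config: dict) -> list:
    """Return a list of problems found in the connector configuration."""
    problems = []
    for field in ("application_client_id", "directory_tenant_id",
                  "workspace_id", "lakehouse_id"):
        value = config.get(field, "")
        if not GUID.match(value):
            problems.append(f"{field} is not a valid GUID: {value!r}")
    if not config.get("client_secret"):
        problems.append("client_secret is empty")
    return problems

# Hypothetical configuration with two deliberate errors:
config = {
    "application_client_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
    "directory_tenant_id": "not-a-guid",
    "workspace_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
    "lakehouse_id": "f9e8d7c6-b5a4-3210-fedc-ba0987654321",
    "client_secret": "",
}
problems = preflight(config)
for p in problems:
    print(p)
```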
