Create ADLS External Tables in Azure Synapse

Create a Master Key if not present.
Why?
Synapse requires database scoped credentials to be encrypted.
The Master Key ensures that credentials are securely stored.
✅ Needed only once per database.
✅ Required for setting up authentication.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'YourStrong!Passw0rd'; -- must meet password complexity requirements
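To check whether a master key already exists before creating one, you can query sys.symmetric_keys (a minimal sketch; ##MS_DatabaseMasterKey## is the default name of the database master key):
-- Returns a row if the database master key already exists
SELECT name
FROM sys.symmetric_keys
WHERE name = '##MS_DatabaseMasterKey##';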
Add a storage account key, SAS token, or Managed Identity credential.
Why?
Uses the Access Key of the Storage Account.
Requires adding the Access Key in the CREDENTIAL.
CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'Storage Account Key',
SECRET = 'ddfgyhjuKG1Mr6SgtfvfvtbvfdmJ=';
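Since this step also mentions SAS tokens and Managed Identity, here is a sketch of those two variants (the credential names below are illustrative, not from the original):
-- SAS token variant: IDENTITY must be exactly 'SHARED ACCESS SIGNATURE';
-- paste the token without the leading '?'
CREATE DATABASE SCOPED CREDENTIAL SasCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=...&ss=b&srt=co&sp=rl&sig=...';
-- Managed Identity variant: no SECRET is needed; the workspace identity
-- must have Storage Blob Data Reader (or higher) on the storage account
CREATE DATABASE SCOPED CREDENTIAL ManagedIdentityCredential
WITH IDENTITY = 'Managed Identity';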
Create an external data source. The path must use the abfss:// scheme.
Why?
Defines where Synapse should look for the data.
Maps Synapse to the storage account (ADLS Gen2).
Uses the credential for authentication.
✅ Links Synapse to your Blob Storage.
✅ Required before creating external tables.
-- Drop the data source first if it already exists
IF EXISTS (SELECT * FROM sys.external_data_sources WHERE name = 'AzureDataLakeStore')
DROP EXTERNAL DATA SOURCE [AzureDataLakeStore]
GO
CREATE EXTERNAL DATA SOURCE [AzureDataLakeStore] WITH (
LOCATION = N'abfss://ext-test@abcpoc.dfs.core.windows.net',
CREDENTIAL = BlobStorageCredential
)
GO
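To verify that the data source was registered, you can query the catalog view sys.external_data_sources:
-- Lists all external data sources in the current database
SELECT name, location
FROM sys.external_data_sources;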
Create an external file format.
Why?
Tells Synapse how to read the files (e.g., CSV/delimited text, Parquet, ORC).
Each format has different structures and metadata handling.
Without this step, Synapse doesn’t know how to interpret the file.
-- Create external file format Parquet
CREATE EXTERNAL FILE FORMAT ParquetFileFormat
WITH (
FORMAT_TYPE = PARQUET
);
-- Create external file format CSV
CREATE EXTERNAL FILE FORMAT CSVFormat
WITH (
FORMAT_TYPE = DELIMITEDTEXT,
FORMAT_OPTIONS (FIELD_TERMINATOR = ';', FIRST_ROW = 1) -- set FIRST_ROW = 2 to skip a header row
);
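You can confirm both formats were created by querying sys.external_file_formats:
-- Lists the external file formats defined in the current database
SELECT name, format_type
FROM sys.external_file_formats;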
Create an external table.
Why?
External tables map to Blob Storage files but do not store data in Synapse.
Queries run directly on storage, avoiding unnecessary data movement.
CREATE EXTERNAL TABLE dbo.names_parquet
(
column0 varchar(255) NULL,
column1 varchar(255) NULL
)
WITH (
LOCATION = '/parquet/',
DATA_SOURCE = AzureDataLakeStore,
FILE_FORMAT = ParquetFileFormat
)
GO
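Once created, the external table can be queried like any regular table while the data stays in storage:
-- Sample query against the external table
SELECT TOP 10 *
FROM dbo.names_parquet;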