Azure Blob Storage Source Node
Quick Reference
Connection String
Full Azure Storage connection string. Provide this or select a credential below.
ex: DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=...;EndpointSuffix=core.windows.net
Use Credentials
Select a Username/Password credential where the username is the storage account name and the password is the storage account key. Used when no connection string is supplied.
Container Name
The Azure Blob Storage container the node reads from.
ex: workflow-input
Blob Prefix
Optional prefix used to filter which blobs in the container are read. Leave blank to read every blob.
ex: logs/2026/04/
Encoding Type
Format the node uses to decode the content of each blob.
ex: JSON_OBJECT_LINE for newline-delimited JSON
The azureblobsource node reads blobs from an Azure Blob Storage container and emits each blob's decoded content as workflow records.
Overview
The Azure Blob Source connector pulls data stored in Azure Blob Storage into your workflow. It works by listing blobs in the configured container (optionally filtered by a prefix), downloading each blob, and decoding its content using the selected encoding type.
This source is designed for batch ingestion. When the workflow runs, the connector authenticates with Azure Storage, iterates through the matching blobs, and emits the records contained in each blob.
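The list-filter-decode flow described above can be sketched in a few lines. This is an illustrative, in-memory stand-in only: the `container` dict and `read_blobs` helper are hypothetical and do not represent the connector's internal API.

```python
import json

# Hypothetical in-memory stand-in for a container: blob name -> raw bytes.
container = {
    "logs/2026/04/events-01.jsonl": b'{"id": 1}\n{"id": 2}\n',
    "logs/2026/05/events-02.jsonl": b'{"id": 3}\n',
}

def read_blobs(container, prefix="", encoding="JSON_OBJECT_LINE"):
    """List blobs matching the prefix, decode each, and return all records."""
    records = []
    for name in sorted(container):
        if not name.startswith(prefix):
            continue  # prefix filter: skip blobs outside the configured scope
        data = container[name].decode("utf-8")
        if encoding == "JSON_OBJECT_LINE":
            # newline-delimited JSON: one record per non-empty line
            records.extend(json.loads(line) for line in data.splitlines() if line)
    return records

print(read_blobs(container, prefix="logs/2026/04/"))
# → [{'id': 1}, {'id': 2}]
```

With an empty prefix the same call would return the records from every blob in the container.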
Prerequisites
Before configuring the source, ensure you have:
- Authentication: either an Azure Storage connection string or a credential containing the storage account name and storage account key.
- Read access on the target container. The account or shared key must allow listing and reading blobs in the container.
Configuration
| Field | Description | Required | Placeholder |
|---|---|---|---|
| Connection String | Full Azure Storage connection string. When set, takes priority over the selected credential. | Either this or Use Credentials | DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=... |
| Use Credentials | Select or create a Username/Password credential. Username = Storage Account Name, Password = Storage Account Key. Used only when Connection String is empty. | Either this or Connection String | azure-blob-credential |
| Container Name | Name of the Azure Blob Storage container the node reads from. | Yes | workflow-input |
| Blob Prefix | Prefix used to filter the blobs listed in the container. Only blobs whose name starts with this value are read. Leave blank to read every blob. | No | logs/2026/04/ |
| Encoding Type | Format used to decode each blob's content into workflow records. | Yes | JSON_OBJECT_LINE |
Connection String
Paste the full Azure Storage connection string. The connection string contains both the account identity and the secret, and takes priority when both Connection String and Use Credentials are provided.
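To see what the connection string carries, it can be split into key/value pairs. The `parse_connection_string` helper below is an illustrative sketch, not part of the connector:

```python
def parse_connection_string(conn_str):
    """Split 'Key=Value;Key=Value' into a dict. Account keys are base64 and
    may themselves contain '=', so split each pair on the first '=' only."""
    return dict(pair.split("=", 1) for pair in conn_str.split(";") if pair)

conn = ("DefaultEndpointsProtocol=https;AccountName=mystorage;"
        "AccountKey=abc123==;EndpointSuffix=core.windows.net")
parts = parse_connection_string(conn)
print(parts["AccountName"])  # → mystorage
print(parts["AccountKey"])   # → abc123==
```

Because the string bundles both the account identity (`AccountName`) and the secret (`AccountKey`), supplying it alone is sufficient to authenticate.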
Use Credentials
Select an existing Username/Password credential from the dropdown or create a new one.
- Username: Azure Storage account name (for example, mystorage).
- Password: Storage account access key (the long base64 value from the Azure Portal under Access keys).
The credential is used to construct a connection string at runtime when Connection String is left empty.
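A runtime-assembled connection string of this kind might look like the sketch below. The `build_connection_string` helper and its default endpoint suffix are assumptions for illustration, not the connector's actual code:

```python
def build_connection_string(account_name, account_key,
                            endpoint_suffix="core.windows.net"):
    """Assemble a shared-key connection string from the credential's
    username (account name) and password (account key)."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        f"EndpointSuffix={endpoint_suffix}"
    )

print(build_connection_string("mystorage", "abc123=="))
```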
Container Name
The exact name of the Blob Storage container. The container must exist before the workflow runs.
Blob Prefix
Use the prefix to scope the read to a subset of the container — for example, a date-partitioned virtual folder. The connector matches blobs whose name starts with the prefix exactly, so include any trailing / if you want to limit the read to a folder.
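The trailing `/` matters because matching is a plain starts-with comparison. A small sketch (the blob names are made up):

```python
blob_names = [
    "logs/2026/04/a.jsonl",
    "logs/2026/04/b.jsonl",
    "logs/2026/04-backup/a.jsonl",
]

# Without the trailing slash, the '04-backup' virtual folder also matches.
print([n for n in blob_names if n.startswith("logs/2026/04")])

# With the trailing slash, only blobs inside the '04/' virtual folder match.
print([n for n in blob_names if n.startswith("logs/2026/04/")])
# → ['logs/2026/04/a.jsonl', 'logs/2026/04/b.jsonl']
```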
Encoding Type
Choose the format of the blobs you are reading. Common choices:
- JSON_OBJECT_LINE — newline-delimited JSON, one record per line.
- JSON_ARRAY — a JSON array where each element becomes a record.
- JSON_OBJECT — a single JSON object per blob.
- CSV — comma-separated rows; the first row is treated as the header.
- STRING_LINE / TEXT — plain text, one line per record.
- XML / PARQUET — also supported.
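How a few of these encodings turn one blob's text into records can be sketched as follows. The `decode_blob` helper is illustrative and only covers a subset of the formats above; the connector's actual decoding logic may differ:

```python
import csv
import io
import json

def decode_blob(data, encoding):
    """Decode one blob's text into a list of records (illustrative subset)."""
    if encoding == "JSON_OBJECT_LINE":
        # one JSON object per non-empty line
        return [json.loads(line) for line in data.splitlines() if line.strip()]
    if encoding == "JSON_ARRAY":
        # each array element becomes a record
        return list(json.loads(data))
    if encoding == "JSON_OBJECT":
        # the whole blob is a single record
        return [json.loads(data)]
    if encoding == "CSV":
        # first row is the header; each following row becomes a dict record
        return list(csv.DictReader(io.StringIO(data)))
    if encoding in ("STRING_LINE", "TEXT"):
        # plain text: one line per record
        return data.splitlines()
    raise ValueError(f"unsupported encoding: {encoding}")

print(decode_blob('{"a": 1}\n{"a": 2}\n', "JSON_OBJECT_LINE"))  # → [{'a': 1}, {'a': 2}]
print(decode_blob("a,b\n1,2\n", "CSV"))                          # → [{'a': '1', 'b': '2'}]
```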
Examples
Example: Replay Archived Events
Use this node when historical events have been archived to Azure Blob Storage and you want to reprocess them through the workflow.
- Provide either the storage connection string or a credential containing the account name and key.
- Enter the container name where the archive lives.
- Set Blob Prefix to the date range you want to replay (for example, events/2026/04/).
- Choose JSON_OBJECT_LINE if each archived blob is a .jsonl export.
Example: Ingest a Single Blob
If you only want to process one blob in the container, set Blob Prefix to the full blob name. The connector will list only that blob and read it once.
Related Nodes
- Azure Blob Storage Sink: Write workflow records back to an Azure Blob Storage container
- GCS Source: Read objects from a Google Cloud Storage bucket
- S3 Sink: Write workflow records to AWS S3