Azure Blob Storage Sink Node
Quick Reference
Connection String
Full Azure Storage connection string. Provide this or select a credential below.
ex: DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=...;EndpointSuffix=core.windows.net
Use Credentials
Select a Username/Password credential where the username is the storage account name and the password is the storage account key. Used when no connection string is supplied.
Container Name
The Azure Blob Storage container the node writes to.
ex: workflow-output
Blob Name Prefix
Prefix prepended to each generated blob name. Defaults to events/.
ex: workflow-output/
Batch Size
How many records the node collects before writing a batch as a single blob. Defaults to 100.
The azureblobsink node writes workflow records to an Azure Blob Storage container as newline-delimited JSON blobs, batching records to reduce the number of blob write operations.
Configuration
| Field | Description | Required | Placeholder |
|---|---|---|---|
| Connection String | Full Azure Storage connection string. When set, takes priority over the selected credential. | Either this or Use Credentials | DefaultEndpointsProtocol=https;AccountName=mystorage;AccountKey=... |
| Use Credentials | Select or create a Username/Password credential. Username = Storage Account Name, Password = Storage Account Key. Used only when Connection String is empty. | Either this or Connection String | azure-blob-credential |
| Container Name | Name of the Azure Blob Storage container the node writes to. | Yes | workflow-output |
| Blob Name Prefix | Prefix prepended to every generated blob name. Use it to organize the written blobs into a virtual folder. | No | events/ |
| Batch Size | Number of records buffered in memory before the node flushes them as a single blob to Azure Storage. Minimum: 1. | No | 100 |
Connection String
Paste the full Azure Storage connection string. The connection string contains both the account identity and the secret, and takes priority when both Connection String and Use Credentials are provided.
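A connection string is a semicolon-delimited list of key=value pairs. As an illustration of its structure (the `parse_connection_string` helper is a sketch for this doc, not part of the node), the fields from the example above can be pulled apart like this:

```python
def parse_connection_string(conn_str: str) -> dict:
    """Split an Azure-style connection string into a dict of its key=value parts.

    partition("=") splits only at the first "=", so base64 account keys that
    end in "=" padding survive intact.
    """
    parts = {}
    for segment in conn_str.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts


cfg = parse_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=mystorage;"
    "AccountKey=abc123==;EndpointSuffix=core.windows.net"
)
print(cfg["AccountName"])  # mystorage
```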
Use Credentials
Select an existing Username/Password credential from the dropdown or create a new one.
- Username: Azure Storage account name (for example mystorage).
- Password: Storage account access key (the long base64 value from the Azure Portal under Access keys).
The credential needs write access to the target container.
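Conceptually, the credential carries the same two values a connection string does. A minimal sketch of how the account name and key map onto the connection-string form shown earlier (the helper and the `https`/`core.windows.net` defaults are assumptions for illustration, not the node's internals):

```python
def connection_string_from_credential(account_name: str, account_key: str) -> str:
    """Assemble the connection-string form the node accepts directly,
    from a credential's username (account name) and password (account key)."""
    return (
        f"DefaultEndpointsProtocol=https;AccountName={account_name};"
        f"AccountKey={account_key};EndpointSuffix=core.windows.net"
    )


conn = connection_string_from_credential("mystorage", "abc123==")
print(conn)
```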
Container Name
The exact name of the Blob Storage container. The container must exist before the workflow runs.
Blob Name Prefix
The prefix is prepended to every blob the node writes. Final blob names use the pattern:
{prefix}{yyyy/MM/dd/HH-mm-ss}-{uuid}.jsonl
For example, with prefix events/, a blob might be named events/2026/04/27/12-34-56-3f1a8b9d-....jsonl (timestamp in UTC). Include a trailing / if you want the prefix to act as a virtual folder.
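The naming pattern above can be sketched with the standard library (the `make_blob_name` helper is illustrative, not the node's actual code):

```python
import uuid
from datetime import datetime, timezone


def make_blob_name(prefix: str) -> str:
    """Build a blob name following {prefix}{yyyy/MM/dd/HH-mm-ss}-{uuid}.jsonl,
    with the timestamp taken in UTC as the node does."""
    ts = datetime.now(timezone.utc).strftime("%Y/%m/%d/%H-%M-%S")
    return f"{prefix}{ts}-{uuid.uuid4()}.jsonl"


name = make_blob_name("events/")
print(name)  # e.g. events/2026/04/27/12-34-56-3f1a8b9d-....jsonl
```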
Batch Size
Records accumulate in memory until the batch reaches this size, at which point the node serializes them as newline-delimited JSON and uploads the batch as a single blob. Larger batches reduce the number of blob writes but increase memory usage and end-to-end latency.
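The accumulate-then-flush behavior can be sketched as follows; `BatchBuffer` and the `upload` callback are stand-ins for the node's internal buffering and the actual blob write, not its real API:

```python
import json


class BatchBuffer:
    """Accumulate records and flush them as one newline-delimited JSON
    payload once batch_size is reached."""

    def __init__(self, batch_size: int, upload):
        self.batch_size = max(1, batch_size)  # the node enforces a minimum of 1
        self.upload = upload                  # stand-in for the blob write
        self.records = []

    def add(self, record: dict) -> None:
        self.records.append(record)
        if len(self.records) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        """Serialize the buffered records as NDJSON and hand them to upload."""
        if not self.records:
            return
        payload = "\n".join(json.dumps(r) for r in self.records) + "\n"
        self.upload(payload)
        self.records = []


uploads = []
buf = BatchBuffer(batch_size=2, upload=uploads.append)
for i in range(5):
    buf.add({"id": i})
buf.flush()  # flush the final partial batch
print(len(uploads))  # 3 blobs: records [0,1], [2,3], [4]
```

Raising `batch_size` in this sketch produces fewer, larger payloads, which mirrors the trade-off described above: fewer blob writes at the cost of more buffered memory and later delivery.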
Examples
Example: Archive Workflow Output
Use this node when you want to persist every record processed by the workflow to Azure Blob Storage for long-term storage or downstream processing.
- Provide either the storage connection string or a credential containing the account name and key.
- Enter the destination container name.
- Set Blob Name Prefix to a folder that identifies the workflow (for example archive/orders/).
- Increase Batch Size if the workflow is high-volume and you want fewer, larger blobs.
Example: Date-Partitioned Output
The default blob naming already includes a yyyy/MM/dd path component, so a prefix such as landing/ produces day-partitioned virtual folders without any extra configuration. Downstream tools that expect Hive-style partitions can read directly from the produced layout.
Output Format
The sink writes blobs as newline-delimited JSON (.jsonl). Each line in the blob is one record from the workflow batch.
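Because each line is an independent JSON document, a downstream consumer can parse a blob line by line. A minimal sketch (the sample records are invented for illustration):

```python
import json

# Contents of a blob as this sink would write it: one JSON record per line.
blob_text = '{"id": 1, "status": "ok"}\n{"id": 2, "status": "ok"}\n'

records = [json.loads(line) for line in blob_text.splitlines() if line]
print(records[1]["id"])  # 2
```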
Related Nodes
- Azure Blob Storage Source: Read blobs from an Azure Blob Storage container
- GCS Sink: Write workflow records to a Google Cloud Storage bucket
- S3 Sink: Write workflow records to AWS S3