📄️ Eval Node
The Eval node is a powerful transformation component in ZephFlow that applies expressions to modify, filter, or enrich incoming events. It uses the Fleak Eval Expression Language (FEEL) to transform data.
📄️ Filter Node
The `filter` node allows you to selectively process events in your ZephFlow pipeline based on specific conditions. It evaluates each incoming event against an expression written in the Fleak Eval Expression Language (FEEL).
📄️ Assertion Node
The `assertion` node allows you to validate data flowing through your ZephFlow pipeline against specified conditions. It evaluates each incoming event against an expression written in the Fleak Eval Expression Language (FEEL).
📄️ Parser Node
The `parser` node is a versatile component in ZephFlow that extracts structured data from string fields in your events. It can parse various log formats into structured key-value pairs.
📄️ Kafka Source Node
The `kafkaSource` node in ZephFlow enables you to consume data from Kafka topics, providing a seamless integration point for processing streaming data from Kafka in your data pipelines.
📄️ Kafka Sink Node
The `kafkaSink` node in ZephFlow enables you to publish processed data to Kafka topics, providing a seamless integration point for sending data from your pipeline to Kafka streams.
📄️ SQS Source Node
The `sqssource` node continuously reads messages from an Amazon SQS queue using long polling and emits them as events into the pipeline. Messages are automatically deleted after successful processing.
📄️ SQS Sink Node
The `sqssink` node sends pipeline records as messages to an Amazon SQS queue using the batch send API. Each record becomes one SQS message. Both standard and FIFO queues are supported.
📄️ JDBC Source Node
The `jdbcsource` node reads data from any JDBC-compatible relational database. It supports both batch mode (one-time full table read) and streaming mode (continuous polling with watermark-based incremental reads).
📄️ JDBC Sink Node
The `jdbcsink` node writes records to any JDBC-compatible relational database table. It supports `INSERT` and `UPSERT` (insert-or-update) write modes and processes records in configurable batches.
📄️ GCS Source Node
The `gcssource` node reads objects from a Google Cloud Storage bucket and emits each object's decoded content as records in the pipeline. It is a batch source — it lists every matching blob in the bucket, processes each one, and then completes.
📄️ GCS Sink Node
The `gcssink` node writes pipeline records to a Google Cloud Storage bucket as newline-delimited JSON (`.jsonl`) blobs. It batches records in memory and uploads each batch as a single object.
📄️ Elasticsearch Source Node
The `elasticsearchsource` node queries an Elasticsearch index using the scroll API and emits each matching document as a record in the pipeline. It is a batch source — it pages through every hit until the scroll is exhausted, then completes.
📄️ Elasticsearch Sink Node
The `elasticsearchsink` node writes pipeline records to an Elasticsearch index using the Bulk API. Records are buffered in memory and flushed as a single NDJSON bulk request when the configured batch size is reached.
📄️ Azure Blob Storage Source Node
The `azureblobsource` node reads blobs from an Azure Blob Storage container and emits each blob's decoded content as records in the pipeline. It is a batch source — it lists every matching blob in the container, processes each one, and then completes.
📄️ Azure Blob Storage Sink Node
The `azureblobsink` node writes pipeline records to an Azure Blob Storage container as newline-delimited JSON (`.jsonl`) blobs. It batches records in memory and uploads each batch as a single blob.
📄️ Splunk Source Node
The `splunkSource` node in ZephFlow enables you to ingest data from Splunk Enterprise or Splunk Cloud Platform by executing SPL search queries against the Splunk REST API.
📄️ LDAP Source Node
The `ldapsource` node queries an LDAP/Active Directory server and emits each matching entry as a record in the pipeline. It supports paged result sets for large directories and flexible search scopes.
📄️ S3 Sink Node
The `s3Sink` node in ZephFlow enables you to write processed data to Amazon S3 (or S3-compatible storage) with built-in batching, date-partitioned keys, and multiple encoding formats.
📄️ Delta Lake Sink Node
The `deltalakeSink` node in ZephFlow enables you to write processed data directly to a Delta Lake table on cloud storage or HDFS.
📄️ Databricks Sink Node
The `databricksSink` node in ZephFlow enables you to ingest processed data into Databricks Unity Catalog tables using the Databricks SQL Compute engine.