📄️ Fleak Eval Node
The Fleak Eval Node (Evaluation Node) allows you to transform and extract data from your workflow events using expressions. It is designed as a flexible step where you can manipulate data without writing code.
📄️ Filter Node
The filter command allows you to selectively process events in your ZephFlow pipeline based on specific conditions. It evaluates each incoming event against an expression written in the Fleak Eval Expression language.
📄️ Parser Node
The Parser Node transforms raw text data (strings) into structured data (JSON objects). This is an essential step in log processing, allowing you to extract specific fields like IP addresses.
📄️ SQL Node
The Fleak SQL Node allows you to manipulate and transform data within your event workflows using standard SQL syntax. It is built on FleakSQL, a custom engine designed specifically for event processing.
📄️ Kafka Source Node
The Kafka Source Node allows your workflow to ingest real-time streaming data from an Apache Kafka cluster. It acts as the starting point for your data pipeline, continuously polling a specified topic for new messages.
📄️ Splunk Source Node
The Splunk Source connector allows you to ingest data directly from Splunk Enterprise or Splunk Cloud Platform into your workflow. It works by submitting a search job to the Splunk API and retrieving the results.
📄️ Kafka Sink Node
The Kafka Sink Node allows you to write processed data records from your Fleak workflow directly into an Apache Kafka topic.
📄️ Delta Sink Node
The Delta Lake Sink Node enables you to write processed workflow data directly into a physical storage location in the Delta Lake table format.
📄️ Databricks Sink Node
The Databricks Sink Node allows you to ingest processed data directly into Databricks Unity Catalog tables. Unlike direct file writers, this node leverages the Databricks SQL compute engine to ensure data consistency.