How DaFT Works


DaFT simplifies data integration by collecting, normalising, and exporting data from multiple sources into a structured format suitable for monitoring and analysis. Its modular design makes it straightforward to extend and to configure for different environments.

Workflow Overview

DaFT operates in three key stages:

  1. Data Collection – Fetching data from different sources such as databases, APIs, and files.
  2. Data Normalisation – Transforming the collected data into a unified structure.
  3. Data Export – Delivering the processed data in formats suitable for Prometheus, OpenMetrics, or JSON-based consumers.
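To make the flow concrete, here is a minimal sketch of how the three stages compose. This is an illustration only, not DaFT's actual API: the names (run_pipeline, collect, normalise, export) and the record shape are assumptions for this example.

```python
# Hypothetical sketch of DaFT's three-stage pipeline. All names here are
# illustrative; DaFT's real internals are not documented on this page.

def normalise(record: dict) -> dict:
    # Placeholder for stage 2; a fuller sketch appears in the
    # Data Normalisation section below.
    return {str(key).lower(): value for key, value in record.items()}

def run_pipeline(sources, exporters):
    # 1. Data Collection: every configured source yields raw records
    raw = [record for source in sources for record in source.collect()]
    # 2. Data Normalisation: transform records into a unified structure
    unified = [normalise(record) for record in raw]
    # 3. Data Export: hand the processed data to each configured exporter
    for exporter in exporters:
        exporter.export(unified)
```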

1. Data Collection

DaFT supports multiple input sources, including databases, APIs, and files.

Each source is defined in the configuration file, specifying connection details and query parameters. The system executes queries, reads files, or fetches API responses as required.
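As a rough illustration of this dispatch, a collector might branch on the configured source type. The configuration keys used here ("type", "dsn", "query", "path", "url") are invented for the sketch; DaFT's actual configuration format is documented separately.

```python
# Hypothetical collector for the three source kinds named above.
# The config keys are assumptions, not DaFT's documented schema.
import json
import sqlite3
import urllib.request

def collect(source_config: dict) -> list[dict]:
    kind = source_config["type"]
    if kind == "database":
        # Execute the configured query and return rows as dicts
        conn = sqlite3.connect(source_config["dsn"])
        conn.row_factory = sqlite3.Row
        rows = conn.execute(source_config["query"]).fetchall()
        return [dict(row) for row in rows]
    if kind == "file":
        # Read records from a local file (assumed JSON for this sketch)
        with open(source_config["path"]) as fh:
            return json.load(fh)
    if kind == "api":
        # Fetch the API response and parse it as JSON
        with urllib.request.urlopen(source_config["url"]) as resp:
            return json.loads(resp.read())
    raise ValueError(f"unknown source type: {kind}")
```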

2. Data Normalisation

Once the data is collected, DaFT processes it so that records from every source share a consistent shape.

This stage guarantees that all data, regardless of its origin, is transformed into a predictable and usable format.
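To make "unified structure" concrete, a minimal normalisation step might flatten nested values and lower-case field names so that differently shaped inputs map onto the same output. The target shape here is an assumption for the sketch, not DaFT's specification:

```python
# Minimal normalisation sketch; the target shape (a flat dict of
# lower-cased keys and scalar values) is an assumption, not DaFT's spec.

def normalise(record: dict, prefix: str = "") -> dict:
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}".lower()
        if isinstance(value, dict):
            # Flatten nested structures into prefixed keys
            flat.update(normalise(value, prefix=f"{name}_"))
        else:
            flat[name] = value
    return flat

# Two differently shaped inputs map onto the same predictable structure:
print(normalise({"Host": "db1", "Load": {"avg1": 0.4}}))
# {'host': 'db1', 'load_avg1': 0.4}
print(normalise({"host": "db2", "load_avg1": 0.9}))
# {'host': 'db2', 'load_avg1': 0.9}
```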

3. Data Export

After normalisation, DaFT exports the data in formats suited to Prometheus, OpenMetrics, and JSON-based consumers.

Users can access the data via HTTP endpoints, ensuring compatibility with existing monitoring and analytics systems.
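For example, the same normalised records can be rendered both as Prometheus-style exposition text and as a JSON document; in practice these would be served from the HTTP endpoints mentioned above. The metric name prefix "daft_" and the labelling convention are hypothetical choices for this sketch:

```python
# Sketch of two export formats built from the same normalised records.
# The "daft_" metric prefix and label selection are illustrative only.
import json

def to_prometheus(records: list[dict]) -> str:
    lines = []
    for record in records:
        for key, value in record.items():
            if isinstance(value, (int, float)):
                # Use the record's string fields as Prometheus labels
                labels = ",".join(
                    f'{k}="{v}"' for k, v in record.items() if isinstance(v, str)
                )
                lines.append(f"daft_{key}{{{labels}}} {value}")
    return "\n".join(lines) + "\n"

def to_json(records: list[dict]) -> str:
    return json.dumps(records)

records = [{"host": "db1", "load_avg1": 0.4}]
print(to_prometheus(records))  # daft_load_avg1{host="db1"} 0.4
print(to_json(records))        # [{"host": "db1", "load_avg1": 0.4}]
```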

Key Benefits

By following this structured approach, DaFT ensures that data integration remains consistent, reliable, and scalable.