DaFT simplifies data integration by collecting, normalising, and exporting data from multiple sources into a structured format suitable for monitoring and analysis. It follows a modular approach, allowing seamless extension and configuration to suit various environments.
Workflow Overview
DaFT operates in three key stages:
- Data Collection – Fetching data from different sources such as databases, APIs, and files.
- Data Normalisation – Transforming the collected data into a unified structure.
- Data Export – Delivering the processed data in formats suitable for Prometheus, OpenMetrics, or JSON-based consumers.
1. Data Collection
DaFT supports multiple input sources:
- Databases – MySQL, MSSQL, PostgreSQL/Redshift
- Monitoring Systems – Prometheus/OpenMetrics
- Web and File-Based Formats – JSON, XML, CSV
Each source is defined in the configuration file, specifying connection details and query parameters. The system executes queries, reads files, or fetches API responses as required.
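As an illustration of how such a configuration maps onto collection, here is a minimal Python sketch. The key names (`type`, `dsn`, `query`, `url`, `path`) are hypothetical placeholders, not DaFT's actual configuration schema; they simply show the kind of detail each source definition carries and how a dispatcher might route it to a handler.

```python
# Hypothetical source definitions -- key names are illustrative,
# not DaFT's real config schema.
sources = {
    "orders_db": {
        "type": "mysql",
        "dsn": "mysql://monitor@db.example.com/orders",
        "query": "SELECT status, COUNT(*) AS n FROM orders GROUP BY status",
    },
    "node_metrics": {
        "type": "prometheus",
        "url": "http://node-exporter:9100/metrics",
    },
    "inventory_feed": {
        "type": "csv",
        "path": "/var/feeds/inventory.csv",
    },
}

def collect(source: dict) -> str:
    """Dispatch on the source type, as a handler registry conceptually does."""
    handlers = {
        "mysql": "query database",
        "prometheus": "fetch metrics endpoint",
        "csv": "read file",
    }
    return handlers[source["type"]]
```

In a real deployment each handler would open the connection, run the query, or read the file; the point here is only that every source is self-describing, so the collector needs no source-specific code outside its handler.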
2. Data Normalisation
Once the data is collected, DaFT processes it to ensure consistency:
- Converts disparate data structures into a standardised schema.
- Filters and extracts relevant fields based on user-defined configurations.
- Coerces values to consistent numeric or string types, so exporters receive predictable data.
- Applies security measures to validate input and prevent attacks such as SQL injection and directory traversal.
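The two attack classes named above are typically mitigated with parameterised queries and path canonicalisation. The sketch below (using Python's standard `sqlite3` and `os.path` for illustration; DaFT's actual implementation may differ) shows both techniques:

```python
import os
import sqlite3

def safe_query(conn: sqlite3.Connection, status: str) -> int:
    # Parameterised placeholder: the driver escapes the value, so user
    # input can never alter the SQL structure (prevents injection).
    return conn.execute(
        "SELECT COUNT(*) FROM orders WHERE status = ?", (status,)
    ).fetchone()[0]

def safe_path(base_dir: str, requested: str) -> str:
    # Resolve the requested path and confirm it stays inside base_dir,
    # rejecting traversal attempts such as "../../etc/passwd".
    full = os.path.realpath(os.path.join(base_dir, requested))
    if not full.startswith(os.path.realpath(base_dir) + os.sep):
        raise ValueError(f"path escapes base directory: {requested}")
    return full
```

A malicious status value such as `"x'; DROP TABLE orders; --"` is treated as an ordinary string by the placeholder, and a file request containing `..` is rejected before any read occurs.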
This stage guarantees that all data, regardless of its origin, is transformed into a predictable and usable format.
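To make the idea of a standardised schema concrete, here is a minimal sketch, assuming a hypothetical target shape of `{"name", "value", "labels"}`. The field names and the `normalise` helper are illustrative, not DaFT's internal API; the point is that a database row, a CSV line, or a parsed metric can all be reduced to the same record shape:

```python
from typing import Any

def normalise(raw: dict[str, Any], value_field: str,
              label_fields: list[str]) -> dict:
    """Extract the configured fields and coerce the value to a float,
    so every source yields the same record shape."""
    return {
        "name": value_field,
        "value": float(raw[value_field]),          # consistent numeric type
        "labels": {k: str(raw[k])                  # consistent string labels
                   for k in label_fields if k in raw},
    }

# A SQL row and a CSV row with different shapes normalise identically:
sql_row = {"status": "shipped", "n": 42}
csv_row = {"n": "42", "status": "shipped", "warehouse": "east"}
```

Because both records come out as `name`/`value`/`labels`, every downstream exporter can be written once, against one schema.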
3. Data Export
After normalisation, DaFT exports data in various formats:
- JSON Output – Default format for Telegraf and custom integrations.
- Prometheus/OpenMetrics Format – Suitable for direct ingestion by monitoring tools.
- Custom Exporters – Extendable to support additional output formats.
Users can access the data via HTTP endpoints, ensuring compatibility with existing monitoring and analytics systems.
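The two built-in output formats can be sketched as follows. This assumes the hypothetical normalised record shape used above (`name`/`value`/`labels`); the rendering logic is a simplified illustration of the Prometheus text exposition format, not DaFT's actual exporter code:

```python
import json

def to_prometheus(records: list[dict]) -> str:
    """Render normalised records in the Prometheus text exposition format."""
    lines = []
    for r in records:
        labels = ",".join(f'{k}="{v}"' for k, v in sorted(r["labels"].items()))
        body = f'{r["name"]}{{{labels}}}' if labels else r["name"]
        lines.append(f'{body} {r["value"]}')
    return "\n".join(lines) + "\n"

def to_json(records: list[dict]) -> str:
    """Default JSON output, consumable by Telegraf or custom integrations."""
    return json.dumps(records)
```

Serving these strings from an HTTP endpoint (e.g. `/metrics` for Prometheus scrapes) is all that is needed for existing monitoring tools to ingest the data.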
Key Benefits
- Flexible Configuration – Easily define and manage data sources.
- Extensible Architecture – Add new handlers and exporters as needed.
- Lightweight & Efficient – Optimised for high-performance data processing.
- Secure Data Handling – Built-in protection against common security vulnerabilities.
By following this structured approach, DaFT ensures that data integration remains consistent, reliable, and scalable.