Anodot Agent Kafka Collector

Overview & Main Concepts
    What pipelines do
    Basic Flow
    Prerequisites

OVERVIEW & MAIN CONCEPTS

Use the Anodot Kafka to Metric 2.0 agent to stream Kafka messages to Anodot via Anodot’s REST API v2.0.

  • Source - where your data is pulled from. Available sources: Kafka.
  • Destination - where your data is sent. Available destinations: HTTP client (the Anodot REST API endpoint).
  • Pipeline - connects a source to a destination through data processing and transformation stages.

What Pipelines Do

  • Pull data from the source.
  • If the destination is an HTTP client, every record is transformed into a JSON object according to the Anodot Metric 2.0 protocol.
  • Values are converted to floating-point numbers.
  • Timestamps are converted to Unix timestamps in seconds.
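The per-record transformation described above can be sketched as follows. The output field names (`properties`, `timestamp`, `value`) follow the Anodot Metric 2.0 protocol; the helper function and the shape of the incoming Kafka record are illustrative assumptions, not the agent's actual code.

```python
import json
from datetime import datetime, timezone

def to_metric20(record):
    """Illustrative sketch: convert a parsed Kafka record into an
    Anodot Metric 2.0-style JSON object (not the agent's actual code)."""
    # Timestamps are normalized to Unix time in seconds.
    ts = record["timestamp"]
    if isinstance(ts, str):
        ts = datetime.fromisoformat(ts).replace(tzinfo=timezone.utc).timestamp()
    # Values are converted to floating-point numbers.
    value = float(record["value"])
    return {
        "properties": record.get("properties", {}),
        "timestamp": int(ts),
        "value": value,
    }

# Example: a record as it might arrive from a Kafka topic.
raw = {"timestamp": "2020-01-01T00:00:00", "value": "42",
       "properties": {"what": "cpu_usage", "host": "web-1"}}
print(json.dumps(to_metric20(raw)))
# → {"properties": {"what": "cpu_usage", "host": "web-1"}, "timestamp": 1577836800, "value": 42.0}
```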

Basic Flow

  1. Add an Anodot API token.
  2. Create a source.
  3. Create a pipeline.
  4. Run the pipeline.
  5. Monitor the pipeline status.

Prerequisites

  1. Docker & docker-compose.
  2. A Kafka streaming platform (the data source).
  3. An active Anodot account (the data destination).
  4. Persistent volumes: 250 KB per pipeline.
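Under these prerequisites, a minimal docker-compose sketch might look like the following. The image name, volume path, and environment variable are assumptions for illustration; see the GitHub repo for the actual compose file.

```yaml
# Illustrative only: the real docker-compose file lives in the GitHub repo.
version: '3'
services:
  agent:
    image: anodot/agent            # assumed image name
    volumes:
      # Persistent volume for pipeline state (~250 KB per pipeline)
      - agent-data:/usr/src/app/data
    environment:
      - KAFKA_BROKERS=kafka:9092   # assumed variable name
volumes:
  agent-data:
```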


Please follow the installation and configuration steps in our GitHub repo.
