
Kafka

LOGIQ.AI can run streaming analytics on one or more Kafka topics. Both push and pull modes are supported. If the Kafka servers run inside the customer network, the data in Kafka can be pushed to LOGIQ. If the Kafka endpoint is accessible from the LOGIQ network, LOGIQ can pull the data directly from Kafka.

PUSH Method: Pushing data using Logstash

Push data from Kafka to LOGIQ
Logstash instances deployed in the customer network can read data from Kafka and push it to LOGIQ. Use the configuration below to read from Kafka.

Input Logstash configuration

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test_topic"]
  }
}
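If your Kafka setup uses multiple brokers or topics, or the messages are already JSON, the input can be extended with standard Logstash kafka input options. The broker addresses, topic names, and group ID below are illustrative placeholders, not values required by LOGIQ.

input {
  kafka {
    # Comma-separated list of brokers (placeholder addresses)
    bootstrap_servers => "kafka-1:9092,kafka-2:9092"
    # One or more topics to consume
    topics => ["test_topic", "another_topic"]
    # Consumer group used for offset tracking (illustrative name)
    group_id => "logiq-logstash"
    # Decode messages that are already JSON
    codec => "json"
  }
}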

Output Logstash configuration

output {
  http {
    url => "https://logiq-dns-or-ip/v1/json_batch"
    headers => { "Authorization" => "Bearer <Auth token>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 300
    pool_max_per_route => 100
    socket_timeout => 60
  }
}
You can additionally control how the data is organized in LOGIQ by specifying the following fields in the Logstash filter section:

filter {
  mutate {
    add_field => { "cluster_id" => "demo-http-test" }
    add_field => { "namespace" => "namespace_name" }
    add_field => { "app_name" => "application_name" }
  }
}
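For reference, the input, filter, and output sections above can be combined into a single Logstash pipeline file. The broker address, topic, token, and field values are the same placeholders used above and should be replaced with your own.

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test_topic"]
  }
}

filter {
  mutate {
    add_field => { "cluster_id" => "demo-http-test" }
    add_field => { "namespace" => "namespace_name" }
    add_field => { "app_name" => "application_name" }
  }
}

output {
  http {
    url => "https://logiq-dns-or-ip/v1/json_batch"
    headers => { "Authorization" => "Bearer <Auth token>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 300
    pool_max_per_route => 100
    socket_timeout => 60
  }
}

Save this as a single .conf file and start Logstash with bin/logstash -f <pipeline-file>.conf.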

PULL Method: Pulling data from Kafka topics

LOGIQ can pull data using Kafka input plugins. This method requires the Kafka endpoint to be reachable from the LOGIQ network.
The Kafka endpoint, topic name, namespace, and application name are needed to configure the Kafka input plugin. The namespace and application define how the data is partitioned in LOGIQ; see here for more information.
Kafka Input App Extension
Multiple pull extensions can be created and run simultaneously.
You can create a Kafka pull application from the "Explore" -> "App Extensions" page.
Kafka/Confluent App Extension
The plugin configuration page provides options to select the Kafka topic, consumer group, and other settings.