Kafka
LOGIQ can run streaming analytics on any Kafka stream. If the Kafka brokers run inside the customer network, the data in Kafka can be pushed to LOGIQ. If the Kafka endpoint is accessible from the LOGIQ network, LOGIQ can pull the data directly from Kafka.

Pushing data to LOGIQ

Push data from Kafka
Logstash instances deployed in the customer network can read data from Kafka and push it to LOGIQ. Use the configuration below to read from Kafka.

Input Logstash configuration

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test_topic"]
  }
}

Output Logstash configuration

output {
  http {
    url => "https://logiq-dns-or-ip/v1/json_batch"
    headers => { "Authorization" => "Bearer <Auth token>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 300
    pool_max_per_route => 100
    socket_timeout => 60
  }
}
You can additionally control how the data is organized in LOGIQ by adding fields in a filter block.
filter {
  mutate {
    add_field => { "cluster_id" => "demo-http-test" }
    add_field => { "namespace" => "namespace_name" }
    add_field => { "app_name" => "application_name" }
  }
}
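
If you want to verify connectivity to LOGIQ without running Logstash, you can POST a small batch directly to the same ingest endpoint the http output above targets. The following is a minimal sketch using Python's requests library; the endpoint path, Bearer token, and JSON-array body mirror the Logstash output configuration, while the event field names (message, cluster_id, namespace, app_name) are assumptions based on the filter above and may differ from what your LOGIQ deployment expects.

# Minimal sketch: post a JSON batch to the LOGIQ ingest endpoint directly.
# The endpoint path, Bearer auth, and JSON-array body come from the Logstash
# output above; the event field names are assumptions based on the filter.
import requests

LOGIQ_URL = "https://logiq-dns-or-ip/v1/json_batch"  # replace with your LOGIQ DNS or IP
AUTH_TOKEN = "<Auth token>"                          # same token as the Logstash output

events = [
    {
        "message": "hello from a manual test",
        "cluster_id": "demo-http-test",
        "namespace": "namespace_name",
        "app_name": "application_name",
    }
]

response = requests.post(
    LOGIQ_URL,
    json=events,  # json_batch format sends a JSON array of events
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    timeout=60,
)
response.raise_for_status()
print("Batch accepted with status", response.status_code)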

Pulling Data from Kafka

LOGIQ can pull data using its Kafka input plugin. This method requires the Kafka endpoint to be reachable from the LOGIQ network.
The Kafka endpoint, topic name, namespace, and application name are needed to configure the Kafka input plugin. Namespace and application define how the data is partitioned in LOGIQ; see here for more information.
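
Before configuring the pull, you can produce a test message to the topic to confirm that the broker and topic are working and to give LOGIQ something to ingest once the input plugin is configured. The sketch below uses the kafka-python library, which is not a LOGIQ component; the broker address and topic name are placeholders matching the examples above.

# Minimal sketch: produce a test message to the topic LOGIQ will pull from.
# kafka-python is an assumption, not part of LOGIQ; the broker address and
# topic name are placeholders from the examples above.
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="kafka-dns-or-ip:9092",  # must be reachable from the LOGIQ network
)
producer.send("test_topic", b"connectivity test message")
producer.flush()  # block until the broker acknowledges the message
producer.close()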