Azure Event Hubs
Azure Event Hubs is a big data streaming platform and event ingestion service capable of receiving and processing millions of events per second. You can integrate your LOGIQ instance with your event hubs to transform, analyze, and store the data they receive.
Setting up data ingestion from your Azure Event Hubs into LOGIQ involves the following steps.
  • Creating an Azure storage account
  • Creating an Event Hubs namespace and event hub
  • Configuring Logstash to forward logs to your LOGIQ instance

Creating an Azure storage account

To create an Azure storage account, do the following.
  1. Log into your Azure portal and select Storage accounts.
  2. On the Storage accounts page, click Create.
  3. Under Project details on the Basics tab, select the Subscription and Resource group for this new storage account.
  4. Under Instance details, set a unique Storage account name and an appropriate Region.
  5. Click Review + create.
  6. Once the storage account is created, navigate to the Access Keys section.
  7. Click Show keys.
  8. Note down the Key and Connection string under key1.

Creating an Event Hubs namespace and event hub

An Event Hubs namespace provides a unique scoping container within which you can create one or more event hubs. To create an Event Hubs namespace and an event hub within it, do the following.
  1. On your Azure portal, click Create a resource > All services > Event Hubs > Add.
  2. Under Project Details on the Basics tab of the Create Namespace page, select the Subscription and Resource group for this new Event Hubs namespace.
  3. Under Instance details, provide a Namespace name, select a Location, and set the Pricing tier to Standard.
  4. Click Review + create.
  5. Review the configuration, click Create, and wait for the namespace to be created.
  6. After the namespace is created, click Go to resource.
  7. Select Event Hubs in the left menu on the namespace page and then click + Event Hub.
  8. Provide a Name for the event hub.
  9. Set Partition Count and Message Retention to 1.
  10. Set Capture to On.
  11. Set Time window (minutes) to 5.
  12. Set Size window (MB) to 300.
  13. Under Capture Provider, select Azure Storage Account.
  14. Click Select Container and select the storage account you created in the previous step.
  15. Click Create.
  16. After the event hub is created, navigate to Shared Access Policies.
  17. Select your shared access policy and note down the Primary key and Connection string.

Configuring Logstash to forward logs

The final step is configuring Logstash to forward event logs from your Azure event hub to LOGIQ. Download and store the following flattenJSON.rb file. We will use this file while configuring Logstash.
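The contents of flattenJSON.rb are not reproduced here. Conceptually, a flattening script collapses nested JSON keys into dotted paths, which is why the mutate rules in the configuration address fields such as records.RoleLocation and records.category. The function below is a hypothetical plain-Ruby sketch of that idea, not the actual script:

```ruby
# Illustrative only: a minimal JSON-flattening helper in plain Ruby.
# The downloaded flattenJSON.rb is a Logstash ruby-filter script; this
# hypothetical function just shows the transformation it performs.
def flatten_hash(hash, prefix = "")
  hash.each_with_object({}) do |(key, value), flat|
    path = prefix.empty? ? key.to_s : "#{prefix}.#{key}"
    if value.is_a?(Hash)
      flat.merge!(flatten_hash(value, path))
    else
      flat[path] = value
    end
  end
end

event = { "records" => { "category" => "Administrative", "RoleLocation" => "East US" } }
flat = flatten_hash(event)
puts flat["records.category"]      # prints "Administrative"
puts flat["records.RoleLocation"]  # prints "East US"
```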
Copy the following Logstash configuration and edit the fields listed in the table below the code.
input {
  azure_event_hubs {
    event_hub_connections => ["<Event hub connection string>"]
    threads => 5
    decorate_events => true
    storage_connection => "<Storage connector configuration>"
    initial_position => "look_back"
    initial_position_look_back => 72000
  }
}

filter {
  json {
    source => "message"
    remove_field => "message"
  }
  split { field => "records" }
  date {
    match => ["[records][time]", "ISO8601"]
    target => "@timestamp"
  }
  ruby {
    path => "<Path_to_flattenJSON.rb>"
    script_params => { "field" => "records" }
  }
  mutate {
    split => { "records.RoleLocation" => " " }
    join => { "records.RoleLocation" => "-" }
    add_field => { "namespace" => "events" }
    add_field => { "proc_id" => "%{[records.category]}" }
    add_field => { "cluster_id" => "azure" }
    add_field => { "app_name" => "activity-logs" }
    add_field => { "message" => "%{[records.operationName]}" }
    remove_field => [ "records.message", "records.time" ]
  }
}

output {
  stdout { codec => rubydebug }
  http {
    url => "http://<LOGIQ_endpoint>/v1/json_batch"
    headers => { "Authorization" => "Bearer <LOGIQ_ingest_token>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 2000
    pool_max_per_route => 100
    socket_timeout => 300
  }
}
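The split/join pair in the mutate block above replaces spaces in records.RoleLocation with hyphens. In plain Ruby, the equivalent transformation is:

```ruby
# Replace spaces with hyphens, as the mutate split/join pair does
# for the records.RoleLocation field.
role_location = "East US 2"
hyphenated = role_location.split(" ").join("-")
puts hyphenated  # prints "East-US-2"
```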
Field: Description
<Event hub connection string>: The event hub connection string you noted down from your shared access policy.
<Storage connector configuration>: The connection string of the storage account you noted down under key1.
<Path_to_flattenJSON.rb>: Local file path where you saved the flattenJSON.rb file you downloaded.
<LOGIQ_endpoint>: Your LOGIQ instance endpoint.
<LOGIQ_ingest_token>: Your LOGIQ ingest token.
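To sanity-check the LOGIQ endpoint and token outside Logstash, you can build the same kind of batch POST by hand. The host, token, and event body below are illustrative placeholders; only the /v1/json_batch path and the Bearer authorization header come from the configuration above:

```ruby
require "json"
require "net/http"
require "uri"

# Placeholder endpoint and token -- substitute your own values.
uri = URI("http://logiq.example.com/v1/json_batch")

req = Net::HTTP::Post.new(uri)
req["Authorization"] = "Bearer YOUR_INGEST_TOKEN"
req["Content-Type"]  = "application/json"
# A JSON batch is an array of event objects; this single event is a sample.
req.body = JSON.generate([
  { "namespace" => "events", "app_name" => "activity-logs", "message" => "test event" }
])

# Uncomment to actually send (requires a reachable LOGIQ instance):
# res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }
# puts res.code

puts req.body
```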