GCP Cloud Logging
This article describes the process to export logs from GCP Cloud Logging to LOGIQ.
To set up log forwarding from GCP Cloud Logging to LOGIQ, you must:
  • Create a user-managed service account
  • Create a Cloud Pub/Sub topic
  • Create a log sink and subscribe it to the Pub/Sub topic
  • Create a VM for Logstash
The examples in this document use the gcloud command-line interface. Google Cloud APIs must be enabled, either through the APIs & Services page in the console or with gcloud from Cloud Shell (as shown after this list), before they can be used. To perform the steps in this tutorial, enable the following APIs:
  • Compute Engine
  • Pub/Sub
  • Identity and Access Management (IAM)
  • Cloud Logging
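For example, the following is a minimal sketch for enabling these APIs with gcloud services enable; it assumes the gcloud CLI is authenticated and that the active project is set to the target project (gcp-customer-1 in these examples).
# Enable the APIs used in this tutorial
gcloud services enable \
    compute.googleapis.com \
    pubsub.googleapis.com \
    iam.googleapis.com \
    logging.googleapis.com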

Create a service account

Activate and log in to Cloud Shell, then create a service account to attach to the VM.
The commands in this document use the project name gcp-customer-1. Replace it with a valid project name from the target account.
gcloud iam service-accounts create logstash --display-name="Logstash to Logiq"
Provide IAM permissions allowing the new service account to access Pub/Sub using the pubsub.subscriber role.
gcloud projects add-iam-policy-binding gcp-customer-1 \
--member serviceAccount:logstash@gcp-customer-1.iam.gserviceaccount.com \
--role roles/pubsub.subscriber

# Replace the project name 'gcp-customer-1' with a valid one

Create a Pub/Sub topic and subscription

Using the following command, create a Pub/Sub topic to which Cloud Logging will send events for Logstash to pick up.
gcloud pubsub topics create logiq-topic
Next, create a subscription by running the following command.
gcloud pubsub subscriptions create logstash-sub --topic=logiq-topic \
--topic-project=gcp-customer-1
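Optionally, you can sanity-check the new topic and subscription before creating the log sink. The following is an illustrative test using standard gcloud Pub/Sub commands; the message content is arbitrary.
# Publish a test message to the topic, then pull it from the subscription
gcloud pubsub topics publish logiq-topic --message="test message"
gcloud pubsub subscriptions pull logstash-sub --auto-ack --limit=1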

Create a log sink

Create a log sink to be used to export logs to the new Pub/Sub topic.
gcloud logging sinks create \
logstash-sink pubsub.googleapis.com/projects/gcp-customer-1/topics/logiq-topic

# Response
Created [https://logging.googleapis.com/v2/projects/gcp-customer-1/sinks/logstash-sink].
Please remember to grant `serviceAccount:<LOG_SINK_SERVICE_ACCOUNT>` Pub/Sub Publisher role to the topic.
More information about sinks can be found at /logging/docs/export/
The second part of the output is a reminder to grant the service account used by Cloud Logging permission to publish events to the Pub/Sub topic. Use the service account name printed in your own output (shown above as <LOG_SINK_SERVICE_ACCOUNT>) in the following command.
gcloud beta pubsub topics add-iam-policy-binding logiq-topic \
--member serviceAccount:<LOG_SINK_SERVICE_ACCOUNT> \
--role roles/pubsub.publisher
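To confirm the binding, you can inspect the IAM policy on the topic and check that the Cloud Logging service account is listed under roles/pubsub.publisher.
# Show the IAM policy bindings on the topic
gcloud pubsub topics get-iam-policy logiq-topic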

Create the Logstash VM

Create a VM to run Logstash, which pulls logs from the Pub/Sub subscription and forwards them to LOGIQ:
gcloud compute --project=gcp-customer-1 instances create logstash \
--zone=us-west1-a \
--machine-type=n1-standard-1 \
--subnet=default \
--service-account=logstash@gcp-customer-1.iam.gserviceaccount.com \
--scopes="https://www.googleapis.com/auth/cloud-platform" \
--image-family=ubuntu-1804-lts \
--image-project=ubuntu-os-cloud \
--boot-disk-size=10GB \
--boot-disk-type=pd-ssd \
--boot-disk-device-name=logstash
Once the VM is running, SSH into it and install the Java runtime that Logstash requires, as shown below.
gcloud compute ssh logstash --zone us-west1-a

sudo apt-get update
sudo apt-get -y upgrade
sudo apt -y install openjdk-8-jre-headless
echo "export JAVA_HOME=\"/usr/lib/jvm/java-8-openjdk-amd64\"" >> ~/.profile
sudo reboot
After a few moments, the VM will complete its reboot and can be accessed again via gcloud.
gcloud compute ssh logstash --zone us-west1-a
Install Logstash as shown below.
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-6.x.list
sudo apt-get update
sudo apt-get install logstash

# Install the Logstash google_pubsub plugin
cd /usr/share/logstash
sudo -u root sudo -u logstash bin/logstash-plugin install logstash-input-google_pubsub
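As a quick check, you can list the installed plugins and confirm that the google_pubsub input plugin appears.
# Confirm the plugin is installed (run from /usr/share/logstash)
bin/logstash-plugin list | grep google_pubsub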

Configure Logstash

Logstash comes with no default configuration. Create a new file /etc/logstash/conf.d/logstash.conf with these contents, modifying values as needed:
input {
  google_pubsub {
    project_id => "gcp-customer-1"
    topic => "logiq-topic"
    subscription => "logstash-sub"
    include_metadata => true
    codec => "json"
  }
}
filter {
  json {
    source => "message"
  }
  mutate {
    add_field => { "namespace" => "%{[resource][labels][project_id]}" }
    add_field => { "cluster_id" => "dev" }
    add_field => { "app_name" => "%{[resource][labels][service_name]}" }
    add_field => { "proc_id" => "%{[resource][labels][revision_name]}" }
  }
}
output {
  http {
    url => "http://<logiq-endpoint>.logiq.ai/v1/json_batch"
    headers => { "Authorization" => "Bearer <SECURE_INGEST_TOKEN>" }
    http_method => "post"
    format => "json_batch"
    content_type => "json_batch"
    pool_max => 2000
    pool_max_per_route => 100
    socket_timeout => 300
  }
}
You can obtain an ingest token from the LOGIQ UI as described here. You can customize the namespace and cluster_id in the Logstash configuration based on your needs.
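With the configuration saved, start Logstash and enable it to start at boot. This is a minimal sketch assuming the apt-packaged Logstash managed by systemd; the first command is an optional configuration test before the service is started.
# Optionally validate the pipeline configuration
sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t

# Start Logstash and enable it at boot
sudo systemctl start logstash
sudo systemctl enable logstash

# Follow the service logs to confirm events are flowing
sudo journalctl -u logstash -f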
Your GCP Cloud Logging logs will now be forwarded to your LOGIQ instance.