AWS S3

LOGIQ S3 Import App Extension (PULL)

LOGIQ.AI can ingest data directly from any S3-compatible object storage. Head over to App extensions to create an object importer app extension.
You can find App extensions under the Explore menu.
Once inside the App extensions menu, select AWS S3/Compatible Object store.
AWS S3/Compatible object store
Create the extension and provide the settings for accessing your object store bucket. The settings menu provides options for customizations specific to each vendor's object store implementation.
Configuring access to the bucket
And that is all you need. Your data from the object store bucket will show up as a flow in the LOGIQ.AI platform.
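The exact fields vary by vendor, but the bucket access settings you supply typically amount to something like the sketch below. All field names and values here are illustrative placeholders, not the extension's actual schema:

```json
{
  "endpoint": "https://s3.us-east-1.amazonaws.com",
  "bucket": "my-log-bucket",
  "prefix": "logs/",
  "region": "us-east-1",
  "access_key_id": "<access-key-id>",
  "secret_access_key": "<secret-access-key>"
}
```

For non-AWS S3-compatible stores (MinIO, Ceph, Wasabi, and the like), the endpoint points at the vendor's service URL instead of an AWS regional endpoint.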
Viewing the object store data import in "Explore" as a Flow

LOGIQ S3 exporter Lambda function (PUSH)

Creating a Lambda function

LOGIQ provides a CloudFormation template to create the LOGIQ S3 exporter Lambda function.
https://logiqcf.s3.amazonaws.com/s3-exporter/cf.yaml
You can also download the CloudFormation template from our client-integrations GitHub repository.
This CloudFormation stack creates a Lambda function and its necessary permissions. You must configure the following attributes.
APPNAME: Application name - a readable name for LOGIQ to partition logs.
CLUSTERID: Cluster ID - a readable name for LOGIQ to partition logs.
NAMESPACE: Namespace - a readable name for LOGIQ to partition logs.
LOGIQHOST: IP address or hostname of the LOGIQ server.
INGESTTOKEN: JWT token to securely ingest logs. Refer here to generate an ingest token.
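If you prefer the AWS CLI to the console, the stack parameters can be supplied with a parameters file like the one below. The stack name and all values shown are examples; substitute your own:

```json
[
  { "ParameterKey": "APPNAME",     "ParameterValue": "payments-api" },
  { "ParameterKey": "CLUSTERID",   "ParameterValue": "prod-us-east-1" },
  { "ParameterKey": "NAMESPACE",   "ParameterValue": "production" },
  { "ParameterKey": "LOGIQHOST",   "ParameterValue": "logiq.example.com" },
  { "ParameterKey": "INGESTTOKEN", "ParameterValue": "<your-JWT-ingest-token>" }
]
```

Save this as `params.json` and launch the stack with `aws cloudformation create-stack --stack-name logiq-s3-exporter --template-url https://logiqcf.s3.amazonaws.com/s3-exporter/cf.yaml --parameters file://params.json --capabilities CAPABILITY_IAM` (the IAM capability acknowledgment is required because the stack creates the Lambda function's permissions).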

Configuring S3 trigger

Once the CloudFormation stack is created, navigate to the AWS Lambda function (logiq-s3-logs-exporter) and add an S3 trigger.
On the Add trigger page, select S3. Next, select the Bucket you'd like to forward logs from and add a Prefix.
Once this configuration is complete, any new log files in the S3 bucket will be streamed to the LOGIQ cluster.
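The console's Add trigger flow roughly does two things: it grants S3 permission to invoke the function and writes a notification configuration on the bucket. If you script the trigger instead, the equivalent notification document looks like the sketch below; the function ARN, account ID, and prefix are placeholders for your own values:

```json
{
  "LambdaFunctionConfigurations": [
    {
      "Id": "logiq-s3-logs-exporter-trigger",
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:logiq-s3-logs-exporter",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "prefix", "Value": "logs/" }
          ]
        }
      }
    }
  ]
}
```

Apply it with `aws s3api put-bucket-notification-configuration --bucket <your-bucket> --notification-configuration file://notification.json`, after first allowing S3 to invoke the function via `aws lambda add-permission` with principal `s3.amazonaws.com`; the console handles both steps for you.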