The Google Cloud Platform (GCP) Logs integration automatically forwards logs from GCP to LogicMonitor's LM Logs, enabling real-time collection and analysis without manual setup or custom pipelines.

To ingest Google Cloud Platform (GCP) logs, do the following:

  • Install the LogicMonitor integration—Create a virtual machine (VM) and install the LogicMonitor ingestion integration using the Google Cloud Shell Terminal.
    For more information, see Use the Cloud Shell Terminal from Google.
  • Configure the log forwarder—Configure the VM to forward GCP logs to LogicMonitor.
  • Export logs to Pub/Sub—Create a sink in GCP and use Pub/Sub to filter and export logs for ingestion by LogicMonitor.
    For more information, see Route logs to supported destinations from Google.

Metadata for Google Cloud Platform Technologies for Log Ingestion

The following metadata is added by default to logs along with the raw message string:

Metadata key       Description
severity           Severity level for the event. The values are “Informational”, “Warning”, “Error”, and “Critical”.
logName            Resource name of the log to which this event belongs.
category           Log category for the event. Typical log categories include “Audit”, “Operational”, “Execution”, and “Request”.
_type              Service, application, device, or VM responsible for creating the event.
labels             Labels for the event.
resource.labels    Labels associated with the resource to which the event belongs.
httpRequest        The HTTP request associated with the log entry, if any.
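For illustration, the following sketch shows what a single record might look like after the default metadata keys are attached to the raw message string. All field values here are invented for the example and are not output from any real project:

```shell
# Illustrative only: a hypothetical log record after the default metadata
# keys described above have been attached. Every value is a placeholder.
sample_record='{
  "message": "GET /healthz 200",
  "severity": "Informational",
  "logName": "projects/my-project/logs/requests",
  "category": "Request",
  "_type": "gce_instance",
  "labels": {"env": "prod"},
  "resource.labels": {"project_id": "my-project"},
  "httpRequest": {"requestMethod": "GET", "status": 200}
}'
echo "$sample_record"
```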

If you need additional metadata, you can use the following configuration for the fluent-plugin-lm-logs-gcp plugin:

  • Use the following in the fluentd/td-agent config file:
<filter pubsub.publish>
    @type gcplm
    metadata_keys severity, logName, labels, resource.type, resource.labels, httpRequest, trace, spanId, custom_key1, custom_key2
    use_default_severity true
</filter>
  • To add static metadata, use the record transformer. Add the following to fluentd.conf:
<filter pubsub.publish>
  @type record_transformer
  <record>
    some_key some_value
    tag ${tag} # can add dynamic data as well
  </record>
</filter>

For more information about the configurations for the plugin, see lm-logs-fluentd-gcp-filter on GitHub.

Requirements for Configuring Google Cloud Platform Log Ingestion

To configure GCP log ingestion, you must have the following:

  • A LogicMonitor API token to authenticate all requests to the log ingestion API
    For more information, see Adding an API Token.
  • Access to the GCP account created in your LogicMonitor portal from which logs are forwarded
  • A GCP account with the following IAM roles to create and configure the required GCP resources:
    • roles/compute.admin—Create virtual machines
    • roles/logging.configWriter—Create log sinks
    • roles/pubsub.editor—Create Pub/Sub topics and subscriptions
      For more information, see Cloud Logging roles and permissions from Google.
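Assuming you use the standard gcloud CLI, the roles above could be granted with commands like the following. PROJECT_ID and USER_EMAIL are placeholders for this sketch; substitute your own values:

```shell
# Grant the IAM roles required for the integration to a user account.
# PROJECT_ID and USER_EMAIL are placeholder values.
PROJECT_ID="my-project"
USER_EMAIL="admin@example.com"

for role in roles/compute.admin roles/logging.configWriter roles/pubsub.editor; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="user:${USER_EMAIL}" \
    --role="$role"
done
```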

Installing the LogicMonitor Integration

  1. In your Google Cloud account, select Activate Cloud Shell.
    This displays the Cloud Shell Terminal.
  2. In Terminal, run the following command to select the project:
gcloud config set project [PROJECT_ID]
  3. Run the following command to install the integration:
source <(curl -s https://raw.githubusercontent.com/logicmonitor/lm-logs-gcp/master/script/gcp.sh) && deploy_lm-logs

Installing the integration creates the following resources:

  • A Virtual Machine (VM) named lm-logs-forwarder
  • A Pub/Sub topic named export-logs-to-logicmonitor and a pull subscription

Note: You are prompted to confirm the VM deployment region, which must already be configured in your project. Ensure the deployed VM has the required permissions to access Pub/Sub resources.
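One way to confirm these resources were created, assuming the gcloud CLI is configured for the same project:

```shell
# Check that the forwarder VM exists.
gcloud compute instances list --filter="name=lm-logs-forwarder"

# Check that the Pub/Sub topic and its pull subscription exist.
gcloud pubsub topics describe export-logs-to-logicmonitor
gcloud pubsub subscriptions list --filter="topic:export-logs-to-logicmonitor"
```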

Configuring the Log Forwarder

  1. Navigate to Compute Engine > VM Instances and select lm-logs-forwarder.
  2. Under Remote access, select SSH to open a terminal session on the VM.
  3. In the SSH session on the VM (lm-logs-forwarder), run the following command:
export GCP_PROJECT_ID="GCP_PROJECT_ID"
export LM_COMPANY_NAME="LM_COMPANY_NAME"
export LM_COMPANY_DOMAIN="${LM_COMPANY_DOMAIN}"
export LM_ACCESS_ID="LM_ACCESS_ID"
export LM_ACCESS_KEY="LM_ACCESS_KEY"
source <(curl -s https://raw.githubusercontent.com/logicmonitor/lm-logs-gcp/master/script/vm.sh)

Note: If LM_COMPANY_DOMAIN is not provided or is set to an empty string, the default is “logicmonitor.com”. The supported domains for this variable are as follows:

  • lmgov.us
  • qa-lmgov.us
  • logicmonitor.com
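After running the script, you can sanity-check the forwarder. This sketch assumes the integration runs Fluentd as the td-agent service (the config file referenced earlier in this article) with the default td-agent log path; both details may differ by installation:

```shell
# Verify the td-agent (Fluentd) service is running on lm-logs-forwarder.
# Assumption: the forwarder is installed as the td-agent service.
sudo systemctl status td-agent --no-pager

# Tail the agent log for ingestion errors.
# Assumption: the default td-agent log path is used.
sudo tail -n 50 /var/log/td-agent/td-agent.log
```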

Exporting Logs from Logging to Pub/Sub

  1. On the Logging page, filter the logs you want to export.

Recommendation: Use filters to remove logs that contain sensitive information before sending to LogicMonitor.

  2. Select Actions > Create sink.
  3. In Sink details, provide a name.
  4. In Sink destination, choose Cloud Pub/Sub as the destination and select export-logs-to-logicmonitor.

Note: export-logs-to-logicmonitor is the Pub/Sub topic created during installation. The topic can be in a different project.

  5. Select Create sink.
    You can then view the logs streaming in on the LM Logs page.
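The same sink can also be created from the CLI. This is a sketch using the gcloud logging commands; PROJECT_ID, the sink name, the severity filter, and SINK_SERVICE_ACCOUNT are placeholders:

```shell
# Create a sink that routes filtered logs to the integration's Pub/Sub topic.
# PROJECT_ID, the sink name, and the filter are placeholder values.
gcloud logging sinks create lm-logs-sink \
  pubsub.googleapis.com/projects/PROJECT_ID/topics/export-logs-to-logicmonitor \
  --log-filter='severity>=WARNING'

# The create command prints the sink's writer service account; grant it
# permission to publish so exported entries are delivered to the topic.
gcloud pubsub topics add-iam-policy-binding export-logs-to-logicmonitor \
  --member='serviceAccount:SINK_SERVICE_ACCOUNT' \
  --role='roles/pubsub.publisher'
```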

Logs are pushed to the LM Logs API using Fluentd with the provided lm-logs-fluentd-gcp-filter plugin. This plugin subscribes to the Pub/Sub topic, retrieves incoming logs, and adds metadata before sending an HTTP POST request to the LM Logs API to complete the ingestion process.
