Send Google Cloud Audit Logs To Amazon EventBridge

Chris Parlette

Oct 12, 2022

In a very similar previous post, we showed you how to send Oracle Cloud Audit logs to AWS EventBridge by building the event flow with the TriggerMesh API. The sky really is the limit with the connector, filtering, splitting, and transformation capabilities that TriggerMesh provides. In this new post, we show you how to send Google Cloud Audit logs to AWS EventBridge and display them in a CloudWatch log stream.

In this post we will look at a specific example: extracting audit log messages from Google Cloud Audit Logs and sending them to AWS EventBridge, from where they can be routed onward, for example to AWS CloudWatch. Throughout this example, keep in mind that you can change any piece of this bridge to suit your specific needs. For example, the source could be Oracle Cloud Infrastructure (OCI) or Azure Activity Logs, and the target could be your on-premises data center or a SaaS application like SecureWorks Taegis.

We will keep this straightforward with two parts:

  • Getting the logs from Google Cloud
  • Sending them to AWS EventBridge

Transformation to the desired schema can be done with the TriggerMesh transformation engine.
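As a hedged illustration, the sketch below shows a TriggerMesh Transformation object that adds a field to incoming events. The object name, the field values, and the exact apiVersion are assumptions, so check them against your TriggerMesh installation:

```yaml
# Illustrative sketch only: name, field values, and apiVersion are assumptions
# and may differ in your TriggerMesh version.
apiVersion: flow.triggermesh.io/v1alpha1
kind: Transformation
metadata:
  name: audit-log-transform
spec:
  data:
  - operation: add
    paths:
    - key: ingest-path
      value: gcp-audit-to-eventbridge
```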

Get Google Cloud Audit Logs

Using Google Pub/Sub is the easiest way to connect TriggerMesh to Google Cloud and consume audit logs. Google Cloud Audit logs can be exported via the “Log Router” option in Cloud Logging to a Google Pub/Sub topic. From there, the logs can be consumed by a Pub/Sub client. To set this up, you’ll create a Pub/Sub topic in Google Cloud, create a log router to route the Google Cloud logs onto the topic, and then use TriggerMesh to pull the messages off the topic.

Create a Pub/Sub topic

Once you’re logged in to Google Cloud, search for “Pub/Sub” in the top search bar and choose the Pub/Sub option in the Products section.  From here, click on the button near the top of the page labeled Create Topic.

Create a Pub/Sub subscription

From the left menu in the Pub/Sub product, choose Subscriptions, then click the button at the top to Create Subscription. Enter a name and choose the topic you just created, then see if any of the wizard options need to change for your environment. As far as TriggerMesh is concerned, those options can all be left as the defaults.
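If you prefer the CLI, the topic and subscription steps above can be sketched with gcloud. The resource names here are illustrative:

```shell
# Illustrative names; replace project-name with your project ID.
gcloud pubsub topics create audit-logs --project=project-name
gcloud pubsub subscriptions create audit-logs-sub \
  --topic=audit-logs --project=project-name
```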

Log Router

Search for “Logging” in the search bar of the Google Cloud console to go to Logging.  From here, click on “Log Router” from the left menu, then choose to Create Sink from the top of the page.  The sink name should be something relevant to your configuration.  The sink service should be “Cloud Pub/Sub Topic”, which lets you choose the topic you created in an earlier step.

For the log filter, you can choose to have no filter if you want all logs sent to TriggerMesh.  Alternatively, you can filter the logs.  For example, if I just want IAM audit logs, I can use a filter of: protoPayload.serviceName="".
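The same sink can be sketched with gcloud. The sink name and filter below are illustrative; this particular filter matches all audit log entries rather than a single service:

```shell
# Route matching log entries to the Pub/Sub topic (names are illustrative).
gcloud logging sinks create audit-log-sink \
  pubsub.googleapis.com/projects/project-name/topics/audit-logs \
  --log-filter='protoPayload.@type="type.googleapis.com/google.cloud.audit.AuditLog"'
# The create command prints the sink's writer identity; grant it publish rights:
gcloud pubsub topics add-iam-policy-binding audit-logs \
  --member='serviceAccount:SINK_WRITER_IDENTITY' \
  --role='roles/pubsub.publisher'
```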

TriggerMesh Setup

With the Google Cloud log router configured, you can consume the audit logs using the TriggerMesh GoogleCloudPubSubSource event source. Below is a sample TriggerMesh object manifest that connects to your Google Cloud Pub/Sub topic. This manifest is deployed on a Kubernetes cluster that runs the TriggerMesh platform:

```yaml
# apiVersion values may differ depending on your TriggerMesh version.
apiVersion: sources.triggermesh.io/v1alpha1
kind: GoogleCloudPubSubSource
metadata:
  name: sample
spec:
  topic: projects/project-name/topics/topicname
  subscriptionID: pubsubsubscription_id
  serviceAccountKey:
    value: >-
      {
        "type": "service_account",
        "project_id": "project-name",
        "private_key_id": "abcde01234567899eadfa1160724ce3c32c8f1a6",
        "private_key": "-----BEGIN PRIVATE KEY-----\nPRIVATE_KEY_DETAILS_HERE\n-----END PRIVATE KEY-----\n",
        "client_email": "",
        "client_id": "100178283984303432828",
        "auth_uri": "",
        "token_uri": "",
        "auth_provider_x509_cert_url": "",
        "client_x509_cert_url": ""
      }
  sink:
    ref:
      apiVersion: eventing.knative.dev/v1
      kind: Broker
      name: default
```

The only parts of the above file you’ll need to change are the “topic”, “subscriptionID”, and “serviceAccountKey” fields. The serviceAccountKey can also be stored in a Kubernetes secret and referenced in the manifest.

  • topic - This is the name of the Pub/Sub topic that was created above
  • subscriptionID - This is the name of the subscription that was created above
  • serviceAccountKey - The best way to set this up is to go to the Google Cloud IAM section and create a service account with permissions on the Pub/Sub topic and subscription.  Google Cloud will then let you download the service account key as a JSON file for use here.
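Those IAM steps can also be sketched with gcloud. The service account name is illustrative:

```shell
# Create a service account, grant it subscriber access, and download a JSON key
# to paste into the serviceAccountKey field (names are illustrative).
gcloud iam service-accounts create triggermesh-pubsub
gcloud pubsub subscriptions add-iam-policy-binding audit-logs-sub \
  --member='serviceAccount:triggermesh-pubsub@project-name.iam.gserviceaccount.com' \
  --role='roles/pubsub.subscriber'
gcloud iam service-accounts keys create key.json \
  --iam-account=triggermesh-pubsub@project-name.iam.gserviceaccount.com
```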

Send Logs to AWS EventBridge

Now that the logs are flowing from Google Cloud into TriggerMesh, we can transform them with our transformation engine and send the events to AWS EventBridge using our AWSEventBridgeTarget object.  Here is the Kubernetes manifest:

```yaml
# apiVersion and credential field names may differ by TriggerMesh version.
apiVersion: targets.triggermesh.io/v1alpha1
kind: AWSEventBridgeTarget
metadata:
  name: eb
spec:
  arn: arn:aws:events:us-west-1:043466334429:event-bus/foobar
  awsApiKey:
    secretKeyRef:
      name: awscreds
      key: aws_access_key_id
  awsApiSecret:
    secretKeyRef:
      name: awscreds
      key: aws_secret_access_key
```

The only things you’ll need to change are the “arn” value and the “awscreds” secret containing “aws_access_key_id” and “aws_secret_access_key”. Note that authentication can also be done with a restricted-scope IAM role that only has access to EventBridge.

In the EventBridge console, set up a rule that sends events with the Pub/Sub source’s event type to a target pointing to a CloudWatch log group. In the screenshot below, notice the event pattern specifying the TriggerMesh event type of the Pub/Sub source.
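A hedged AWS CLI sketch of that rule and target follows. The rule name, log group, and event pattern are assumptions; check the actual detail-type of your events in the EventBridge console before using a pattern like this:

```shell
# Illustrative: match events whose detail-type starts with the Pub/Sub source's
# event type prefix, and route them to a CloudWatch Logs log group.
aws events put-rule --name triggermesh-pubsub --event-bus-name foobar \
  --event-pattern '{"detail-type": [{"prefix": "com.google.cloud.pubsub"}]}'
aws logs create-log-group --log-group-name /aws/events/triggermesh
aws events put-targets --rule triggermesh-pubsub --event-bus-name foobar \
  --targets 'Id=cw,Arn=arn:aws:logs:us-west-1:043466334429:log-group:/aws/events/triggermesh:*'
```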

With the EventBridge rule in place, Google Cloud Audit logs consumed by TriggerMesh will be sent to AWS CloudWatch. The screenshot below shows a Google Cloud Audit log message in the CloudWatch console.


Multi-cloud scenarios are becoming more common. A prime use case is ingesting cyber security events centrally, then transforming and dispatching them for analysis. In this blog we showed you the steps to connect Google Cloud Audit logs to AWS CloudWatch via AWS EventBridge. Similar bridges can easily be configured to send your security events to other endpoints like SecureWorks Taegis or Azure Sentinel.
