

Since the start of the TriggerMesh adventure, we have believed that the future IT landscape would be made of organizations using multiple Clouds. This happens because of the need to manage different SaaS offerings, because of company acquisitions, or simply because of a best-of-breed approach where you choose the best Cloud service for the job.
What this means for Cyber security efforts is that organizations want to collect all of their Cloud security events, transform them into standardized schemas like the recently announced Open Cybersecurity Schema Framework (OCSF), and send them to a security solution for analytics and threat detection, such as Azure Sentinel, Oracle Cloud Guard or SecureWorks Taegis.
With TriggerMesh, you can ingest security events from your multiple cloud providers (e.g. GCP, AWS, Azure), transform the events into specific schemas, and send those events to any destination (called a Target in TriggerMesh lingo). The combination of collection, transformation and dispatching of events in TriggerMesh is called a Bridge.
In this post we will look at a specific example: extracting audit log messages from Oracle Cloud Infrastructure (OCI) and sending them to AWS EventBridge, from where they can be routed to AWS CloudWatch, for example. Throughout this example, keep in mind that you can change any piece of the bridge to suit your specific needs: the source could be Google Cloud Audit Logs or Azure Activity Logs, and the target could be your on-premises data center or a SaaS application like SecureWorks Taegis.
We will keep this straightforward with two parts: consuming the OCI Audit logs with the TriggerMesh Kafka source, and sending those events to AWS EventBridge. Transformation to the desired schema, such as OCSF, can be done along the way with the TriggerMesh transformation engine.
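As a taste of what that transformation step can look like, here is a minimal Transformation sketch that adds and renames a couple of fields. The OCI paths and OCSF-style attribute names below are purely illustrative, so adapt them to your own schema mapping:

```yaml
apiVersion: flow.triggermesh.io/v1alpha1
kind: Transformation
metadata:
  name: oci-audit-to-ocsf
spec:
  context:
    - operation: add
      paths:
        # Overwrite the CloudEvent type with a schema-specific one (illustrative value)
        - key: type
          value: com.example.ocsf.api_activity
  data:
    - operation: shift
      paths:
        # Move hypothetical OCI audit fields to OCSF-style attribute names
        - key: data.identity.principalName:actor.user.name
        - key: data.eventName:api.operation
    - operation: add
      paths:
        - key: class_name
          value: API Activity
```

A Transformation like this sits between the source and the target in the bridge.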
Oracle Cloud Audit logs can be exported via OCI's Service Connector Hub, and from there the logs can be consumed with a Kafka client, since OCI Streaming exposes a Kafka-compatible endpoint. Using Kafka is the easiest way to connect TriggerMesh to OCI and consume Audit logs. To set this up, you'll create a stream in OCI, create a service connector to put the audit logs onto the stream, then use TriggerMesh to pull the messages off of the stream.
Once you’re logged in to OCI, search for “Streaming” in the top search bar and choose the Streaming option in the Services section. If you don’t have a compartment chosen on the left side, choose one now. Then, click the “Create Stream” button. Most options can stay at their defaults, but you’ll want to name the stream and choose DefaultPool as the Stream Pool.
Next, head to the “Service Connectors” service by using the search bar at the top. Click on “Create Service Connector” to start the creation wizard. Here you configure the source as Logging, select the _Audit log group so that audit events are picked up, and configure the target as Streaming, pointing at the stream you created above.
With the Oracle Cloud Infrastructure service connector configured, you can consume the audit logs using the TriggerMesh Kafka event source. Below is a sample TriggerMesh API manifest that connects to your OCI stream over its Kafka-compatible endpoint. The manifest is deployed on a Kubernetes cluster that runs the TriggerMesh platform:
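Something along these lines should do the trick; the stream endpoint, topic, credentials and sink name below are placeholders, and exact field names can vary slightly between TriggerMesh releases, so check the KafkaSource reference for your version:

```yaml
apiVersion: sources.triggermesh.io/v1alpha1
kind: KafkaSource
metadata:
  name: oci-audit-logs
spec:
  # OCI Streaming exposes a Kafka-compatible endpoint of the form
  # cell-1.streaming.<region>.oci.oraclecloud.com:9092
  bootstrapServers:
    - cell-1.streaming.us-ashburn-1.oci.oraclecloud.com:9092
  topics:
    - oci-audit-stream            # the name of the stream created earlier
  groupID: triggermesh-oci-audit
  auth:
    saslEnable: true
    tlsEnable: true
    securityMechanism: PLAIN
    # OCI expects the username as <tenancy>/<user>/<stream pool OCID>
    username: mytenancy/jdoe@example.com/ocid1.streampool.oc1..example
    password:
      # An OCI auth token stored in a Kubernetes secret
      valueFromSecret:
        name: oci-kafka
        key: password
  sink:
    ref:
      # Send events straight to the EventBridge target defined in the next section;
      # a Transformation or a Broker could be slotted in here instead
      apiVersion: targets.triggermesh.io/v1alpha1
      kind: AWSEventBridgeTarget
      name: oci-audit-to-eventbridge
```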
The only parts of the above file you’ll need to change are the “topics”, “groupID”, “username” and “bootstrapServers” fields. The password is stored in a Kubernetes secret and referenced from the manifest.
Now that the logs are flowing from OCI into TriggerMesh, we can transform them with our transformation engine and send the events to AWS EventBridge using our AWSEventBridgeTarget API. Here is the API manifest:
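A sketch of that target, using a placeholder event bus ARN and a hypothetical Kubernetes secret holding the AWS credentials:

```yaml
apiVersion: targets.triggermesh.io/v1alpha1
kind: AWSEventBridgeTarget
metadata:
  name: oci-audit-to-eventbridge
spec:
  # ARN of the event bus that should receive the OCI audit events
  arn: arn:aws:events:us-east-1:123456789012:event-bus/default
  awsApiKey:
    secretKeyRef:
      name: aws-credentials
      key: AWS_ACCESS_KEY_ID
  awsApiSecret:
    secretKeyRef:
      name: aws-credentials
      key: AWS_SECRET_ACCESS_KEY
```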
The only things you’ll need to change are the “arn”, “awsApiKey”, and “awsApiSecret” values. Note that authentication can also be done with a restricted-scope IAM role that only has access to EventBridge.
In the EventBridge console, you set up a rule that sends events with the type `io.triggermesh.kafka.event` to a target pointing at a CloudWatch log group. In the snapshot below, notice the event pattern specifying the TriggerMesh event type of the Kafka source.
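If you prefer infrastructure-as-code over the console, the rule might look roughly like the following CloudFormation snippet. This assumes the CloudEvent type ends up in the entry’s detail-type field (verify this against a sample event) and uses a hypothetical log group name:

```yaml
Resources:
  OCIAuditEventsToCloudWatch:
    Type: AWS::Events::Rule
    Properties:
      EventBusName: default              # or the bus referenced in the target's ARN
      EventPattern:
        detail-type:
          - io.triggermesh.kafka.event   # the event type emitted by the TriggerMesh Kafka source
      Targets:
        - Id: cloudwatch-logs
          # Hypothetical log group receiving the OCI audit events
          Arn: arn:aws:logs:us-east-1:123456789012:log-group:/triggermesh/oci-audit
```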
With the EventBridge rule in place, OCI Audit logs consumed by TriggerMesh will be sent to AWS CloudWatch. The snapshot below shows an OCI Audit log message in the CloudWatch console.
Multi-cloud scenarios are becoming more common, and ingesting Cyber security events centrally, transforming them, and dispatching them for analysis is a prime use case. In this blog we showed you the steps to connect Oracle Cloud Audit logs to AWS CloudWatch via TriggerMesh and EventBridge. Similar Bridges can easily be configured to send your security events to other endpoints like SecureWorks Taegis or Azure Sentinel.