Send Oracle Cloud Audit Logs to Amazon EventBridge

Chris Parlette

Sep 30, 2022

Since the start of the TriggerMesh adventure, we have believed that the future IT landscape would be made up of multiple clouds used by organizations, whether because of the need to manage different SaaS offerings, because of company acquisitions, or simply because of a best-of-breed approach where you choose the best cloud service for each job.

What this means for cybersecurity efforts is that organizations want to gather all of their cloud security events, transform them into standardized schemas such as the recently announced Open Cybersecurity Schema Framework (OCSF), and send them to their security solution for analytics and threat detection, such as Azure Sentinel, Oracle Cloud Guard, or SecureWorks Taegis.

With TriggerMesh, you can ingest security events from your multiple cloud providers (e.g., GCP, AWS, Azure), transform the events into specific schemas, and send them to any destination (called a Target in TriggerMesh lingo). The collection, transformation, and dispatching of events in TriggerMesh is called a Bridge.

In this post we will look at a specific example: extracting audit log messages from Oracle Cloud Infrastructure (OCI) and sending them to Amazon EventBridge, from where they can be forwarded to AWS CloudWatch, for example. Throughout this example, keep in mind that you can change any piece of this bridge to suit your specific needs. For example, the source could be Google Cloud Audit Logs or Azure Activity Logs, and the target could be your on-premises data center or a SaaS application like SecureWorks Taegis.

We will keep this straightforward with two parts:

  • Getting the logs from Oracle Cloud
  • Sending them to Amazon EventBridge

Transformation to the desired schema can be done with the TriggerMesh transformation engine.
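
For illustration, a TriggerMesh Transformation object can add, rename, or drop fields on events in flight. The manifest below is a minimal sketch, not an actual OCSF mapping; the object name and the fields it adds are placeholders:

apiVersion: flow.triggermesh.io/v1alpha1
kind: Transformation
metadata:
  name: ocsf-transform            # placeholder name
spec:
  context:
  - operation: add
    paths:
    # Re-type the event so downstream rules can match on it (illustrative value).
    - key: type
      value: io.triggermesh.transformation.ocsf
  data:
  - operation: add
    paths:
    # Example of stamping an OCSF-style attribute onto the payload.
    - key: category_name
      value: Audit Activity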

Get Oracle Cloud Infrastructure (OCI) Logs

Oracle Cloud Audit logs can be exported via Oracle's Service Connector Hub and consumed from there with a Kafka client, which is the easiest way to connect TriggerMesh to OCI. To set this up, you'll create a Kafka-compatible stream in OCI, create a service connector that puts the OCI logs onto the stream, and then use TriggerMesh to pull the messages off the stream.

Stream Creation

Once you’re logged in to OCI, search for “Streaming” in the top search bar and choose the Streaming option in the Services section. If you don’t have a compartment chosen on the left side, choose one now. Then click the “Create Stream” button. Most options can stay at their defaults, but you’ll want to name the stream and choose DefaultPool as the Stream Pool.
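
If you prefer the CLI to the console, a stream can also be created with the OCI CLI. The sketch below uses placeholder values for the stream name and compartment OCID:

# Create a 1-partition stream in the given compartment (values are placeholders).
oci streaming admin stream create \
  --name auditlogs \
  --partitions 1 \
  --compartment-id ocid1.compartment.oc1..example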

Service Connector Creation

Next, head to the “Service Connectors” service using the search bar at the top. Click on “Create Service Connector” to start the creation wizard. Here are the items that need to be set up (a CLI alternative follows the list):

  • Connector Name - This can be anything relevant to your project.
  • Description - Optional
  • Resource Compartment - This should be auto-filled based on your selected compartment.
  • Source - Choose “Logging”
  • Target - Choose “Streaming”
  • Source configuration - This depends on your needs. For testing, we set the Log Group to “_Audit” and added a Log Filter Task with a Service/Attribute Name of “Identity SignOn” and an Event Type/Attribute Value of “Interactive Login; Federated Interactive Login”.
  • Target configuration - Choose the stream you created in the step before this.
  • If there is a warning about a policy, choose to create a default policy.
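
For reference, the same connector can be sketched with the OCI CLI. The JSON payloads below follow the Service Connector Hub “logging” source and “streaming” target kinds, but treat their exact shape and all OCIDs as placeholders to verify against the OCI documentation:

# Create a service connector from the _Audit log group to the stream (values are placeholders).
oci sch service-connector create \
  --compartment-id ocid1.compartment.oc1..example \
  --display-name audit-to-stream \
  --source '{"kind": "logging", "logSources": [{"compartmentId": "ocid1.compartment.oc1..example", "logGroupId": "_Audit"}]}' \
  --target '{"kind": "streaming", "streamId": "ocid1.stream.oc1.phx.example"}'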

TriggerMesh Setup

With the Oracle Cloud Infrastructure service connector configured, you can consume the audit logs using the TriggerMesh Kafka event source. Below is a sample TriggerMesh API manifest that connects to the Kafka endpoint of the stream your service connector writes to. The manifest is deployed on a Kubernetes cluster that runs the TriggerMesh platform:

apiVersion: sources.triggermesh.io/v1alpha1
kind: KafkaSource
metadata:
  name: oci-streaming-source
spec:
  # Kafka consumer group ID; any unique name works.
  groupID: ocikafka-group
  # Kafka endpoint of your OCI stream pool (see "Kafka Connection Settings").
  bootstrapServers:
  - "cell-1.streaming.us-phoenix-1.oci.oraclecloud.com:9092"
  # The topic name is the name of the OCI stream created earlier.
  topics:
  - auditlogs
  auth:
    saslEnable: true
    tlsEnable: true
    securityMechanism: PLAIN
    # Username format: <Tenancy>/<email>/<stream pool OCID>
    username: "testoci/test@triggermesh.com/ocid1.streampool.oc1.phx.amaaaaaayqnaa"
    # The password is an OCI auth token stored in a Kubernetes secret.
    password:
      valueFromSecret:
        name: oci-kafka
        key: pass
  # Deliver consumed events to the default Knative broker.
  sink:
    ref:
      apiVersion: eventing.knative.dev/v1
      kind: Broker
      name: default


The only parts of the above file you’ll need to change are the “topics”, “groupID”, “username” and “bootstrapServers” fields. The password is stored in a Kubernetes secret and referenced in the manifest.

  • topics - The name of the stream that was created above.
  • password - The best way to set this up is to go to your OCI profile and create an Auth Token; a password is generated for you. Store it in a Kubernetes secret, as shown after this list.
  • bootstrapServers - Go to Streaming -> Stream Pools -> DefaultPool, then click on “Kafka Connection Settings” in the left menu. This shows the bootstrap server you should use.
  • username - On this same “Kafka Connection Settings” page, you’ll find the full username (in the format <Tenancy>/<email>/<OCID>).
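
A secret matching the names referenced in the manifest (oci-kafka with key pass) can be created as follows; the token value is a placeholder:

# Store the OCI auth token in the secret the KafkaSource manifest references.
kubectl create secret generic oci-kafka \
  --from-literal=pass='<your OCI auth token>'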

Send Logs to AWS EventBridge

Now that the logs are flowing from OCI into TriggerMesh, we can transform them with our transformation engine and send the events to Amazon EventBridge using our AWSEventBridgeTarget API. Here is the API manifest:

apiVersion: targets.triggermesh.io/v1alpha1
kind: AWSEventBridgeTarget
metadata:
  name: eb
spec:
  # ARN of the EventBridge event bus that should receive the events.
  arn: arn:aws:events:us-west-1:043466334429:event-bus/foobar
  # AWS credentials, read from a Kubernetes secret.
  awsApiKey:
    secretKeyRef:
      name: awscreds
      key: aws_access_key_id
  awsApiSecret:
    secretKeyRef:
      name: awscreds
      key: aws_secret_access_key

The only things you’ll need to change are the “arn”, “awsApiKey”, and “awsApiSecret” values. Note that authentication can also be done with a restricted-scope IAM role that has access only to EventBridge.
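
The manifest reads the credentials from a Kubernetes secret named awscreds; one way to create it, with key names matching the secretKeyRef entries above:

# Create the secret holding the AWS credentials used by the EventBridge target.
kubectl create secret generic awscreds \
  --from-literal=aws_access_key_id='<AWS access key ID>' \
  --from-literal=aws_secret_access_key='<AWS secret access key>'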

In the EventBridge console you set up a rule that sends events with the type `io.triggermesh.kafka.event` to a target pointing at a CloudWatch log stream. In the snapshot below, notice the event pattern specifying the TriggerMesh event type of the Kafka source.
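
Assuming the target maps the CloudEvent type to the EventBridge detail-type field (as the screenshot suggests), the rule’s event pattern would look roughly like this:

{
  "detail-type": ["io.triggermesh.kafka.event"]
}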

With the EventBridge rule in place, OCI Audit logs consumed by TriggerMesh will be sent to AWS CloudWatch. The snapshot below shows an OCI Audit log message in the CloudWatch console.
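
One quick way to verify delivery is to tail the log group from the AWS CLI; the log group name below is a placeholder for whatever your rule targets:

# Follow new events arriving in the CloudWatch log group (name is illustrative).
aws logs tail /aws/events/oci-audit-logs --follow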

Conclusion

Multi-cloud scenarios are becoming more common, and the need to ingest cybersecurity events centrally before transforming and dispatching them for analysis is a prime use case. In this blog we showed you the steps to connect Oracle Cloud Audit logs to AWS CloudWatch via Amazon EventBridge. Similar Bridges can easily be configured to send your security events to other endpoints like SecureWorks Taegis or Azure Sentinel.
