Automated Wazuh Log Archival on GCP Storage Buckets
Introduction:
Imagine a scenario where you're striving to optimize the efficiency and cost-effectiveness of your Wazuh SIEM (Security Information and Event Management) setup. The challenge at hand? Automating the archival process of cold logs and securely storing them in a cloud bucket. Why? Because cloud buckets offer a far more attractive solution for long-term data storage compared to conventional methods.
As I delved into this endeavor, I quickly realized that a significant portion of storage on the SIEM was being consumed by indexes and archived logs and alerts. To tackle this, I set out to automate the process. However, it wasn't as straightforward as I initially thought. Uploading these files via a script necessitated the use of a token file from a service account, adding a layer of complexity.
That's when it hit me – why not harness the power of Google Cloud Platform's (GCP) Identity and Access Management (IAM) and role customization? By doing so, I could adhere to the principle of least privilege while tightly controlling access rights. The goal was clear: even if a service account key were compromised, the damage an attacker could do with it would be minimal.
Join me on this journey as we explore how GCP's IAM and role creation can be your secret weapon in achieving robust security and cost efficiency in your SIEM setup.
Why Securing Buckets Is Important:
Securing Google Cloud Platform (GCP) buckets is essential for safeguarding sensitive data and maintaining the integrity of your cloud-based resources. GCP buckets, often used to store files and objects, require robust security measures for several critical reasons.
Firstly, data breaches can have severe consequences, including financial losses and reputational damage. GCP buckets may contain confidential customer information, proprietary business data, or intellectual property. Securing these buckets with strong access controls and encryption is vital to prevent unauthorized access or data leaks.
Secondly, compliance with data protection regulations is a top priority. Many industries are subject to stringent data privacy laws, such as GDPR or HIPAA. Ensuring GCP bucket security helps you meet these requirements by protecting personal and sensitive information from exposure or unauthorized processing.
Additionally, securing GCP buckets helps defend against cyber threats. Malicious actors are continually seeking vulnerabilities to exploit. Properly configured buckets with restricted access and continuous monitoring can thwart unauthorized access attempts, reducing the risk of data breaches and cyberattacks.
Configurations:
- Create a Custom role.
- Create a Service account with a custom IAM condition.
- Set up a Secure GCP Bucket for Archival.
- Download the JSON key of the Service Account.
- Set up the Script on the Wazuh Server.
1) Create a Custom role
This step plays an important role from the start, limiting the actions the service account can perform, since we are going to embed the service account's key file in a script.
Step 1: Click on "Create role"
Step 2: Set the title of role:
Step 3: Add the following two permissions and click on "Done":
storage.buckets.get
storage.objects.create
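If you prefer the CLI over the console, the same role can be created with gcloud. This is a sketch; the role ID and project ID below are placeholders you would replace with your own:

```shell
# Create a minimal custom role that can only look up a bucket
# and create (upload) new objects -- nothing else.
gcloud iam roles create wazuhArchiveUploader \
  --project=my-project-id \
  --title="Wazuh Archive Uploader" \
  --permissions=storage.buckets.get,storage.objects.create \
  --stage=GA
```

Note that the role deliberately omits storage.objects.get and storage.objects.delete, so a stolen key cannot read back or destroy what has already been archived.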
2) Create a Service account with a custom IAM condition:
Step 1: Click on "Create Service Account":
Step 2: Set the Service account name:
Step 3: Add the custom role created earlier and click on "Add IAM Condition"
Step 4: Set the name of the condition and then in the "Condition Editor" add the following line:
Step 5: Click on Done:
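The exact condition expression from the screenshot isn't reproduced here. A common pattern is a CEL expression that scopes the grant to a single bucket; assuming a hypothetical bucket named wazuh-cold-archive, it might look like:

```
resource.name.startsWith("projects/_/buckets/wazuh-cold-archive")
```

With this condition attached, the role's permissions apply only to that one bucket, further shrinking the blast radius of a leaked key.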
3) Set up a Secure GCP Bucket for Archival
Step 1: Go to Cloud Storage -> Buckets -> Click on "Create"
Step 2: Set the name of the Bucket:
Step 3: Set the Location Type of the Bucket:
Step 4: Set the Storage class as "Archive"
Step 5: Set the Access to Objects:
Step 6: Set the Retention Policy:
Step 7: Click on "Create" and, in the pop-up, set "Public access prevention" to "Enforced":
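The same bucket can be sketched as a single gcloud command. This is an illustrative example, not the exact configuration from the screenshots; the bucket name, location, and retention period are placeholders to adjust to your own policy:

```shell
# Archive-class bucket with uniform access, no public exposure,
# and a retention policy so uploaded logs cannot be deleted early.
gcloud storage buckets create gs://wazuh-cold-archive \
  --location=us-central1 \
  --default-storage-class=ARCHIVE \
  --uniform-bucket-level-access \
  --public-access-prevention \
  --retention-period=365d
```

The Archive storage class is the cheapest to store but the most expensive to read, which suits cold logs that are rarely, if ever, retrieved.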
4) Download the JSON key of the Service Account:
Step 1: Click on the "Keys" tab under the Service account and click on "Add Key" -> "Create new key":
Step 2: Select "JSON" as the key type and click on "Create":
Note: The JSON key file will now be downloaded to your system.
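Once the key is on the Wazuh server, it can be used either by the gcloud CLI or by client libraries via the standard credentials environment variable. The path below is illustrative; store the key wherever your script expects it:

```shell
# Authenticate the CLI as the service account (for gcloud/gsutil uploads)
gcloud auth activate-service-account --key-file=/etc/wazuh-backup/sa-key.json

# Or point Google client libraries at the key instead
export GOOGLE_APPLICATION_CREDENTIALS=/etc/wazuh-backup/sa-key.json

# Keep the key readable by its owner only -- it is effectively a password
chmod 600 /etc/wazuh-backup/sa-key.json
```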
5) Set up the Script on the Wazuh Server:
Set up the script by following the README in the GitHub repo:
https://github.com/moizl599/wazuh_log_gcp_backup
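The actual upload logic lives in the repo above. As a rough, hypothetical sketch of the file-selection step such a script typically needs (the function name, age threshold, and file suffixes are my own illustrations, not taken from the repo):

```python
import time
from pathlib import Path

def find_cold_logs(archive_dir, max_age_days=30, suffixes=(".log.gz", ".json.gz")):
    """Return Wazuh archive files older than max_age_days, oldest first.

    Wazuh stores compressed archives under /var/ossec/logs/archives/;
    anything past the age threshold is a candidate for upload to the bucket.
    """
    cutoff = time.time() - max_age_days * 86400
    cold = [
        p
        for p in Path(archive_dir).rglob("*")
        if p.is_file() and p.name.endswith(suffixes) and p.stat().st_mtime < cutoff
    ]
    # Oldest first, so an interrupted run still archives the coldest data.
    return sorted(cold, key=lambda p: p.stat().st_mtime)
```

Each returned file would then be uploaded (the service account created earlier only needs storage.objects.create for that) and removed locally to reclaim disk space.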
Final Thoughts:
In wrapping up our exploration of this setup, it's abundantly clear that our primary objective has been fortifying the security of our bucket, especially when it comes to potential vulnerabilities over public networks. Cloud buckets, while powerful tools, can be susceptible to misconfiguration mishaps that could compromise data integrity. Additionally, our emphasis has been firmly placed on minimizing the exposure of our service account, a critical element in this equation.
Why all this emphasis on hardening the bucket, you ask? The answer lies in the heart of our script, which relies on the service account's key file for authentication. By ensuring the robust security of our bucket and keeping our service account well-guarded, we've not only minimized risks but also set the stage for a more resilient and dependable system.
In essence, our journey to secure, automate, and optimize the archival process has been a triumph in bolstering our defenses while embracing the versatility of cloud technology. As we conclude this chapter, let's remember that in the ever-evolving landscape of cloud computing, security remains a constant priority, and our journey is far from over.