vRealize Log Insight Cloud

Monitoring Google Cloud Platform (GCP) Logs with vRealize Log Insight Cloud

Many companies need to monitor services and applications across numerous platforms and devices, which can be difficult without common tools across clouds. If you are managing and monitoring a multi-cloud environment, you are not alone, and VMware is here to help. vRealize Log Insight Cloud creates structure from unstructured data and supports monitoring Google Cloud Platform (GCP) logs in addition to Azure and AWS. We know many companies move to multi-cloud solutions to meet demands across different teams, applications, and geographies, and we want that to be easier to manage.

Previously I wrote about monitoring and analyzing AWS and Azure logs using Log Insight. Now, let’s look at Google Cloud Platform. I can use Log Insight to analyze my log data and configure alerts to send notifications when specific events occur. You can also check out this short video (under 2 minutes) that runs through the configuration steps live.

Let’s get started.

Navigate to Log Sources within vRealize Log Insight Cloud

Log Sources has in-product documentation for configuring log collection from various services. I will walk through the setup for Cloud Storage. Navigate to Log Sources > GCP > Storage.

Follow the Setup Instructions

The setup instructions walk through configuring and exporting logs from Google Cloud.

Create a Topic

Log in to the Google Cloud Console. You can find topics with the “Search products and resources” bar. Click on Create Topic.

Provide a name for the Topic ID and uncheck Add a default subscription.
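
If you prefer the command line, the same topic can be created with the gcloud CLI. This is a sketch assuming the topic name storage_logs used later in this example; note that the CLI does not create a default subscription, so there is no checkbox to clear:

```shell
# Create the Pub/Sub topic that will receive the exported logs
# (topic name "storage_logs" matches the example in this post)
gcloud pubsub topics create storage_logs
```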

Create a Sink

Search for Logs Explorer or select it from the left pane. This is where we create our sink. You can filter the logs in the query builder or simply send all logs. Sinks can be created for a number of services at the project, organization, or folder level. For this example we are focused on storage logs, so enter the filter resource.type="gcs_bucket" and click on Run Query. (Note: If we want all the logs for a project, not just storage, we could create a sink without a resource.type filter.)
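
As a quick sanity check, you can also preview which entries the filter matches from the gcloud CLI before creating the sink:

```shell
# Show the 10 most recent log entries matching the storage filter
gcloud logging read 'resource.type="gcs_bucket"' --limit=10
```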

Click on Actions to the right of Query Results, then click on Create Sink.

Enter the sink name and description and click on Next.

Add the sink destination info. For Sink Service, select Cloud Pub/Sub Topic, then select the Pub/Sub topic you just created. For me, this is storage_logs. Click on Create Sink. (Note: when you click Create Sink, the filter you used in Logs Explorer is added to the sink automatically.)
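
The same sink can be created with the gcloud CLI. This is a sketch; the sink name storage-logs-sink and project ID MY_PROJECT are placeholders:

```shell
# Route Cloud Storage logs to the Pub/Sub topic created earlier
gcloud logging sinks create storage-logs-sink \
  pubsub.googleapis.com/projects/MY_PROJECT/topics/storage_logs \
  --log-filter='resource.type="gcs_bucket"'

# The sink writes as a service account (its "writer identity"), which needs
# permission to publish to the topic. The Console typically grants this for
# you, but from the CLI you grant it explicitly:
gcloud pubsub topics add-iam-policy-binding storage_logs \
  --member="$(gcloud logging sinks describe storage-logs-sink \
      --format='value(writerIdentity)')" \
  --role='roles/pubsub.publisher'
```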

The sink is created and you will see the Logs Router information.

Create a Subscription

Go to the topic you added, click the vertical ellipsis (⋮) to the right of the topic, and click Create subscription.

Enter a unique subscription ID and select Push as the delivery type. Enter the endpoint URL: https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream?token=<API Key>. You can leave everything else as is, scroll down, and click Create. (Note: API keys are created in Log Insight Cloud under Configuration > API Keys.)
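
Equivalently from the gcloud CLI; the subscription name storage-logs-sub is a placeholder, and you substitute your own API key for <API Key>:

```shell
# Push subscription that forwards each Pub/Sub message to the
# Log Insight Cloud ingestion endpoint
gcloud pubsub subscriptions create storage-logs-sub \
  --topic=storage_logs \
  --push-endpoint='https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream?token=<API Key>'
```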

Verify Log Flow

Now navigate back to Log Insight Cloud and click Log Sources > GCP > Cloud Storage. Click the Logs tab and you should see logs flowing into Log Insight Cloud. (Note: You can widen the time range on the right if the environment doesn’t have much activity.)

Enable the Content Pack

Navigate to Content Packs > Cloud Services and enable the content pack for each service you have configured by moving its toggle to the right; in this case, Cloud Storage.

Review Log Messages

Now I can filter/search logs using Explore Logs. I can use the log_type field to return only GCP logs. (I could also use starts with Azure for all Azure logs or starts with AWS for all AWS logs.)

Visualize the Data in Dashboards

I can view this data in a dashboard included with the content pack I just enabled. To view the log data associated with a dashboard widget, I can click the vertical ellipsis (⋮) in the widget’s upper right-hand corner. By viewing the underlying log message, I can find out who deleted the bucket.

View Log Message

Now we’re back in Explore Logs and can review the log message that was shown in the dashboard.

When I expand the message, I can review the activity details: the who, what, when, and where of the event. This shows that [email protected] (aka me) deleted the bucket fd12 on 2021-03-03 at 10:19:36 GMT-06:00. All of the fields have been extracted automatically by Log Insight.

Create an Alert

I have the query for a deleted bucket prepopulated from the content pack. (This was the query used in the dashboard widget.) This may be the type of activity I want to be alerted on. Click the alert (!) icon in the upper right-hand corner.

Note: By default, an alert is shown in the console under Triggered Alerts. I can also add email notifications or use a webhook for Slack, PagerDuty, or other third-party applications.

Enter the trigger conditions. For example, I may want a notification when the event happens in real time, when it happens more than once in 30 minutes, or when it happens more than 2 times in 5 minutes; you get the idea. I can also set the severity level of the alert, and I can use multiple severity levels depending on the trigger conditions I have selected.

Now I have all my cloud logs in one easy-to-search solution with out-of-the-box content to make my job easier. Log Insight Cloud offers a unified log analytics tool for ‘all the clouds!’ – on-prem, VMC on AWS, native AWS, Azure, and GCP. Managing a multi-cloud environment doesn’t need to give you a headache; let Log Insight do the heavy lifting for you. Visualize data in dashboards, enable out-of-the-box alerts, or create custom content. Visit our website for more information or sign up for a free 30-day trial. We also have free training available at VMware Pathfinder.