
Monitor Azure Logs with vRealize Log Insight Cloud

Microsoft Azure is a cloud computing platform for building, testing, deploying, and managing applications and services hosted by Microsoft. Companies can host Active Directory, mobile, storage, data, messaging, media, machine learning, management, and developer services, and each of these services has its own logging capabilities. With this in mind, vRealize Log Insight Cloud provides out-of-the-box functions that collect logs across Azure services and stream them centrally to Log Insight Cloud. As a result, one obstacle to managing a multi-cloud platform is removed: your logging is centralized in one solution, regardless of the source, whether on-prem, hybrid, or multi-cloud.

 

Azure Log Sources

For details on how to configure log forwarding from Azure, see the instructions under Log Sources for the given application.

 

Example: Azure Search Service Step-by-Step Instructions

 

Verify You Have Logging Enabled for the Service

If you don’t already have logging enabled, go to the Diagnostic settings section under the Monitoring tab of your existing Search Service and click Add diagnostic setting.

 

Verify or Configure Diagnostic Settings

At a minimum, OperationLogs should be selected.

For destination details, select Archive to a storage account.
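For reference, the same diagnostic setting can also be expressed as a request body to the Azure Monitor diagnostic settings REST API. The sketch below shows only the general shape of that payload; the resource ID is a hypothetical placeholder, not a value from this walkthrough:

```python
import json

# Hypothetical resource ID -- substitute your own subscription,
# resource group, and storage account names.
storage_account_id = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.Storage/storageAccounts/<account>"
)

# Minimal diagnostic-setting body: OperationLogs enabled,
# archived to a storage account.
diagnostic_setting = {
    "properties": {
        "storageAccountId": storage_account_id,
        "logs": [
            {"category": "OperationLogs", "enabled": True},
        ],
    }
}

print(json.dumps(diagnostic_setting, indent=2))
```

The portal form fills in this same structure for you; the sketch is just to make the moving parts (log category, destination) explicit.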

 

Create Azure Function

The Azure function provides the parameters required for sending log data to Log Insight Cloud, and you can deploy it directly from the instructions.

 

Enter Appropriate Values for Deployment

Configure the Resource Group, API_Url, and API_Token properties. For API_Url, enter: https://data.mgmt.cloud.vmware.com/le-mans/v1/streams/ingestion-pipeline-stream
For API_Token, create an API key under Configuration > API keys (administrative rights are required). Then click Review + create, and once validation passes, click Create to set up the services using the ARM template.
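Under the hood, the deployed function forwards log events to the ingestion URL above as an authenticated HTTP POST. This is a rough sketch of that request, not the exact code the ARM template deploys; the function name and event fields are illustrative:

```python
import json

INGESTION_URL = ("https://data.mgmt.cloud.vmware.com/le-mans/v1/"
                 "streams/ingestion-pipeline-stream")

def build_ingestion_request(api_token, events):
    """Build the URL, headers, and JSON body for one ingestion POST.

    `events` is a list of dicts, one per log event; the Bearer token
    is the API key created under Configuration > API keys.
    """
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Content-Type": "application/json",
    }
    body = json.dumps(events)
    return INGESTION_URL, headers, body

url, headers, body = build_ingestion_request(
    "example-token",
    [{"text": "OperationLogs entry from blob storage",
      "log_type": "azure_log"}],
)
```

An HTTP client (for example, `urllib.request`) would then POST `body` to `url` with these headers.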

Select the Storage Account

An Azure storage account contains all of your Azure Storage data objects: blobs, files, queues, tables, and disks. It is accessible from anywhere in the world over HTTP or HTTPS. Once the function has been created using the ARM template, you need to add the storage account connection string as an environment variable in the newly created function app.

Note: If you’ve just created a new resource group, the storage account values are already populated and you can skip this step.

 

Copy the Connection String

A connection string includes the authorization information your application needs to access data in an Azure Storage account at runtime using Shared Key authorization. To fetch the storage account connection string, go to the storage account created earlier and click Access keys under the Settings option. Then, copy the connection string under key1.
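The copied string is a semicolon-separated list of key=value pairs. A small sketch of its shape, with a made-up account name and key:

```python
def parse_connection_string(conn_str):
    """Split an Azure storage connection string into a dict of its parts."""
    parts = {}
    for pair in conn_str.strip().split(";"):
        if not pair:
            continue
        # Split on the first '=' only; account keys may themselves
        # end in '=' padding characters.
        key, _, value = pair.partition("=")
        parts[key] = value
    return parts

# Illustrative values only -- never publish a real AccountKey.
example = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=mystorageacct;"
    "AccountKey=abc123==;"
    "EndpointSuffix=core.windows.net"
)
parts = parse_connection_string(example)
```

You paste the whole string as-is in the next step; the parsing here is only to show what the string contains.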

 

Click on the Function App

Azure Functions is a serverless solution that allows you to write less code, maintain less infrastructure, and save on costs.

Go to the Configuration section under the Settings option of your function app. Click + New application setting, enter a name for the environment variable, and paste the copied connection string into the Value field.
Click OK, and then click Save to save the configuration.

 

Record the Path to Logs for the Storage Account

Click on the Storage Account for your resource group, then click Containers.

Note the name of the container.

 

Edit the Trigger for blobStorageFunction

The Blob storage trigger starts a function when a new or updated blob is detected. The function sends the logs to Log Insight Cloud.

Go to the Azure Function (with the Blob Storage Trigger) associated with your function app. Click the Integration tab under the Developer section and then click Azure Blob Storage Trigger to edit the trigger details.

 

Update the Path

Set the Path to <container-name>/{name}, where <container-name> is the container in your storage account where the service logs are being stored. Then select the previously created connection string setting from the Storage account connection drop-down menu.
Click Save.
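Behind the portal form, this trigger is a blob binding in the function’s configuration. The sketch below shows the general shape of that binding (function.json style); the container name and app setting name are placeholders for your own values:

```python
import json

container_name = "insights-logs-operationlogs"  # example container name
connection_setting = "STORAGE_CONNECTION"       # the app setting added earlier

# Azure Blob storage trigger binding: {name} is a binding expression
# that Azure fills in with the blob's name at runtime.
trigger_binding = {
    "type": "blobTrigger",
    "direction": "in",
    "name": "myBlob",
    "path": f"{container_name}/{{name}}",
    "connection": connection_setting,
}

print(json.dumps(trigger_binding, indent=2))
```

The portal’s Path and Storage account connection fields map directly to the `path` and `connection` keys shown here.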

 

Verify Log Flow

Go to Log Sources and select the application you are configuring. Click on the Logs tab.

Filter: Search for logs in Interactive Analytics using the filter provided in Log Sources: log_type contains "azure_log" AND logsource contains "blob_storage" AND event_provider contains "azure_search"

 

Explore Logs

We are filtering on log_type, a field that is extracted from the log message automatically. With this filter, we are simply searching for all Azure logs. I’ve chosen to group these logs by event_provider (the source application in Azure), but I can group the log messages by any extracted field, such as hostname, source, or log_type. I can save this query and add it to a dashboard. This search has returned results for Network, AVS (Azure VMware Solution), Search, and Storage logs being monitored by Log Insight Cloud.
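Conceptually, grouping by an extracted field is just counting events per distinct field value. A toy sketch with made-up events:

```python
from collections import Counter

# Made-up log events with extracted fields, shaped like what
# Log Insight Cloud shows in Interactive Analytics.
events = [
    {"log_type": "azure_log", "event_provider": "azure_search"},
    {"log_type": "azure_log", "event_provider": "azure_storage"},
    {"log_type": "azure_log", "event_provider": "azure_search"},
    {"log_type": "azure_log", "event_provider": "azure_network"},
]

def group_by(events, field):
    """Count events per value of an extracted field (e.g. event_provider)."""
    return Counter(e[field] for e in events)

by_provider = group_by(events, "event_provider")
```

Swapping `"event_provider"` for `"hostname"` or `"log_type"` gives the same kind of breakdown by a different field, which is what the group-by option in the query does.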

Build a Dashboard

I’ve added a few more queries to my dashboard so I can visualize the log data: a log stream widget for all Azure logs, widgets for the search, storage, and web logs, and one chart of all logs grouped by event source.

 

vRealize Log Insight Cloud makes it easy to monitor your public cloud logs with out-of-the-box functions for log collection. Monitor your on-prem, hybrid, and multi-cloud logs, including AWS, Azure, and GCP, on one platform. Log Insight Cloud is a sophisticated log analytics tool that provides data visualization with dashboards and configurable alerts to notify you of events of interest. If you’d like to learn more about Log Insight Cloud, visit our website and sign up for our free trial.