As a VMware TAM tech lead for vRealize Log Insight (vRLI), I am asked by many customers how they can forward particular events, mostly security-related ones, to a SIEM product such as Splunk without breaking the bank. Since vRLI is not in the business of security event management, it is understandable that customers want to use a separate product for security event management, and Log Insight for root cause analysis and troubleshooting.
The thinking behind this approach is that security teams want to continue to use familiar tools with which they have a high degree of experience and understanding, like Splunk, for security information management, while operations teams want to continue to use vRLI to monitor events in the environment and have access to all the powerful dashboards and vRealize Operations integration that come with the VMware suite of products.
Through this approach, customers quickly discover the cost savings they can achieve by fine-tuning and defining what Splunk ingests, limiting it to only the events the security team needs. The vRLI license is not based on ingestion rates, as Splunk's pricing model is. The end result is that organizations don't have to write larger and larger checks to Splunk.
1. Configuring vRealize Log Insight to send events to Splunk
In this example, we are going to send sudo privilege-escalation security events to Splunk. The events originate on a Linux machine that has a vRLI agent installed and are being ingested by a vRLI 4.5 instance. Our first step in the process is to configure Log Insight to forward these syslog events to Splunk.
First, we need to navigate to ‘Administration -> Event Forwarding’ in Log Insight and configure a new event forwarding destination by selecting ‘+ New Destination’.
Next, at the event forwarding configuration screen, we specify what kind of events we want to forward to Splunk, along with our Splunk server parameters. My Splunk server’s hostname is ‘ubuntu01.home.dom’, and I added a custom tag called ‘security_event=linux’ to the events forwarded to Splunk, so we have an easy way to identify what kind of event each one is and can run queries against that tag to find them in Splunk.
Another important thing to note is that in our scenario, all event forwarding happens before ingestion into Log Insight. This means that the events cannot be manipulated before they are forwarded. This is a very important factor when forwarding to a SIEM tool such as Splunk.
I created a simple filter that forwards all Linux sudo events to Splunk. In a production environment, your filters will probably be a little more complex and discriminating, but this works for the example. Finally, I left TCP/514 and everything else default and was able to successfully test a connection.
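Before relying solely on the built-in connection test, it can help to push a hand-crafted event through the same channel the forwarder uses: a plain syslog line over TCP/514. The following is a minimal Python sketch, not what vRLI emits verbatim; the hostname, facility/severity values, and message text are illustrative assumptions matching this lab.

```python
import socket
from datetime import datetime

def send_syslog_tcp(host, port, message, facility=4, severity=6):
    """Send a single RFC 3164-style syslog line over plain TCP.

    facility 4 = auth, severity 6 = informational.
    Returns the line that was sent, for inspection.
    """
    pri = facility * 8 + severity  # PRI = facility * 8 + severity
    timestamp = datetime.now().strftime("%b %d %H:%M:%S")
    # Newline-delimited framing, as commonly used for syslog over TCP.
    line = f"<{pri}>{timestamp} ubuntu01 sudo: {message}\n"
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode("utf-8"))
    return line

# Hypothetical usage against the lab's Splunk host from this article:
# send_syslog_tcp("ubuntu01.home.dom", 514,
#                 "nico : TTY=pts/0 ; PWD=/home/nico ; USER=root ; COMMAND=/bin/bash")
```

If a line sent this way shows up in Splunk, the listener and any firewall rules in between are known good before you start debugging the Log Insight side.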
Once we are done, we can click ‘Save’ and we will be brought back to the event forwarding section of Log Insight, complete with a new forwarding rule. You might see events already starting to flow to Splunk, but we aren’t done yet. Those events are going to Splunk and then promptly being dropped. We must configure the Splunk server to actually ingest our events.
2. Configuring Splunk to Receive Events from vRealize Log Insight
Now that Log Insight is sending events to Splunk, we need to configure Splunk to ingest them. This side takes a little more work, but once Splunk is configured, we are done.
First, we log into Splunk as an administrator and navigate to ‘Messages -> Add Data -> Data inputs’.
Under Type, we choose ‘TCP’, since the events are coming from Log Insight over TCP/514.
This will bring us to the ‘Add Data’ screen, where we just need to specify the port and source server. I added ‘loginsight.home.dom’ as the server to accept connections from, since that is the hostname of my Log Insight machine. Click ‘Next’ when done.
Next, we configure our input settings. I chose ‘linux_audit’ as the source type, and ‘DNS’ as the host value so that all the events that come from Log Insight are marked with the DNS hostname. When you’ve verified all your settings, click ‘Review’.
Finally, we review our settings for accuracy and click ‘Submit’.
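For reference, the data input created through the UI above corresponds roughly to an inputs.conf stanza like the following. This is a sketch assuming a default single-instance Splunk install; the host-restricted [tcp://...] stanza and the connection_host setting follow Splunk's documented inputs.conf conventions, and a restart is needed for manual edits to take effect.

```
# $SPLUNK_HOME/etc/system/local/inputs.conf
# Accept plain TCP syslog on 514, only from the Log Insight node.
[tcp://loginsight.home.dom:514]
sourcetype = linux_audit
connection_host = dns
```

Keeping the input defined in configuration rather than only in the UI also makes it easy to replicate on another Splunk instance.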
If all goes well, we should get a confirmation box that our new input source has been successfully created. We can now start searching for our Linux sudo events.
3. Searching for our vRealize Log Insight Events in Splunk
If we start a new search in Splunk and filter on the ‘linux_audit’ source type that we assigned to the data input (or search for the ‘security_event=linux’ tag we specified when configuring our forwarding destination in Log Insight), we will start to see our sudo events flowing into Splunk.
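As a sketch, either of the following searches should surface the forwarded events. Note the assumption here: the custom tag from Log Insight arrives as literal text inside the event body, so a quoted-string match is the simplest way to find it.

```
sourcetype=linux_audit sudo
sourcetype=linux_audit "security_event=linux"
```

From there, the security team can add time ranges, stats, and alerting on top of these base searches like any other Splunk data.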
If we focus on one event and look at the details, we can see that my user elevated his rights to root using sudo on machine ‘ubuntu01’, running on ESX host ‘esx01’, in cluster ‘Lab’ at 4:47PM on 1/5/2018.
It’s as simple as that. Now our security team can use Splunk to create dashboards, trigger alerts and run security reports from events from our Log Insight instance without the extra expense that comes with sending all of our company’s events to Splunk.
This is a simple example; event forwarding filters can be made as complex and granular as you need. The objective was to show that Log Insight can live in harmony with other enterprise log management applications that have their own specific and dedicated functions. Many security teams throughout the enterprise rely on Splunk for security event management, which is one of its strengths.
vRealize Log Insight delivers a powerful and cost-effective solution. It provides heterogeneous and scalable log management with actionable dashboards, sophisticated analytics, and broad third-party extensibility. It enables greater operational visibility and faster troubleshooting, and it gives organizations the freedom to use the tools they depend on to meet their needs and requirements, coupled with effective log management and analytics.
Nico Guerrera is a Senior Technical Account Manager for VMware living in Connecticut. He started with VMware in 2016. He has been working with VMware products and software since he graduated college in 2005 and has obtained every VCP certification from VI 3.0 on to vSphere 6.5. He is also a member of the TAM Tech Lead team for Cloud Management.