Hi, is there a "standard" way of correlating data from different sources? For example, I have a metadata source and an event source. The metadata source has data such as "ServiceName" or "Location", and an IP address. The event source consists of logs which have a host, and I would like to get some aggregated data based on the metadata source...
meta:
ipaddress=1.2.3.4,location=xyz,service=foo
event:
host=1.2.3.4,loglevel=WARN,message="something"
If I wanted to chart the count of different log levels by location, what would the best approach be? I have tried sub-searches, but that only works for filtering. Would I need some sort of dynamic lookup?
Is there a reason a standard stats search wouldn't work for you? Something like
(index=index1 sourcetype=...) OR (index=index2 sourcetype=...)
| eval groupingIP = coalesce(ipaddress, host)
| stats values(location) as location, values(loglevel) as loglevel by groupingIP, someOtherUniqueField
| stats count by location, loglevel
This will work for you if you can come up with "someOtherUniqueField" to tie a meta log to an event log; otherwise values will run into duplicates and not know how to handle them. list isn't a great option either, because you're leaning on weird multivalue fields as opposed to counting one true event. The unique field could be _time, some other unique ID, etc.
Essentially, you pull in all of your logs and create a field to "join" on (although the join command itself isn't great given how Splunk is architected). In this case I call it groupingIP and build it with a coalesce statement; it could also have been an if or case statement. Then you group the events together, take the values (the unique list of values) of location and loglevel by groupingIP and the unique field, and finally count by location and loglevel.
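A variant of the same idea, if finding a good unique field proves tricky, is to use eventstats (not mentioned above, but standard SPL) to copy the metadata's location onto every event sharing the same IP, then count the raw log events directly. This is only a sketch; the index and sourcetype names are placeholders for your real ones:

(index=index1 sourcetype=meta) OR (index=index2 sourcetype=event)
| eval groupingIP = coalesce(ipaddress, host)
| eventstats values(location) as location by groupingIP
| search loglevel=*
| stats count by location, loglevel

Because eventstats annotates each event rather than collapsing them, the final stats counts one row per actual log event, which sidesteps the multivalue/duplicate problem described above.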
Hope this helps!
Give this a try:
your_search_with_ipaddress_field
| join left=L right=R where L.ipaddress=R.host
    [search your_search_with_host]
| stats count by location, loglevel
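Since the original question asks about dynamic lookups: a lookup-based approach can also work, and avoids join's subsearch limits. The idea is to write the metadata out to a lookup file on a schedule, then enrich the event search with it. The lookup file name and searches below are placeholders, not existing objects:

index=index1 sourcetype=meta
| table ipaddress, location, service
| outputlookup meta_by_ip.csv

Then, in the search you actually chart:

index=index2 sourcetype=event
| lookup meta_by_ip.csv ipaddress AS host OUTPUT location
| stats count by location, loglevel

Saving the first search as a scheduled report keeps the lookup fresh, which is effectively the "dynamic lookup" the question hints at.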