Getting Data In

Custom Filtering across multiple Logs

Explorer

Hi Splunk community,

For Log A, I would like to extract all the values of a specific field that match a specific condition.

Then I would like to use the values extracted from Log A to filter Log B.

Can this be done? If so, how? If not, are there any alternatives?

Thank you.


Communicator

A subsearch will let you do this, as long as the number of items you will be filtering from Log A is limited. See the subsearch and format documentation.

... but note that there is a limit on how many results a subsearch can return. By default the limit is 10,000 results and a search time of 60 seconds, defined in limits.conf.

If the data from Log A doesn't change very often, you may want to use outputlookup to generate a lookup table. The generated lookup table can then be used in the subsearch; a search using a pre-generated lookup would be something along the lines of:

index=logs sourcetype="log B" [ | inputlookup log_a_lookup | format ]
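Generating that lookup in the first place could be a scheduled search along these lines (field name, condition, and lookup name are placeholders for your own):

sourcetype="log A" Field1="some_condition" | dedup field1 | fields field1 | outputlookup log_a_lookup

Scheduling that search to run periodically keeps the lookup fresh without paying the cost of re-scanning Log A on every query against Log B.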


Explorer

Hi wenthold,

Thank you for replying.

Can the search that filters Log A and the lookup steps be nested together into a single search that performs both tasks at once?

My situation is for my logs,

Timestamp | Field1 | Field2

I want to extract all the distinct values in Field2 for rows where Field1 meets a specific condition in a specific log.

Then, on another log (or even a combination of logs), I want to extract the rows that contain any of those distinct values.

So far I can only extract the distinct values, but I am unable to see all of them or export them to something like a CSV/TXT file.

If nesting could be done and I could perform both tasks in a single search, that would be great.

Thank you very much.


Communicator

You can use a subsearch without the intermediate step of generating a lookup table.

For example, let's say you have field1 in "Log A" and you want to find all events where field2 in "Log B" matches field1 from "Log A". The subsearch finds the unique values from "Log A" and renames the field from field1 to field2:

index=logs sourcetype="log B" [ search sourcetype="log A" | dedup field1 | fields field1 | rename field1 as field2 | format ]

You can remove the rename command from the subsearch if the field name is the same in Log A and Log B. It may also help to run the subsearch on its own as a regular search first, to visualize what's actually happening.
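For instance, running just the inner search (with the same hypothetical field names as above) shows what gets handed back to the outer search:

sourcetype="log A" | dedup field1 | fields field1 | rename field1 as field2 | format

The format command collapses the result rows into a single search string such as ( ( field2="value1" ) OR ( field2="value2" ) ), and that string is what gets substituted into the outer search in place of the brackets.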

Note that the subsearch will only return up to 10,000 results by default; anything after the first 10,000 results is ignored unless limits.conf is adjusted.
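If you do need to raise those limits, the relevant settings live in the [subsearch] stanza of limits.conf; a sketch of what that might look like (the values here are illustrative, and raising them can slow down searches):

    [subsearch]
    maxout = 50000
    maxtime = 120

maxout caps the number of results the subsearch can return and maxtime caps its runtime in seconds. As with any limits.conf change, it's usually better to restructure the search (e.g. the lookup-table approach above) than to raise limits aggressively.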

Also note that both searches run over the same time range by default. You can adjust that by including earliest and latest operators in the base search of your subsearch:

    index=logs sourcetype="log B" [ search sourcetype="log A" earliest=-14d@d latest=now | dedup field1 | fields field1 | rename field1 as field2 | format ]