Splunk Search

How can I create a subsearch that generates a CSV file and use it as input for another search at the same point in time?

Veeruswathi
Explorer

Hi all,

I would like to generate a CSV file from one search and use it as a lookup file for another query.

The problem is that I want the CSV file to update at the same time I run the second query!

Any ideas on this would be much appreciated.

Thanks,
Swathi

1 Solution

Azeemering
Builder

Alright, so I assume you want to create a lookup file.

First, create the baseline lookup by running a base search and writing the results to a CSV file.

Example:

tag=web url=*
| eval list="mozilla"
| `ut_parse_extended(url,list)`
| stats earliest(_time) as earliest latest(_time) as latest by ut_domain
| outputlookup previously_seen_domains.csv
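
If you want to sanity-check the baseline, you can read the file straight back with inputlookup (this just counts how many domains ended up in the CSV):

| inputlookup previously_seen_domains.csv
| stats count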

Once you have the baseline, we can create a search that compares the new data against it and updates the CSV file:

tag=web url=* earliest=-15m
| eval list="mozilla"
| `ut_parse_extended(url,list)`
| stats earliest(_time) as earliest latest(_time) as latest by ut_domain
| inputlookup append=t previously_seen_domains.csv
| stats min(earliest) as earliest max(latest) as latest by ut_domain
| outputlookup previously_seen_domains.csv

The key here is the append=t option on inputlookup: it appends the contents of the existing CSV to the current results before the second stats dedupes everything and outputlookup writes it back out. (Google it for the details 🙂)
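
To close the loop on the original question (using the CSV as input for another search), the second query can read the file back with the lookup command. This is only a minimal sketch in the same spirit as the searches above; the first_seen alias and the "never seen before" filter are just one possible use, and depending on your setup you may need a lookup definition for the file instead of referencing the .csv name directly:

tag=web url=* earliest=-15m
| eval list="mozilla"
| `ut_parse_extended(url,list)`
| stats count by ut_domain
| lookup previously_seen_domains.csv ut_domain OUTPUT earliest AS first_seen
| where isnull(first_seen)

This returns only the domains seen in the last 15 minutes that have no entry in previously_seen_domains.csv yet.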


Veeruswathi
Explorer

Thank you for your reply, Azeemering.

Should I run the base search over All time just once to create the lookup, or should I save it somewhere? (If so, what should I save it as?)

Thanks,
Swathi
