Reporting

Is it possible to iterate over a scheduled search?

LordLeet
Path Finder

Hello guys, I was wondering if it was possible to iterate over a scheduled search.

Let's say I have a scheduled search that I want to run for each distinct value of a field, collecting the results into a different index per value.

It would be something like:

index=_internal field=A <...search...> | collect index=myIndexA

then for B:

index=_internal field=B <...search...> | collect index=myIndexB

Since this will be a high volume of data, I can't collect everything into a single index and then extract from there.
Is there a way to do this with a script or with Splunk parameters?

Thanks in advance
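Regarding the scripted option mentioned above: a minimal sketch using the Splunk Python SDK (splunklib, installed via `pip install splunk-sdk`) could dispatch one collect search per distinct field value. The field and index names mirror the question; the SDK usage and any connection details are assumptions, not something from this thread.

```python
# Hypothetical sketch: dispatch one "collect" search per distinct field value
# using the Splunk Python SDK (splunklib). Field/index names follow the
# question (field=A -> myIndexA, field=B -> myIndexB).

def build_collect_search(value):
    """Build the per-value search that writes into its own summary index."""
    return ('search index=_internal field={v} '
            '| collect index=myIndex{v}'.format(v=value))

def run_per_value(service, values):
    """Dispatch one oneshot search per distinct value.

    `service` is a splunklib.client.Service, e.g. obtained via
    splunklib.client.connect(host=..., port=8089,
                             username=..., password=...).
    """
    for value in values:
        service.jobs.oneshot(build_collect_search(value))
```

In practice the list of distinct values would itself come from a `stats count by field` search rather than being hard-coded, and the script could be run on a cron schedule instead of a Splunk scheduled search.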

1 Solution

sundareshr
Legend

Have you tried the map command? http://docs.splunk.com/Documentation/Splunk/6.1.2/SearchReference/Map

Something like,

index=_internal | stats count by field | eval idx="myIdx".field | table field idx | map search="index=_internal field=$field$ ... search ... | collect index=$idx$"


LordLeet
Path Finder

Thanks sundareshr, I managed to do what I wanted with the map function.

Best Regards.
