Hi All!
Here's my scenario: I'm searching 24 hours' worth of data, but due to load I can only search in 4-hour increments (it's a very large dataset). I table the results, then dedup the data, and afterward output each increment to its own CSV file.
I normally use an accelerated data model for this, but pulling weeks' worth of data in 4-hour increments is painful. Is it possible to define a search range (start=7/4/2018:00:00:00, end=7/4/2018:23:59:59), then define the time increment (4h), and maybe even cap the number of concurrent searches at 1 or 2?
With that, the single search would run 6 searches total in 4-hour increments, each outputting to its own CSV.
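For reference, here's how I've been sanity-checking the windows gentimes generates (if I'm reading the docs right, starttime/endtime are epoch values and starthuman/endhuman are the readable forms):

| gentimes start=7/4/2018:00:00:00 end=7/4/2018:23:59:59 increment=4h
| table starttime endtime starthuman endhuman

I'd expect that to show six rows, one per 4-hour window.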
I've done some research on this, and it looks like I may be able to accomplish it with "gentimes" and "map", but I can't get that working at all, so I'm not sure if I'm doing it right. My attempt at using these is below (not with the data model, but I have the equivalent data model search handy if I can use it here):
| gentimes start=7/4/2018:00:00:00 end=7/4/2018:23:59:59 increment=4h
| map maxsearches=1 search="search earliest=$starttime$ latest=$endtime$ index=something host=something action!=allowed <other search criteria>
    | dedup <my fields>
    | table <my fields>
    | outputcsv [ | stats count | addinfo | eval filename=strftime(info_min_time, "filename_%d_%m_%y_%H_%M_%S") | return $filename]"
This gives me "No results found" each time. If I run the basic search on its own, my results are output to CSV just fine. Any help would be greatly appreciated!
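One variant I'm planning to try next, in case the problem is the nested double quotes around the strftime format string (escaped with backslashes here) or the maxsearches value — from the docs, maxsearches looks like the total number of searches map will run, not a concurrency cap, so I'm raising it to cover all six windows:

| gentimes start=7/4/2018:00:00:00 end=7/4/2018:23:59:59 increment=4h
| map maxsearches=6 search="search earliest=$starttime$ latest=$endtime$ index=something host=something action!=allowed <other search criteria>
    | dedup <my fields>
    | table <my fields>
    | outputcsv [| stats count | addinfo | eval filename=strftime(info_min_time, \"filename_%d_%m_%y_%H_%M_%S\") | return $filename]"

I've also seen mentions that a literal $ inside map's search string may need to be doubled ($$) so it isn't treated as token substitution, so the $filename in the return is another thing I'm unsure about.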