Splunk Search

Why are searches long-running, with time spent in ReducePhaseExecutor and PreviewExecutor?

agneticdk
Path Finder

Hi guys

 

I have a Splunk 8.1.2 installation where XmlWinEventLog data is ingested.

When we run this search:

index=wineventlog sourcetype=XmlWinEventLog earliest=-1h latest=now | stats count by host

It takes an extremely long time to complete.
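Not an explanation for the slowdown, but a possible workaround while investigating: host is an indexed field, so the same count can usually be obtained with tstats, which reads indexed data directly and skips search-time field extraction and calculated fields entirely (a sketch, assuming you only need the default indexed fields):

```
| tstats count where index=wineventlog sourcetype=XmlWinEventLog earliest=-1h latest=now by host
```

If this variant is fast, that points at search-time extraction (kv / calculated fields) rather than raw data retrieval as the bottleneck.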

Comparing the search.log to those of other, non-XmlWinEventLog searches, I can see that the above search has the following in its search.log:

04-27-2021 14:43:35.526 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:43:40.090 INFO  PreviewExecutor - Finished preview generation in 0.000353521 seconds.
04-27-2021 14:43:41.126 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:43:46.526 INFO  PreviewExecutor - Finished preview generation in 0.001657283 seconds.
04-27-2021 14:43:47.625 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:43:50.866 INFO  PreviewExecutor - Finished preview generation in 0.001701492 seconds.
04-27-2021 14:43:51.926 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:43:53.786 INFO  PreviewExecutor - Finished preview generation in 0.001716926 seconds.
04-27-2021 14:43:54.825 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:43:58.417 INFO  PreviewExecutor - Finished preview generation in 0.00166631 seconds.
04-27-2021 14:43:59.426 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:01.082 INFO  PreviewExecutor - Finished preview generation in 0.002049016 seconds.
04-27-2021 14:44:02.125 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:07.007 INFO  PreviewExecutor - Finished preview generation in 0.001083249 seconds.
04-27-2021 14:44:08.025 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:08.141 INFO  PreviewExecutor - Finished preview generation in 0.002117643 seconds.
04-27-2021 14:44:09.225 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:12.264 INFO  PreviewExecutor - Finished preview generation in 0.003432417 seconds.
04-27-2021 14:44:13.525 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:19.424 INFO  PreviewExecutor - Finished preview generation in 0.002008858 seconds.
04-27-2021 14:44:20.825 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:29.207 INFO  PreviewExecutor - Finished preview generation in 0.001904259 seconds.
04-27-2021 14:44:30.926 INFO  ReducePhaseExecutor - ReducePhaseExecutor=1 action=PREVIEW
04-27-2021 14:44:34.602 INFO  PreviewExecutor - Finished preview generation in 0.001884954 seconds.

That is a full minute where it is in this state. The one-hour search above takes about 63 seconds in total, of which roughly 60 seconds are spent in this state - that tells me something is odd.

The ReducePhaseExecutor and PreviewExecutor messages do not appear in the search.log if I run the same search on the _internal index.

I have another installation running 8.1.1 that shows the exact same behavior.

Now, if I run the same search without the "by host", it completes in 4.1 seconds. It is really strange and unexpected behavior that just adding "by host" increases the search time by about one minute.

Could anyone tell me what the above is? I guess it has something to do with "preview", but it does not make sense that it takes that amount of time. Preferably with a link to documentation on what it is, and why it is present only in searches on XML logs. I have tested other types of data - cisco, _json, _internal - and none of those have the issue of running these executors and spending a lot of time on them.

Thank you

André


marand
Explorer

Seeing the exact same behaviour with slow searches in Windows XmlWinEventLog data on 8.1.2.

Splunk_TA_Windows:  8.1.1


agneticdk
Path Finder

I can add that the indexer logs for the slow search contain a "fantazillion" of these events, which accumulated almost match the 60 seconds:

04-28-2021 07:58:03.038 INFO  CalcFieldProcessor - Not computing calculated field 'user_group' because it is not required
04-28-2021 07:58:03.038 INFO  CalcFieldProcessor - Not computing calculated field 'vendor' because it is not required
04-28-2021 07:58:03.038 INFO  CalcFieldProcessor - Not computing calculated field 'vendor_product' because it is not required
04-28-2021 07:58:04.073 INFO  SearchOperator:kv - "splunk_internal_kv_mode" value found 0 times in bucket.
04-28-2021 07:58:04.073 INFO  CalcFieldProcessor - Not computing calculated field 'Error_Code' because it is not required
04-28-2021 07:58:04.073 INFO  CalcFieldProcessor - Not computing calculated field 'Group_Domain' because it is not required

Both CalcFieldProcessor messages and disabled kv extractions - probably because of fast mode. The same field is logged as disabled or not computed over 1000 times during the search.
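For context: the calculated fields that CalcFieldProcessor reports skipping are defined as EVAL- entries in the add-on's props.conf, and each one is evaluated (or skipped) per event at search time. A purely illustrative sketch of what such definitions look like - the field names are taken from the log above, not from the actual Splunk_TA_Windows configuration:

```
# props.conf (illustrative sketch only, not the real Splunk_TA_Windows content)
[XmlWinEventLog]
# Calculated fields are EVAL- entries; deciding whether each is "required"
# for every event is what produces the repeated CalcFieldProcessor log lines.
EVAL-vendor = "Microsoft"
EVAL-vendor_product = "Microsoft Windows"
```

A TA with many such entries means this required/not-required decision is repeated for every field on every event, which is consistent with the volume of log lines seen here.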

 

Another note: running in smart mode is also slower than expected, but I concentrate on fast mode since that should be, well, fast. So the problem is not limited to fast mode, but in fast mode it is easier to see what is happening in the search.logs.


dschroeter
Explorer

Did you find any solution for this?

I can see the same behaviour for some other fields, but it is in a kind of loop where some calculated fields get disabled and some keys get added.

It does this hundreds of times for the same field, for 10 to 15 seconds, sometimes longer, which results in a long search time.

It does not do this for every event, and there is no difference whether I get 100 events back or 10K. The iteration seems to run up to some limit, I think, or maybe once per index?

 
