All Apps and Add-ons

Reduce time spent in regexreplacement queue?

sowings
Splunk Employee

I have a situation where my Splunk feed arrives entirely via syslog, sourcetyped as syslog, yet contains many different kinds of data. I've set up index-time TRANSFORMS in my props.conf to split these events out into new sourcetypes, and then route each of those sourcetypes to an appropriate index.
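Roughly, the routing looks like this (the stanza names, regex, sourcetype, and index below are illustrative placeholders, not my real config):

```ini
# props.conf -- attached to the incoming syslog feed
[syslog]
TRANSFORMS-routing = set_st_cisco, set_idx_cisco

# transforms.conf -- rewrite the sourcetype at index time...
[set_st_cisco]
REGEX = %ASA-\d-\d+
FORMAT = sourcetype::cisco:asa
DEST_KEY = MetaData:Sourcetype

# ...and send the matching events to a dedicated index
[set_idx_cisco]
REGEX = %ASA-\d-\d+
FORMAT = cisco
DEST_KEY = _MetaData:Index
```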

Then I look at my indexing metrics in Splunk on Splunk. I see that the indexer queue is taking approx. 20% CPU time, but the regexreplacement queue is at nearly 25%! (Note that this is a 24-CPU system--is the percentage 25% of one CPU, or 25% of the total available processing power?)

Does this (relatively) high use of CPU time in the regexreplacement queue suggest that my regexes are inefficient? Are there suggestions for keeping this processing queue a little less busy?

1 Solution

hexx
Splunk Employee

Is the percentage 25% of one CPU, or 25% of the total available processing power?

That percentage represents the approximate usage of one CPU core.

Does this (relatively) high use of CPU time in the regexreplacement queue suggest that my regexes are inefficient?

Possibly, that does seem to be on the high side for the regexreplacement processor. That being said, what really matters is:

  • Is the regexreplacement processor a bottleneck? This is the case if the typing queue fills up and remains saturated while the indexing queue is starved or near empty.
  • What is the proportion of CPU usage between the indexer processor and the regexreplacement processor? In your situation, the indexer processor appears to be using less CPU than the regexreplacement processor, which is indeed a bit imbalanced.
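To check the first point, you can chart queue fill levels from metrics.log (a sketch; narrow the search to the relevant indexer host as needed):

```
index=_internal source=*metrics.log* group=queue (name=typingqueue OR name=indexqueue)
| timechart span=5m max(current_size_kb) by name
```

A persistently high typingqueue alongside a near-empty indexqueue would point to the regexreplacement processor as the bottleneck.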

Are there suggestions for keeping this processing queue a little less busy?

Yes, but they are not always trivial to implement:

  • Reduce the number of index-time transformations that use regular expressions.
  • Avoid index-time regular expressions that operate against _raw.
  • Optimize your index-time regular expressions to run faster. Anchoring can sometimes do wonders to prevent unnecessary backtracking.
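To illustrate the anchoring point with a hypothetical transform: an unanchored pattern forces the regex engine to retry the match at every offset of _raw before it can reject a non-matching event, while an anchored one fails after a single attempt at the start of the line:

```ini
# transforms.conf -- hypothetical patterns, for illustration only
[unanchored_slow]
# retried at every offset of _raw on non-matching events
REGEX = %ASA-\d-\d+

[anchored_fast]
# anchored to the syslog timestamp and host at the start of the
# line: non-matching events are rejected after one attempt
REGEX = ^\w{3}\s+\d+\s+[\d:]+\s+\S+\s+%ASA-\d-\d+
```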


sowings
Splunk Employee

Thanks for the feedback.

It doesn't appear that the regexreplacement pipeline is becoming an issue. The typing queue is not filling up, and it isn't blocking the earlier queues.

In this instance the regexreplacement pipeline is in fact taking more time than the indexer pipeline, but I'm going to attribute that to "good I/O" and perhaps poor regexes. I'll see whether they can be improved.
