I recently started trying to set up some field extractions for a few of our events. In this case, the logs are pipe delimited and contain only a few segments. What I've found is that most of these attempts result in a rex error about the limits in limits.conf.

For example, take this record:

2022-02-03 11:45:21,732 |xxxxxxxxxxxxxxx.xxxxxx.com~220130042312|<== conn[SSL/TLS]=274107 op=26810 MsgID=26810 SearchResult {resultCode=0, matchedDN=null, errorMessage=null} ### nEntries=1 ### etime=3 ###

When I attempt a pipe-delimited field extract on it (for testing), the result is the limits error above. When I toss the regex from that error into regex101 (https://regex101.com/r/IswlNh/1), it tells me it requires 2473 steps, which is well above the default of 1000 for depth_limit. How is it that an event with 4 segments delimited by pipe is so bad?

I realize there are two limits in play here (depth_limit and match_limit) and I can increase them, but nowhere can I find recommended values to use as a sanity check. I also realize I can optimize the regex, but since I am setting this up via the UI using the delimited option, I don't have access to the regex at creation time. Not to mention, many of my users rely on this option because they are not regex gurus.

So my big challenge/question is: where do I go from here? My users are going to use the delimited option, which evidently generates some seriously inefficient regex under the covers. Do I increase my limit(s), and if so, what is a sane/safe value? Is there something I'm missing?

Thanks!
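For reference, this is the kind of override I've been experimenting with on the search head. It's only a sketch: the two setting names come straight from limits.conf.spec, but the values below are guesses on my part, not recommendations from the docs, which is exactly what I'd like someone to sanity-check:

# $SPLUNK_HOME/etc/system/local/limits.conf
# depth_limit: max recursion depth before a rex match is abandoned (ships as 1000)
# match_limit: max PCRE match attempts before a rex match is abandoned (ships as 100000)
[rex]
depth_limit = 10000
match_limit = 1000000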
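In the meantime, a hand-written extraction for the sample event above stays linear, because each segment is matched with [^|]+ and there is no backtracking across the pipes. The field names here are just placeholders I made up:

| rex field=_raw "^(?<log_time>[^|]+)\|(?<ldap_host>[^|]+)\|(?<ldap_msg>.+)$"

On regex101 this stays at a tiny step count for the same event, which is what makes the generated regex's 2473 steps so surprising to me.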
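And in case it changes the answer: I know I could skip regex entirely with a delimiter-based search-time extraction in props/transforms, along these lines. The stanza and field names below are placeholders, and I haven't confirmed whether this is what the UI's delimited option writes out behind the scenes:

# transforms.conf
[extract_pipe_segments]
DELIMS = "|"
FIELDS = log_time, ldap_host, ldap_msg

# props.conf
[my_ldap_sourcetype]
REPORT-pipe_segments = extract_pipe_segments

If the delimited option could be made to produce something like this instead of a generated regex, the limits problem would seem to go away, so maybe that's the piece I'm missing.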