
Regex not working?

msarro
Builder

I'm trying to get a time prefix working for the following event:

00:13:11:ee:b7:5e~00:13:11:ee:b7:5d~123.net~123.net~1~4 (passWithWarnings)~1 (Operational)~12 (Normal)~8 (Missing Battery)~~~~~~~~1 (Unknown)~123 2.0 / 123 1.0  HW_REV: 32; VENDOR: 123 INTERACTIVE, L.L.C.; BOOTR: 5.01; SW_REV: 6.1.95; MODEL: 123~123 INTERACTIVE, L.L.C.~123~123~5.01~6.1.95~32~~123.net~123.net~2~2010-11-18 01:40:57

I've used RegExr and Regex Coach to verify that my regular expression is valid, but Splunk seems to be choking on it. It's a simple regex:

^.*~

That brings me right up to the last character before the date. Any ideas? Also, the data above has been heavily sanitized, so the format may be slightly different, but I've verified that the regex still works.
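
For context, the relevant props.conf stanza looks something like this (the sourcetype name here is just a placeholder for the sanitized one):

    [my_sanitized_sourcetype]
    TIME_PREFIX = ^.*~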


Genti
Splunk Employee

Seems like your regex is OK, so it must be something else. From the docs:

[<spec>]
DATETIME_CONFIG = <filename relative to $SPLUNK_HOME>
MAX_TIMESTAMP_LOOKAHEAD = <integer>
TIME_PREFIX = <regular expression>
TIME_FORMAT = <strptime-style format>

MAX_TIMESTAMP_LOOKAHEAD = <integer>

    * Specify how far (how many characters) into an event Splunk should look for a timestamp.
    * Default is 150 characters. 

What is your current MAX_TIMESTAMP_LOOKAHEAD value? Have you set your own value, or are you using the default? From what I see, you need at least 350 (and if your actual event/log is longer, it may need to be larger than that).
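
Something along these lines should do it, assuming the timestamp at the end of the event matches the format in your sample (the stanza name is a placeholder, and 400 is just a value with some headroom over the sample event's length):

    [your_sourcetype]
    TIME_PREFIX = ^.*~
    MAX_TIMESTAMP_LOOKAHEAD = 400
    TIME_FORMAT = %Y-%m-%d %H:%M:%S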
