Splunk Search

Creating universal search-time field extraction across all data

hcanivel
Explorer

I'd like to generate a human-readable time field that I can use in all my aggregate searches, specifically after piping into stats.

I'm generally aggregating these events into a summary report, and I want to easily view the timestamps of the first and latest occurrences. I'll definitely be using this new field extensively in reports. However, I find myself writing eval time=strftime(_time, ...) repetitively, so I'm very interested in extracting this field once and for all.
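In other words, every report currently needs something like the following (the format string here is just an example):

    your_search
    | eval time=strftime(_time, "%Y-%m-%d %H:%M:%S")
    | stats earliest(time) latest(time) by host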

I found this splunkbase answer:
http://splunk-base.splunk.com/answers/8505/is-it-possible-to-use-wildcards-in-sourcetype-propsconf-s...

Is this [(?:::)*] format still the recommended method to wildcard all sourcetypes?

1 Solution

alacercogitatus
SplunkTrust

I tested this. Limited success, but that might just be my test system....

In props.conf:

[host::*]
priority = 100
EVAL-readabletime = strftime(_time, "%H:%M")

Then run:

your_search | stats latest(readabletime) earliest(readabletime) by host
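If you want full dates rather than just hour and minute, the same stanza works with a longer format string (this variant is just an example, not tested above):

    [host::*]
    priority = 100
    EVAL-readabletime = strftime(_time, "%Y-%m-%d %H:%M:%S")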


alacercogitatus
SplunkTrust

It did, but since source values may contain path separators, I simplified the stanza so the wildcard doesn't have to account for them.
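For context, in props.conf source:: stanzas the * wildcard does not cross path separators, while ... does (worth double-checking against the props.conf spec for your version), so an all-sources variant would look something like:

    [source::...]
    priority = 100
    EVAL-readabletime = strftime(_time, "%H:%M")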


hcanivel
Explorer

Didn't this say source::* before?


hcanivel
Explorer

Yes, I could also use macros, but I'd still have to include the macro in every single search, which only reduces, not eliminates, the repetitive task I'm trying to avoid. Macros are also delicate in that you generally have to place them in a set position in the search; using them outside that norm gets tricky. In other words, not clean enough for my taste.

It wasn't encouraged mostly because most people don't have a really good use case for it.

As for your second point, I'm starting to build this same kind of stats aggregation for many different sourcetypes. Thanks for the input!


linu1988
Champion

I'm not sure Splunk has come up with anything like this, and as I read the comments it was a hack and not encouraged. If you use this eval so often, why not create a macro instead? Pass in the variable, get the result.

Secondly, in the earlier posts they actually had a series name, which is easy to match with something like abc*. But what happens when you have sources like a.log, b.log, c.log, etc.? A wildcard ends up matching everything, which isn't a proper implementation from my point of view. I'll happily take other inputs. Thanks.
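A macro along those lines might look like the following (the macro name and format string are illustrative, not from the thread):

In macros.conf:

    [readable_time]
    definition = eval readabletime=strftime(_time, "%Y-%m-%d %H:%M:%S")

Then in a search:

    your_search `readable_time` | stats earliest(readabletime) latest(readabletime) by host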
