Splunk Search

Taking specific subsets of data (15-minute resolution rather than the 5-minute resolution stored in Splunk).

richmond
New Member

I have some building occupancy data generated from our wireless network that is logged as one record per floor of each building every 5 minutes.
I'm now trying to export a spreadsheet for another user who has requested this data at a 15-minute resolution. I believe I'm looking for a WHERE clause that keeps events whose minute of _time falls within a set of ranges. The data isn't aligned to the clock very well, so the first record could be in minute 0 to minute 5, the second would need to be in minutes 10-15, and so on. But my skill level with these functions is falling short here. 🙂


to4kawa
Ultra Champion
| bin span=5min _time

Try bin to bucket events into fixed time spans.

| timechart span=15min count

Alternatively, timechart can aggregate over any time span directly.

If you only need an event count, the fastest way is tstats:

| tstats count where index=your sourcetype=yours by _time span=15min 
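
Putting it together for your occupancy data, here is a rough sketch. It assumes your records carry fields named building, floor, and occupancy (adjust to your real field names) and that averaging the readings inside each 15-minute bucket is acceptable:

index=your sourcetype=yours
| bin span=15min _time
| stats avg(occupancy) AS occupancy BY _time, building, floor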

https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Bin
https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Timechart
https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Tstats

tauliang
Communicator

You can specify a sampling ratio of 3 by activating event sampling for your search in Splunk Web.

  1. In Splunk Web, below the Search bar, click No Event Sampling.
  2. You can use one of the default ratios or specify a custom ratio.
     a. To use one of the default ratios, click the ratio in the Sampling drop-down.
     b. To specify a custom ratio, click Custom and type the ratio value, e.g., in your case, 3. Then click Apply. Note that the ratio value must be a positive integer greater than 1.

If you'd like this to apply more widely, specify the value in ui-prefs.conf. More details can be found in the Splunk docs.
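
As a sketch only: if I remember correctly, the relevant setting is dispatch.sample_ratio, so an entry like the one below in a local ui-prefs.conf should set the default sampling ratio to 3 for the Search view. Please double-check the setting name and stanza against the ui-prefs.conf spec for your version.

[search]
dispatch.sample_ratio = 3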

richmond
New Member

However, this seems to assume that my data is ordered nicely and falls neatly on every third record. It's imported from a CSV that is output from another system that orders things oddly, and this approach seems to mangle my data.
If I try to sort the data, I'm limited to 10,000 of the ~75,000 rows I need.


tauliang
Communicator

In this case you can do some manual filtering in the query to return events from certain times, e.g., only the events that landed on the 0th, 15th, 30th, or 45th minute of any given hour. An example is below:

index=_internal
| eval eventMin=tonumber(strftime(_time,"%M"))
| table _time, eventMin
| search (eventMin=0 OR eventMin=15 OR eventMin=30 OR eventMin=45)

Of course, you can adjust the values of eventMin to whatever you see fit.
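
Since your timestamps aren't aligned to the clock, a range-based variant of the same idea may work better. This is just a sketch: it assumes the record you want to keep always lands in the first 5 minutes of each quarter hour, and index/sourcetype are placeholders for your data.

index=your sourcetype=yours
| eval eventMin=tonumber(strftime(_time,"%M"))
| where (eventMin>=0 AND eventMin<5) OR (eventMin>=15 AND eventMin<20) OR (eventMin>=30 AND eventMin<35) OR (eventMin>=45 AND eventMin<50)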
