Getting Data In

Search to get the difference in data between the current month and the previous month?

nicksrulz
Explorer

Hi Legends,

I want to know whether this type of Splunk query is possible to create.

We want a query that pulls two sets of data. For example, if I run a query with the time picker set to the last 4 hours, it pulls the last 4 hours of data from the current time, i.e. 09/03/2023 11:30 AM to 09/03/2023 03:30 PM. Along with this data, it should also pull last month's data for the same timeframe, i.e. 09/02/2023 11:30 AM to 09/02/2023 03:30 PM.

The purpose of this query is to see the month-on-month growth.


ITWhisperer
SplunkTrust

You can use the info times (from addinfo) to modify your index search:

<your index>     [| makeresults
    | fields - _time
    | addinfo
    | eval row=mvrange(0,2)
    | mvexpand row
    | eval earliest=relative_time(info_min_time, "-".row."mon")
    | eval latest=relative_time(info_max_time, "-".row."mon")
    | fields earliest latest]

The subsearch executes first: it retrieves the earliest/latest times from the time picker (info_min_time/info_max_time) and creates two rows, with the second row's times shifted back one month. Bear in mind that if the current month is longer than the previous month, you could end up with two dates in the same month.
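Outside of SPL, the arithmetic the subsearch performs can be sketched in Python (the function and variable names here are mine, not Splunk's). Note how a long month's overflow days spill into the next month, producing the same-month edge case mentioned above:

```python
from calendar import monthrange
from datetime import datetime, timedelta

def minus_months(ts, months):
    """Shift a datetime back by N calendar months. An invalid result such
    as 31 Feb normalizes forward into March -- the same-month caveat."""
    y, m = ts.year, ts.month - months
    while m < 1:
        y, m = y - 1, m + 12
    days_in_month = monthrange(y, m)[1]
    shifted = ts.replace(year=y, month=m, day=min(ts.day, days_in_month))
    # spill any overflow days into the following month
    return shifted + timedelta(days=max(0, ts.day - days_in_month))

# The subsearch emits one (earliest, latest) pair per row:
# row 0 is the picker's own window, row 1 the window one month earlier.
info_min = datetime(2023, 3, 9, 11, 30)   # info_min_time from addinfo
info_max = datetime(2023, 3, 9, 15, 30)   # info_max_time from addinfo
windows = [(minus_months(info_min, row), minus_months(info_max, row))
           for row in range(2)]
```

This is only an illustration of the time arithmetic; the actual shifting in the answer is done by Splunk's relative_time function.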

nicksrulz
Explorer

@ITWhisperer Thanks for your response. I tried your query and it gives me the earliest and latest timeframes. What I actually need is the data in those timeframes, i.e. the count of my requests. For example, the query below gives me the total hits for my API:


index="index_name" AND apiName!="" AND apiName=API_NAME
| search responseCode!=NULL
| stats count as Hits by apiName
| rename apiName as "API Name"
| table Hits

Now, if I select 4 hours from the time picker, I want a query that will pull two sets of data:

a.) the data of the last 4 hours from the current time, i.e. 13/03/2023 11:30 AM to 13/03/2023 03:30 PM

b.) last month's data for the same timeframe, i.e. 13/02/2023 11:30 AM to 13/02/2023 03:30 PM.


ITWhisperer
SplunkTrust
index="index_name" AND apiName!="" AND apiName=API_NAME     [| makeresults
    | fields - _time
    | addinfo
    | eval row=mvrange(0,2)
    | mvexpand row
    | eval earliest=relative_time(info_min_time, "-".row."mon")
    | eval latest=relative_time(info_max_time, "-".row."mon")
    | fields earliest latest]
| bin _time span=1d
| search responseCode!=NULL
| stats count as Hits by _time apiName
| rename apiName as "API Name"

nicksrulz
Explorer

@ITWhisperer Wow!!! This is exactly what I needed.

I need one more bit of help: both days' counts are stored in the same fields, i.e. Hits, _time and apiName, like below:

_time       apiName   Hits
2023-02-14  apiName   15416
2023-03-14  apiName   16441

How can I separate them into two fields, so that I can calculate the percentage increase in Hits?

I know I am asking a lot, since I am very new to Splunk. Please help me out.


ITWhisperer
SplunkTrust
index="index_name" AND apiName!="" AND apiName=API_NAME     [| makeresults
    | fields - _time
    | addinfo
    | eval row=mvrange(0,2)
    | mvexpand row
    | eval earliest=relative_time(info_min_time, "-".row."mon")
    | eval latest=relative_time(info_max_time, "-".row."mon")
    | fields earliest latest]
| bin _time span=1d
| search responseCode!=NULL
| stats count as Hits by _time apiName
| xyseries apiName _time Hits
| rename apiName as "API Name"

The issue with this is that your column names will be epoch times, i.e. seconds since 1 Jan 1970. To fix this, you could do something like this:

| foreach 1*
    [| eval newname=strftime(<<FIELD>>,"%F")
    | eval {newname}='<<FIELD>>'
    | fields - <<FIELD>> newname]

This assumes that epoch times begin with a 1, which they will for the next decade or so.
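As a rough illustration of the renaming idea (in Python rather than SPL; the dictionary stands in for a result row and the helper name is mine), epoch column names can be turned into dates like this:

```python
from datetime import datetime, timezone

def rename_epoch_columns(row):
    """Mimic the foreach trick: for every column whose name is an epoch
    timestamp (i.e. a digit string starting with 1), replace the name
    with the date it represents. "%Y-%m-%d" is equivalent to "%F"."""
    out = {}
    for name, value in row.items():
        if name.startswith("1") and name.isdigit():
            # interpreted in UTC here; Splunk's strftime uses the
            # search-head timezone
            name = datetime.fromtimestamp(
                int(name), tz=timezone.utc).strftime("%Y-%m-%d")
        out[name] = value
    return out

row = {"API Name": "apiName", "1676379600": 23302, "1678798800": 22722}
renamed = rename_epoch_columns(row)
# -> {"API Name": "apiName", "2023-02-14": 23302, "2023-03-14": 22722}
```

In the actual SPL, foreach iterates over the matching field names and the eval with braces performs the rename, as explained later in the thread.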

nicksrulz
Explorer

@ITWhisperer Thanks for your continuous support. I tried your query and got the output below:

index="indexName" AND apiName!="" AND apiName=apiName[| makeresults
| fields - _time
| addinfo
| eval row=mvrange(0,2)
| mvexpand row
| eval earliest=relative_time(info_min_time, "-".row."mon")
| eval latest=relative_time(info_max_time, "-".row."mon")
| fields earliest latest]
| bin _time span=1d
| search responseCode!=NULL
| stats count as Hits by _time apiName
| xyseries apiName _time Hits
| rename apiName as "API Name"
| foreach 1*
[| eval newname=strftime(Hits,"%F")
| eval oldname='Hits'
| fields - oldname newname]

API Name   1676379600   1678798800
apiName    23302        22722

How can I set the column names to human-readable values instead of numbers?


ITWhisperer
SplunkTrust

Try using the code exactly as I showed it, with <<FIELD>> and the braces around newname.

<<FIELD>> is substituted by the foreach command for each field name in the list. Putting braces around a field name on the left-hand side of an eval uses the contents of that field as the new field name.
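Once the two months sit in separate, readably named columns, the month-on-month growth asked about earlier is one division. A minimal Python sketch of the arithmetic (the function name and rounding are my choices; in SPL this would be a single eval):

```python
def month_on_month_growth(prev_hits, curr_hits):
    """Percentage change from the previous month's count to the
    current month's count, rounded to two decimal places."""
    return round((curr_hits - prev_hits) / prev_hits * 100, 2)

# Using the counts from the thread's first example
# (2023-02-14: 15416 hits, 2023-03-14: 16441 hits):
growth = month_on_month_growth(15416, 16441)   # -> 6.65
```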


bowesmana
SplunkTrust

I'm just putting together a dashboard on subsearches, covering "the good", "the bad" and "the ugly" uses of them. I always tend to view subsearches in the bad-to-ugly category, but this is a good example, and it's a great use case for "macro-isation".


bowesmana
SplunkTrust

The way to calculate a second time range based on the first time range from the picker is to have a background search that does the calculation and sets a new token, which can then be used to search both the current and the previous range.

Here are two examples that show how it's done.

https://community.splunk.com/t5/Dashboards-Visualizations/How-to-create-dynamic-label-based-on-time-...

https://community.splunk.com/t5/Getting-Data-In/How-to-count-events-for-specific-time-period-now-and...

 
