Dashboards & Visualizations

Multiple Time Spans

sujata_nandi
Explorer

I am trying to make a query that will give me counts of unique file names, with the date in the columns and a 1-hour time span in the rows. Below is my query:
index="app_cleo_db"
origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart count by DateOnly

But it is giving me output with the date as well as the timestamp in the rows, like below:

_time                 2023-12-02  2023-12-03
2023-12-02 00:00:00   80
2023-12-02 00:30:00   00
2023-12-02 01:00:00   07
2023-12-02 01:30:00   00
2023-12-02 02:00:00   60
2023-12-02 02:30:00   00
2023-12-02 00:00:00   20
2023-12-03 00:30:00   05
2023-12-03 01:00:00   00
2023-12-03 01:30:00   020
2023-12-03 02:00:00   00
2023-12-03 02:30:00   340

 

I want the result to look like the below:

_time     2023-12-02  2023-12-03
00:00:00  0           0
01:00:00  0           0
02:00:00  0           0
03:00:00  0           0

bowesmana
SplunkTrust

The span of a timechart is controlled with the syntax

| timechart span=1h count

Your example lets timechart choose its own span based on the search time range.

You can format _time after the timechart

| eval _time=strftime(_time, "%H:%M:%S")

Note that if you do that, you will not be able to render the result as a time chart, as _time is no longer a real time field in Splunk.
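For example, a minimal run-anywhere sketch (it fakes some events with makeresults, so nothing here depends on your index) that combines an explicit span with the formatting step:

| makeresults count=50
| eval _time=_time - (random() % 14400) ``` spread 50 fake events over the last 4 hours ```
| timechart span=1h count ``` explicit 1-hour buckets ```
| eval _time=strftime(_time, "%H:%M:%S") ``` show only the time of day; _time is now a plain string ```

The last eval gives you the hour-only labels in the statistics table, at the cost of losing the chartable time axis.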

 


sujata_nandi
Explorer

Hi,

Thank you for your help. I tried to work out your recommendation and the query now looks like this:

index="app_cleo_db"
origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count by DateOnly
| eval _time=strftime(_time, "%H:%M:%S")

But this is still giving me the times for both dates if I run my query over 2 days:

_time     2023-12-02  2023-12-03
00:00:00  0           0
01:00:00  0           0
02:00:00  0           0
03:00:00  0           0
00:00:00  0           0
01:00:00  0           0
02:00:00  0           0
03:00:00  1           0

PickleRick
SplunkTrust

No, wait.

The timechart works with time automatically. You don't add "by DateOnly" because then it will treat your DateOnly field as a categorizing field.

| timechart span=1h count by DateOnly

This counts how many events there are _for each value of the DateOnly field_ in each span (in your case, each hour).

See this run-anywhere example:

| makeresults count=2
| streamstats count
| eval _time=_time-count*7200
| fields - count

This will give you two timestamps - one two hours ago and one four hours ago.

If you simply do

| timechart span=1h count

You'll get a decent result showing you that for each of those hours you got one event. Which is OK.

But if you do something akin to what you did before, so that the whole example looks like this:

| makeresults count=2
| streamstats count
| eval _time=_time-count*7200
| fields - count
| eval DateHour=strftime(_time,"%H")
| timechart span=1h count by DateHour

Your results will turn into this (I ran this at 13:58):

_time             09  11
2023-12-05 09:00  1   0
2023-12-05 11:00  0   1

Because within the 9-10 hour you have one event with a DateHour of "09" and no occurrences of the value "11" there (hence the corresponding counts), and within the 11-12 hour you have counts of 0 and 1.

So if you want to have your timechart with the time formatted properly, you don't add the "by DateOnly" part.

You simply do

| timechart span=1h count

And only _then_ you format your time to the way you want to display it. For example

| fieldformat _time=strftime(_time,"%H")
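Putting those pieces together, a minimal run-anywhere sketch (again on fake makeresults data, so the numbers are only illustrative) looks like this:

| makeresults count=100
| eval _time=_time - (random() % 86400) ``` fake events spread over the last 24 hours ```
| timechart span=1h count ``` hourly buckets, no split-by field ```
| fieldformat _time=strftime(_time, "%H:%M") ``` display-only formatting; _time stays numeric underneath ```

Because fieldformat only changes how the value is rendered, the result can still be drawn as a chart.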

sujata_nandi
Explorer

Hi, I tried your solution but it didn't work. How do I modify the query to get the desired output?

For the below query:

index="app_cleo_db" 
origname="GEAC_Payroll*" 
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count by DateOnly

 

I am getting the below output:

time               2023-12-02  2023-12-03
2023-12-02 00:00  0           0
2023-12-02 01:00  0           0
2023-12-02 02:00  0           0
2023-12-02 03:00  0           0
2023-12-02 04:00  0           0
2023-12-02 05:00  0           0
2023-12-02 06:00  0           0
2023-12-02 07:00  1           0
2023-12-02 08:00  0           0
2023-12-02 09:00  0           0
2023-12-02 10:00  2           0
2023-12-02 11:00  1           0
2023-12-02 12:00  1           0
2023-12-02 13:00  1           0
2023-12-02 14:00  3           0
2023-12-02 15:00  4           0
2023-12-02 16:00  0           0
2023-12-02 17:00  0           0
2023-12-02 18:00  0           0
2023-12-02 19:00  0           0
2023-12-02 20:00  0           0
2023-12-02 21:00  0           0
2023-12-02 22:00  0           0
2023-12-02 23:00  0           0
2023-12-03 00:00  0           0
2023-12-03 01:00  0           0
2023-12-03 02:00  0           0
2023-12-03 03:00  0           0
2023-12-03 04:00  0           1
2023-12-03 05:00  0           3
2023-12-03 06:00  0           202
2023-12-03 07:00  0           52
2023-12-03 08:00  0           141
2023-12-03 09:00  0           188
2023-12-03 10:00  0           256
2023-12-03 11:00  0           185
2023-12-03 12:00  0           121
2023-12-03 13:00  0           52
2023-12-03 14:00  0           32
2023-12-03 15:00  0           9
2023-12-03 16:00  0           0
2023-12-03 17:00  0           0
2023-12-03 18:00  0           0
2023-12-03 19:00  0           0
2023-12-03 20:00  0           0
2023-12-03 21:00  0           0
2023-12-03 22:00  0           0
2023-12-03 23:00  0           0

but I want output like below, where the rows stay fixed at 00:00 to 23:00:

time   2023-12-02  2023-12-03     
00:00  0           0
01:00  0           0
02:00  0           0
03:00  0           0
04:00  0           1
05:00  0           3
06:00  0           202
07:00  1           52
08:00  0           141
09:00  0           188
10:00  2           256
11:00  1           185
12:00  1           121
13:00  1           52
14:00  3           32
15:00  4           9
16:00  0           0
17:00  0           0
18:00  0           0
19:00  0           0
20:00  0           0
21:00  0           0
22:00  0           0
23:00  0           0

 


PickleRick
SplunkTrust

Again - don't use the "by DateOnly" clause.

Do a normal timechart and then - if you want to wrap it by day, use the timewrap command.
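A minimal sketch of that idea on fake data (adjust the span and wrap period to what you actually need):

| makeresults count=200
| eval _time=_time - (random() % 172800) ``` fake events over the last 2 days ```
| timechart span=1h count ``` one row per hour ```
| timewrap 1day ``` one column per wrapped day; the rows now represent hour of day ```

timewrap turns each wrapped period into its own column, which is the "dates across the top, hours down the side" layout you described.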


sujata_nandi
Explorer

I removed the "by DateOnly" clause and used the timewrap command. It gives me the output for the last 24 hours correctly; however, I only receive files on the weekends, so if I use this command it gives me too many unwanted fields with no values.


PickleRick
SplunkTrust

You can do

| timechart span=1h count
| where count>0
| timewrap 1day

to filter out the "empty" results.


sujata_nandi
Explorer

Hi,

Thank you very much for your help. Below is the final query and it is giving me the required output; however, I am not able to open the events in a separate tab.

index="app_cleo_db"
origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count
| where count>0
| timewrap series=exact time_format="%d-%m-%Y" 1day
| eval _time=strftime(_time, "%H:%M:%S")
| sort _time
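A variant of the last few lines (just a sketch, not tested against this data) would use fieldformat instead of eval, as suggested above, so that _time stays a real timestamp:

| timechart span=1h count
| where count>0
| timewrap series=exact time_format="%d-%m-%Y" 1day
| fieldformat _time=strftime(_time, "%H:%M:%S") ``` display-only; _time remains numeric, so sort _time still sorts chronologically ```
| sort _time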


