
Issue with Splunk query

uagraw01
Motivator

I am working on the query below, in which I want to calculate the lead_time in HH:MM format. The query gives results in statistics mode but returns nothing as a line chart. Please help me fix it.

Results in statistics mode:

[screenshot]

No results are shown when using the line chart visualization:

[screenshot]

Below is the complete query:

index=abc
| eval completion_time=strptime(COMPLETED_TIMESTAMP, "%Y-%m-%dT%H:%M:%S.%3QZ")
| stats count by completion_time FULFILLMENT_START_TIMESTAMP _time
| eval lead_time = (completion_time - FULFILLMENT_START_TIMESTAMP)
| eval hours=floor(lead_time / 3600)
| eval minutes=floor((lead_time % 3600) / 60)
| eval formatted_minutes=if(minutes < 10, "0" . minutes, minutes)
| eval HH_MM = hours . ":" . formatted_minutes
| timechart max(HH_MM) as "Maximum" avg(HH_MM) as "Average" min(HH_MM) as "Minimum"
| eval hours=floor(Maximum / 3600)
| eval minutes=floor((Maximum % 3600) / 60)
| eval formatted_minutes=if(minutes < 10, "0" . minutes, minutes)
| eval max_HH_MM = hours . ":" . formatted_minutes
| eval hours=floor(Average / 3600)
| eval minutes=floor((Average % 3600) / 60)
| eval formatted_minutes=if(minutes < 10, "0" . minutes, minutes)
| eval avg_HH_MM = hours . ":" . formatted_minutes
| eval hours=floor(Minimum / 3600)
| eval minutes=floor((Minimum % 3600) / 60)
| eval formatted_minutes=if(minutes < 10, "0" . minutes, minutes)
| eval min_HH_MM = hours . ":" . formatted_minutes
| table _time max_HH_MM avg_HH_MM min_HH_MM


ITWhisperer
SplunkTrust

The values need to be numeric, e.g. a number of minutes; you can't use string values such as HH:MM to display on a chart.
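For example, here is a throwaway sketch using synthetic data (makeresults rather than your index, with made-up numbers) just to show that a plain numeric field charts fine where a string like "12:34" would not:

| makeresults count=24
| streamstats count
| eval _time=_time - count*3600
| eval lead_time_minutes=count*25
| timechart span=1h max(lead_time_minutes) as "Lead time (min)"

Switch the visualization to a line chart and the series renders, because the values are numbers.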


uagraw01
Motivator

@ITWhisperer Do you mean like this?

| eval max_HH_MM = tostring(max_HH_MM)
| eval avg_HH_MM = tostring(avg_HH_MM)
| eval min_HH_MM = tostring(min_HH_MM)

ITWhisperer
SplunkTrust

No, that is the complete opposite of what I am saying!

index=abc
| eval completion_time=strptime(COMPLETED_TIMESTAMP, "%Y-%m-%dT%H:%M:%S.%3QZ")
| stats count by completion_time FULFILLMENT_START_TIMESTAMP _time
| eval lead_time = (completion_time - FULFILLMENT_START_TIMESTAMP)
| timechart max(lead_time) as "Maximum" avg(lead_time) as "Average" min(lead_time) as "Minimum"

Keep the values as numeric differences between timestamps; if you want, you could divide the values by 60 to get minutes.
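For example, appended to the timechart above (a sketch only; rounding to 2 decimal places is just a choice):

| timechart max(lead_time) as "Maximum" avg(lead_time) as "Average" min(lead_time) as "Minimum"
| foreach Maximum Average Minimum [ eval <<FIELD>>=round('<<FIELD>>'/60, 2) ]

The fields stay numeric (now in minutes), so the line chart can plot them.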

uagraw01
Motivator

@ITWhisperer 

index=wma_bext TYPE=FULFILLMENT_REQUEST STATUS="Marshalling"
| eval completion_time=strptime(COMPLETED_TIMESTAMP, "%Y-%m-%dT%H:%M:%S.%3QZ")
| stats count by completion_time FULFILLMENT_START_TIMESTAMP _time
| eval lead_time = (completion_time - FULFILLMENT_START_TIMESTAMP)
| timechart max(lead_time) as "Maximum" avg(lead_time) as "Average" min(lead_time) as "Minimum"
| foreach Maximum Average Minimum [ eval <<FIELD>>_hours=round('<<FIELD>>'/3600, 2), <<FIELD>>_minutes=round('<<FIELD>>'/60, 2) ]



When I use the above query and then combine hours and minutes again, e.g. with | eval max_d = Maximum_hours . ":" . Maximum_minutes, the value turns back into a string. Please suggest how I can show my results in HH:MM format for maximum, average, and minimum.

Below are the results I currently get with the above query.

[screenshot]


ITWhisperer
SplunkTrust
| foreach Maximum Average Minimum [ eval <<FIELD>>_duration=tostring(<<FIELD>>,"duration") ]

However, as I said before, you can't use these duration fields on a chart as they are strings, not numbers.
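If you still want the HH:MM:SS text available alongside the chart, one pattern (a sketch only, not tested against your data) is to keep the plotted fields numeric, here converted to minutes, and carry the duration strings as extra columns:

| timechart max(lead_time) as "Maximum" avg(lead_time) as "Average" min(lead_time) as "Minimum"
| foreach Maximum Average Minimum [ eval <<FIELD>>_duration=tostring('<<FIELD>>',"duration"), <<FIELD>>=round('<<FIELD>>'/60, 2) ]

The line chart should plot only the numeric Maximum/Average/Minimum series; the *_duration strings are only readable in the Statistics tab.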


uagraw01
Motivator

@ITWhisperer 
Is there any alternative to show this in a line chart?

[screenshot]


ITWhisperer
SplunkTrust

Wouldn't you think if I knew of another way I would have mentioned it?

You can't use strings for values in charts.

uagraw01
Motivator

@ITWhisperer Thanks for your help and suggestion.
