Dashboards & Visualizations

How do I create a line chart showing the count for the current week, prior week, and the week before that?

SG
Path Finder

Hi Team,

I am trying to chart the trend of the webpage test speed index in Splunk, showing the speed index performance for the current week, last week, and the week before that. I developed the query below to do this, but I am not getting the trend and the query is giving an error. Can someone please help me draw the trend?

 

index=nextgen sourcetype=lighthouse_json sourcetype=lighthouse_json datasource=webpagetest step="Homepage"
| eval myTime=case(test >= relative_time(now(), "-7d"), "CurrentWeek", test >= relative_time(now(), "-14d") AND test < relative_time(now(), "-7d", "PriorWeek", test >= relative_time(now(), "-21d") AND test < relative_time(now(), "-14d", "ThirdWeek", 1=1, "Other")
| stats  values(speedindex) as score by myTime 
| eval fast = if(score>0 AND score<2000,score,0)  
| eval moderate = if(score>=2000 AND score<2999,score,0) 
| eval slow = if(score>=3000,score,0)
| fields - score
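
The error mentioned above most likely comes from the case() expression: the second and third relative_time() calls are missing their closing parentheses, so "PriorWeek" and "ThirdWeek" end up inside the function calls, and the comparisons use test where the working version later in this thread uses _time. The corrected eval, as it appears further down, is:

| eval myTime=case(_time >= relative_time(now(), "-7d"), "CurrentWeek", _time >= relative_time(now(), "-14d") AND _time < relative_time(now(), "-7d"), "PriorWeek", _time >= relative_time(now(), "-21d") AND _time < relative_time(now(), "-14d"), "ThirdWeek", 1=1, "Other")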

 


ITWhisperer
SplunkTrust
| stats  values(speedindex) as score by myTime 

This returns a multi-value field of all the unique scores in the time period. Is that what you were expecting?
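
A quick way to see that is to count how many values land in each row; each myTime bucket will typically carry many scores (a sketch, appended to the stats line quoted above):

| stats values(speedindex) as score by myTime
| eval score_count=mvcount(score)

A line or column chart has nothing single-valued to plot for such a cell, which is consistent with the empty visualization panel described below.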


SG
Path Finder

I modified the query to get the speed index values per week:

index=nextgen sourcetype=lighthouse_json sourcetype=lighthouse_json datasource=webpagetest step="Homepage" earliest=-7d@d
| eval myTime=case(_time >= relative_time(now(), "-7d"), "CurrentWeek", _time >= relative_time(now(), "-14d") AND _time < relative_time(now(), "-7d"), "PriorWeek", _time >= relative_time(now(), "-21d") AND _time < relative_time(now(), "-14d"), "ThirdWeek", 1=1, "Other")
| stats  values(speedindex) as score by myTime

Now I am getting data in the statistics panel as below.

[Screenshot: statistics panel showing the score values by myTime]

But I am not getting any trend in the visualization panel.

[Screenshot: visualization panel with no trend displayed]

 


SG
Path Finder

I found out that the "values" function returns its results as a string (a multi-value field), which is why I am not getting any trend.

index=nextgen sourcetype=lighthouse_json sourcetype=lighthouse_json datasource=webpagetest step="Homepage" earliest=-7d@d
| eval myTime=case(_time >= relative_time(now(), "-7d"), "CurrentWeek", _time >= relative_time(now(), "-14d") AND _time < relative_time(now(), "-7d"), "PriorWeek", _time >= relative_time(now(), "-21d") AND _time < relative_time(now(), "-14d"), "ThirdWeek", 1=1, "Other")
| stats  values(speedindex) as score by myTime

Can you please tell me which function I can use instead of "values" to get the trend?
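
Any statistical function that returns a single number per group will chart as a trend. For example, averaging the speed index per week (avg is an assumed choice of summary here, and earliest is widened to -21d@d so that all three weeks are actually searched):

index=nextgen sourcetype=lighthouse_json datasource=webpagetest step="Homepage" earliest=-21d@d
| eval myTime=case(_time >= relative_time(now(), "-7d"), "CurrentWeek", _time >= relative_time(now(), "-14d") AND _time < relative_time(now(), "-7d"), "PriorWeek", _time >= relative_time(now(), "-21d") AND _time < relative_time(now(), "-14d"), "ThirdWeek", 1=1, "Other")
| stats avg(speedindex) as score by myTime

median(speedindex) or latest(speedindex) would work the same way if an average is not the right summary.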


SG
Path Finder

I was able to get the data week-wise using the timewrap command in Splunk.

 

index=nextgen sourcetype=lighthouse_json datasource=webpagetest step="Homepage" earliest=-21d@d
| timechart span=1h list(speedindex) as score
| timewrap w

Source - https://docs.splunk.com/Documentation/Splunk/8.2.0/SearchReference/Timewrap
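
One caveat on this query: list(speedindex) is also a multi-value aggregation, so if the wrapped chart shows clusters of points rather than one clean line per week, a single-value function per time bucket (avg here, again an assumed choice) keeps the week-over-week comparison:

index=nextgen sourcetype=lighthouse_json datasource=webpagetest step="Homepage" earliest=-21d@d
| timechart span=1h avg(speedindex) as score
| timewrap w

timewrap then splits the single series into one column per week, which draws as overlaid lines for the current week, the prior week, and the week before that.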

 
