I have a Splunk search that generates a table of errors, where each row is an error entry. I would like to compare the entries across two time periods (24-48 hours ago vs. the last 24 hours) to spot spikes in errors. Is this possible with Splunk? Thanks
Try something like this
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=now | dedup client_ip | eval Period=if(_time<relative_time(now(),"-24h@h"),"Prior","Current")| chart count over Error by Period
This should give you 3 columns: Error, Current (count for the last 24 hours), and Prior (count for 48 to 24 hours ago).
You can then compare the data in the Current and Prior columns, and rename the columns to something more suitable for you.
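If you want the comparison done in the search itself rather than by eye, you could extend the chart above with an eval that computes the difference between the two periods. This is just a sketch: the `delta` field name is my own choice, and `fillnull` is added on the assumption that some errors may appear in only one period (which would otherwise leave null counts):

```
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=now
| dedup client_ip
| eval Period=if(_time<relative_time(now(),"-24h@h"),"Prior","Current")
| chart count over Error by Period
| fillnull value=0 Current Prior
| eval delta=Current-Prior
| sort - delta
```

Sorting descending on `delta` puts the errors with the biggest spikes at the top of the table.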
Update#1
Try this
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=-24h@h | dedup client_ip | stats count by Error| eval Period="Prior" | append [search index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-24h@h latest=now | dedup client_ip | stats count by Error | eval Period="Current" ] | chart sum(count) over Error by Period
Hi somesoni2,
thank you for your suggestion, it works great. I have one last question: in some of my panels I like to use the statistics table, but the rows aren't sorted. I tried | sort count or | sort sum(count), but neither seems to work. Do you know how I can get them sorted? thanks
Thanks for the suggestion. I tried your query, but it yielded no results.
It works now. There was a minor typo in the second part of the query: I needed to add the * around wtt_err. Thank you very much
Strange... it should have worked if the base search and the field names are correct. Can you provide a sample working query for the same data?
Here is a working query:
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=-24h@h | dedup client_ip | stats count by Error | sort count +
Thank you for your help
Hi,
try two searches joined with the appendcols command. Just complete the following search skeleton:
............. earliest=-(24-48)h | ..............|appendcols[search ............. earliest=-24h@h | ..............]
So I tried
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=-24h@h | dedup client_ip | stats count by Error | sort count + | appendcols [search index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-24h@h latest=-0h@h | dedup client_ip | stats count by Error | sort count +]
but it only shows the data from the first query.
Can you send your query?
Sorry, here is the query I tried to use. Thanks
index=tto* sourcetype=access "Get wtt_err HTTP/1.1" NOT Error=111 earliest=-48h@h latest=-24h@h | dedup client_ip | stats count by Error | sort count +