Dashboards & Visualizations

How to make transaction startswith and endswith produce the proper results?

anooshac
Communicator

Hi all, does anyone know if there's a way to make a transaction start and end with the proper results?

I have a transaction like this: | transaction URL startswith="STATUS=FAIL" endswith="STATUS=PASS"

The data has a pattern like FAIL, PASS, FAIL, PASS, PASS, FAIL, FAIL, FAIL, PASS, ...

The transaction command doesn't handle this well.

My requirement is to get the PASS URL that immediately follows the FAILs.

In a situation like FAIL ... FAIL, PASS, the transaction only takes the last FAIL with the PASS. I want it to take the whole run, from the first FAIL through the PASS.

Does anyone know how to do this?


ITWhisperer
SplunkTrust

Try something like this to get the first FAIL of each group of FAILs, plus the PASSes:

| streamstats count as start reset_on_change=true by status
| where start=1 OR status="PASS"

anooshac
Communicator

I tried this and it gives the last FAIL from the group of FAILs. I want the first FAIL URL of the pattern FAIL ... PASS.


ITWhisperer
SplunkTrust

Perhaps the events need to be sorted by time?

| sort 0 _time
| streamstats count as start reset_on_change=true by status
| where start=1 OR status="PASS"

anooshac
Communicator

Thank you, I was able to get the data. But now the list also includes all the URLs that are SUCCESS on their own. How can I avoid that? How can I group each FAIL with its immediate SUCCESS URL? I want to use the data from those URLs.


ITWhisperer
SplunkTrust

There might be a more efficient way to do this, but this might work for you:

| streamstats count as start reset_on_change=true by status
| where start=1
| streamstats count(eval(status=="FAIL")) as fails by status
| eval fails=if(fails==0,null(),fails)
| filldown fails
| stats values(*) as * by fails
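To see what this pipeline does, here is a rough Python sketch of the same logic (an illustration only, not Splunk code; `group_fail_pass` is a hypothetical helper, and the input is the FAIL/PASS example sequence from the question):

```python
def group_fail_pass(statuses):
    # Step 1: keep only the first event of each consecutive run of the
    # same status (streamstats count reset_on_change=true + where start=1).
    collapsed = [s for i, s in enumerate(statuses)
                 if i == 0 or s != statuses[i - 1]]
    # Step 2: number the FAILs and carry that number forward onto the
    # following events (streamstats count of FAILs + filldown), then
    # collect events by that number (stats ... by fails).
    groups, fail_no = {}, 0
    for s in collapsed:
        if s == "FAIL":
            fail_no += 1
        if fail_no:  # events before the first FAIL get no group
            groups.setdefault(fail_no, []).append(s)
    return list(groups.values())

print(group_fail_pass(
    ["FAIL", "PASS", "FAIL", "PASS", "PASS", "FAIL", "FAIL", "FAIL", "PASS"]))
# Each group pairs the first FAIL of a run with its following PASS.
```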

anooshac
Communicator

Thanks a lot, this is working. But one pair has the wrong data: the FAIL one has the data of the PASS URL.


ITWhisperer
SplunkTrust

I am not sure I understand. Is it just that the data from the two events is sometimes in the "wrong" order? If so, use list instead of values:

| stats list(*) as * by fails

anooshac
Communicator

[screenshot: 2022-03-04_10h23_44.png]

Hi, I have one doubt. My data has some URLs whose status is "ABORTED". While grouping, these also get grouped. How do I avoid this? I only want to group a FAIL URL with its immediate next PASS URL. It is also listing a FAIL even when there is no next PASS URL after it. How can I avoid that? @ITWhisperer


ITWhisperer
SplunkTrust
| where status!="ABORTED"
| streamstats count as start reset_on_change=true by status
| where start=1
| streamstats count(eval(status=="FAILURE")) as fails by status
| eval fails=if(fails==0,null(),fails)
| filldown fails
| stats list(*) as * by fails
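In Python terms, the new first step just drops the ABORTED events before the runs are collapsed. A minimal sketch (the event list is made up for illustration; the real statuses come from your data):

```python
events = ["FAILURE", "ABORTED", "SUCCESS", "FAILURE", "FAILURE", "SUCCESS"]

# | where status!="ABORTED"
kept = [s for s in events if s != "ABORTED"]

# keep the first event of each consecutive run (the streamstats/where pair)
collapsed = [s for i, s in enumerate(kept) if i == 0 or s != kept[i - 1]]

print(collapsed)
# The ABORTED event no longer splits the FAILURE/SUCCESS pairing.
```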

anooshac
Communicator

[screenshot: 2022-03-04_13h18_07.png]

Yes, I got it. Is it possible to avoid the FAIL coming through on its own? There are no PASS URLs after it, but it still appears like that. Is there any way? I tried and was not able to remove it.


ITWhisperer
SplunkTrust
| where mvcount(status) = 2
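That filter keeps only groups where both statuses were collected, so a trailing lone FAIL is dropped. As a minimal Python illustration (the grouped data here is hypothetical):

```python
# Groups as collected by | stats list(*) ... : the last one is a FAIL
# with no following PASS.
groups = [["FAIL", "PASS"], ["FAIL", "PASS"], ["FAIL"]]

# mirrors | where mvcount(status) = 2
complete = [g for g in groups if len(g) == 2]

print(complete)
```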

anooshac
Communicator

Hi @ITWhisperer, I have many URLs, so I want to get the details for all the URLs in the data. I tried like this, but it is not grouping the same URLs.

| where status!="ABORTED"
| streamstats count as start reset_on_change=true by status URL
| where start=1
| streamstats count(eval(status=="FAILURE")) as fails by status URL
| eval fails=if(fails==0,null(),fails)
| filldown fails
| stats list(*) as * by fails URL
| where mvcount(status) = 2

Can you please help me find what is wrong in this query?


ITWhisperer
SplunkTrust

I don't understand what it is you are trying to do with these changes. Can you raise a new question explaining this new requirement?


anooshac
Communicator

Thank you so much!


anooshac
Communicator

Can you please help me! @ITWhisperer 
