Reporting

Using Splunk for data science to count events between sequential stages in a Process

cindygibbs_08
Communicator

Alright guys, I hope you are ready for this question because I almost lost my mind! By the way, THANK YOU SO MUCH FOR ALL THE HELP!

I have been working on this problem for WEEKS and I have to kindly ask for your help. I am now helping out a company that uses Splunk for their IoT data. They are a welding company and want me to use SPL to count the number of events (alerts) between sequential stages of a 3-stage process, so let me break it down for you.

Information about the process: a test subject goes through a 3-stage process with stages A, B, and C, in that order. A test subject may abandon the process at stage A or B and then start again from stage A. Each time a stage takes place, an event is recorded with the test subject's IDENTIFICATION, the TIMESTAMP at which the stage took place, and a unique VISIT_CODE. During any stage, a test subject may trigger an ALERT, which is recorded with the TIMESTAMP, ALERT_CODE, and test subject IDENTIFICATION.

What I need: to count how many ALERTS were generated by the test subjects between stages A and B, between stages B and C, and after stage C. Please note that a test subject may at some point abandon the process and later start again from stage A.

To get the data from the process I do this:

 

index=bearing_P1 source=PROBES
| table *

 

 

and I get

STAGE   TEST SUBJECT   TIMESTAMP   VISIT_CODE
A       XYU-1          10          BKO
A       XYU-1          15          JUJD
B       XYU-1          20          DUDH
A       FF-09          25          KSIWJD
B       FF-09          30          AJAKAM
C       FF-09          35          ZISKS
A       UU-89          40          NNXJD
B       UU-89          45          DDUWO
A       I-44           50          JIWIW
A       W-6            55          SHDN
B       W-6            60          IWOLS
C       W-6            65          JDDD
A       U-90           70          DJDKSMS
B       U-90           75          NDJSM
A       T-87           80          DNDJDK

 

and for the triggered alerts I use

 

index=alerts source=probes_w1
| table *

 

 

and I get

TEST SUBJECT   TIMESTAMP   ALERT_CODE
XYU-1          11          AYUJ-151571406
XYU-1          12          AYUJ-487008829
XYU-1          28          AYUJ-211990388
FF-09          32          AYUJ-4177221842
W-6            56          AYUJ-1300211351
W-6            63          AYUJ-3014305494
I-44           67          AYUJ-4454800551
U-90           73          AYUJ-1079921935
U-90           76          AYUJ-3348911727
U-90           79          AYUJ-2381219626
T-87           82          AYUJ-4778326278
W-6            89          AYUJ-3915716168

 

I want to be able to achieve something like this:

Alerts between stages A & B*   Alerts between stages B & C**   Alerts after stage C
AYUJ-151571406                 AYUJ-211990388                  AYUJ-3915716168
AYUJ-487008829                 AYUJ-3014305494
AYUJ-1300211351                AYUJ-3348911727
AYUJ-1079921935                AYUJ-4177221842
AYUJ-4778326278                AYUJ-2381219626
AYUJ-4454800551

*  including alerts from test subjects that abandoned the process in the nth attempt at stage A
** including alerts from test subjects that abandoned the process in the nth attempt at stage B

 

I know this may seem impossible, but if there is a way to do this in Splunk, let's say for a period of one year, that would be so great. I have tried autoregress and a bunch of other commands, but I have not gotten even an inch close to my desired output, plus I fear that if I do, at some point the data will truncate... Thank you so much to everyone who can point me in the right direction.

 

kindly,
Cindy

1 Solution

ITWhisperer
Legend
| makeresults
| eval _raw="STAGE	TEST SUBJECT	TIMESTAMP	VISIT_CODE
A	XYU-1	10	BKO
A	XYU-1	15	JUJD
B	XYU-1	20	DUDH
A	FF-09	25	KSIWJD
B	FF-09	30	AJAKAM
C	FF-09	35	ZISKS
A	UU-89	40	NNXJD
B	UU-89	45	DDUWO
A	I-44	50	JIWIW
A	W-6	55	SHDN
B	W-6	60	IWOLS
C	W-6	65	JDDD
A	U-90	70	DJDKSMS
B	U-90	75	NDJSM
A	T-87	80	DNDJDK
A	Z-99	85	ZJDKSMS
B	Z-99	90	ZDJSM
A	Z-99	95	ZNDJDK"
| multikv forceheader=1
| eval _time=_time+TIMESTAMP
| append 
    [| makeresults
    | eval _raw="TEST SUBJECT	TIMESTAMP	ALERT_CODE
XYU-1	11	AYUJ-151571406
XYU-1	12	AYUJ-487008829
XYU-1	28	AYUJ-211990388
FF-09	32	AYUJ-4177221842
W-6	56	AYUJ-1300211351
W-6	63	AYUJ-3014305494
I-44	67	AYUJ-4454800551
U-90	73	AYUJ-1079921935
U-90	76	AYUJ-3348911727
U-90	79	AYUJ-2381219626
T-87	82	AYUJ-4778326278
W-6	89	AYUJ-3915716168
Z-99	86	ZYUJ-3915716168
Z-99	92	ZYUJ-4778326278
Z-99	98	ZYUJ-2381219626"
    | multikv forceheader=1
    | eval _time=_time+TIMESTAMP
    ]
| fields - linecount _raw



| sort _time
| eval A=if(STAGE="A",1,0)
| streamstats sum(A) as A by TEST_SUBJECT
| eval B=if(STAGE="B",1,0)
| streamstats sum(B) as B by TEST_SUBJECT A
| eval C=if(STAGE="C",1,0)
| streamstats sum(C) as C by TEST_SUBJECT A B
| eval stage=case(C>0,"After C",B>0,"Between B and C",A>0,"Between A and B")
| stats list(ALERT_CODE) as alerts by stage
| transpose 0 header_field=stage
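For readers who want to see the bucketing logic outside SPL, here is a minimal Python sketch of what the `streamstats` counters above compute: walk all events in time order, keep per-subject running counts of A/B/C stage events (resetting later-stage counters whenever an earlier stage recurs, which is what partitioning the sums `by TEST_SUBJECT A` and `by TEST_SUBJECT A B` achieves), and bucket each alert by the furthest stage its subject has reached so far. The function name `bucket_alerts` and the tuple layout are illustrative assumptions, not part of any Splunk API; the sample data is a subset of the events in the question.

```python
from collections import defaultdict

def bucket_alerts(events):
    """Bucket each alert by where its subject is in the A -> B -> C process.

    events: iterable of (time, subject, stage, alert_code) tuples; stage
    events carry a stage in {"A", "B", "C"} and alert_code=None, while
    alert events carry stage=None and an alert code.
    """
    counts = defaultdict(lambda: {"A": 0, "B": 0, "C": 0})
    buckets = {"Between A and B": [], "Between B and C": [], "After C": []}
    for _, subject, stage, alert in sorted(events, key=lambda e: e[0]):
        c = counts[subject]
        if stage == "A":            # new attempt: reset later-stage counters
            c["A"] += 1
            c["B"] = c["C"] = 0
        elif stage == "B":
            c["B"] += 1
            c["C"] = 0
        elif stage == "C":
            c["C"] += 1
        elif alert is not None:     # bucket by the furthest stage seen so far
            if c["C"] > 0:
                buckets["After C"].append(alert)
            elif c["B"] > 0:
                buckets["Between B and C"].append(alert)
            elif c["A"] > 0:
                buckets["Between A and B"].append(alert)
    return buckets

# A subset of the sample data from the question (subjects XYU-1 and W-6).
sample = [
    (10, "XYU-1", "A", None), (11, "XYU-1", None, "AYUJ-151571406"),
    (12, "XYU-1", None, "AYUJ-487008829"), (15, "XYU-1", "A", None),
    (20, "XYU-1", "B", None), (28, "XYU-1", None, "AYUJ-211990388"),
    (55, "W-6", "A", None), (56, "W-6", None, "AYUJ-1300211351"),
    (60, "W-6", "B", None), (63, "W-6", None, "AYUJ-3014305494"),
    (65, "W-6", "C", None), (89, "W-6", None, "AYUJ-3915716168"),
]
print(bucket_alerts(sample))
```

Note the subject who abandons and restarts (XYU-1 has two A events before its B) is handled the same way as in the SPL: the second A starts a fresh attempt, and the alert at time 28 still lands in "Between B and C" because the B stage has been reached by then.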

View solution in original post


cindygibbs_08
Communicator

@ITWhisperer  a true legend
