Splunk Search

How to ensure a transaction with an update finishes before another transaction with the same input.

New Member

In Splunk I have an application that updates a database. Currently there's an issue where transaction1 is received, then transaction2 is received, transaction2 finishes, and then transaction1 finishes.
Is there a way to get a list of transactions where this problem occurred? I need to get the main field of those transactions so I can "refresh" the data.
Here's a thought on how to explain the issue. I know it doesn't work, and I'm aware I'm mixing in other languages, but it's just to demonstrate the logic I want to implement; I need guidance to solve this:

index=something source=source
| eval table=[search index=something source=source
| stats min(_time) AS earliest max(_time) AS latest by TransactionID
| table MainField TransactionID earliest latest
| sort earliest MainField]
| eval Field=if(table.row(x).MainField=table.row(x+1).MainField AND table.row(x+1).latest>table.row(x).latest,MainField,0)
| table Field


Re: How to ensure a transaction with an update finishes before another transaction with the same input.

Champion

It would be helpful to see what your events look like. I'm a bit confused about what "mainfield" is, and whether that is how you determine if transaction1 and transaction2 are updating the same thing?

Nonetheless, this seems like a situation where I would typically use autoregress. It allows you to copy values of a field from previous events into the current event. So if you sort your results the right way, you could copy the previous mainfield into the current event, compare the two, and filter as needed.
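As a minimal sketch of that idea (the index/source and the field name MainField are placeholders, not taken from your data):

```spl
index=something source=source
| sort MainField _time
| autoregress MainField as prev_MainField p=1
| where MainField=prev_MainField
```

After the autoregress, each row carries the previous row's MainField in prev_MainField, so the where clause keeps only consecutive events that touched the same thing; from there you can compare timestamps however your problem requires.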


Re: How to ensure a transaction with an update finishes before another transaction with the same input.

New Member

The following event is part of a transaction:

1529096358957,INFO,ENIService,STATSLOGENI,extractMessage-end,1529096358956,2018-06-15 16:59:18.956 EDT,8245125380144611,,,,,SS UPD ACCOUNT,,MDU6MDAwMjMyODc6MjAxODA2MTU6MTQ1OTE2MDA6ODI0NTEyMDA6TE9SVTowOQ==,612db256-5000-8026-434f-1e870def3580,56e51303-8dbc-44eb-8822-d6d32ee0d189,E41SS,2018-06-15T14:59:16 14:59:16,AccountUpdated,CBI L30V,PC13,,OPCSGSLBWCF19,824512005380,NodeNum,,,SSAccountENI,19,2518374

In here, I can get
TransactionID:
56e51303-8dbc-44eb-8822-d6d32ee0d189
And the MainField:
8245125380144611
Because this is an update for an account, the MainField is the accountID, and if any other transaction that is AccountUpdated has the same MainField, that means they are updating the same account.
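For reference, one way to pull those two values out of a comma-delimited event like the one above is to split _raw on commas. This is only a sketch: the positional indexes 7 and 16 come from counting fields in the single sample event and may not be stable across event types.

```spl
index=something source=source
| eval parts=split(_raw, ",")
| eval MainField=mvindex(parts, 7)
| eval TransactionID=mvindex(parts, 16)
| table MainField TransactionID
```

A proper field extraction (props/transforms or rex) would be more robust than positional splitting if the log format is fixed.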

I'll take a look at autoregress.

Thank you!!


Re: How to ensure a transaction with an update finishes before another transaction with the same input.

Champion

Not sure if this tracks exactly to your data, but maybe something along these lines. Essentially: group results by main event, getting the start and end times for each transaction; sort those results by main event and start time; then use autoregress to add info from the previous row to the current row and filter from there.

you might be able to use the transaction command too, but that one really isn't in my wheelhouse

index=whatever
| stats values(start) as start values(end) as end by mainEvent TransactionId
| sort mainEvent start
| autoregress end as prev_end p=1
| autoregress start as prev_start p=1
| autoregress mainEvent as prev_mainEvent p=1
| where mainEvent=prev_mainEvent AND start > prev_start AND end < prev_end
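Adapting that pattern to the sample event shown earlier might look like the sketch below. The index/source, the AccountUpdated filter, and the field names MainField and TransactionID are assumptions based on this thread, not verified against the real data:

```spl
index=something source=source AccountUpdated
| stats min(_time) as start max(_time) as end by MainField TransactionID
| sort MainField start
| autoregress start as prev_start p=1
| autoregress end as prev_end p=1
| autoregress MainField as prev_MainField p=1
| where MainField=prev_MainField AND start > prev_start AND end < prev_end
| table MainField TransactionID start end
```

Because rows are sorted by start time, the where clause keeps each transaction that started after, but finished before, the preceding transaction on the same account, i.e. the later-received update whose result got overwritten. The MainField column of those rows is the list of account IDs to refresh.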