Dashboards & Visualizations

Extract fields from transaction

nadeige1
Engager

Hello,

 

I have logs in this format, and I want to count the number of occurrences of each error code by SNR.

 

2021-01-25 12:59:18,355 - [INFO] SNR: 917173
2021-01-25 12:59:21,868 - [INFO] 0x100:S_Home
2021-01-25 12:59:22,312 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:22,314 - [INFO] 0x154:S_VACON
2021-01-25 12:59:22,316 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:22,629 - [INFO] new file cycle: 17320
2021-01-25 12:59:23,141 - [INFO] 0x154:S_VACON
2021-01-25 12:59:23,142 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:23,741 - [INFO] 0x154:S_VACON
2021-01-25 12:59:23,742 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:25,645 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:25,646 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:25,721 - [INFO] 0x100:S_Home
2021-01-25 12:59:27,095 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:27,102 - [INFO] 0x154:S_VACON
2021-01-25 12:59:27,104 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:27,425 - [INFO] new file cycle: 17321
2021-01-25 12:59:27,952 - [INFO] 0x154:S_VACON
2021-01-25 12:59:27,953 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:28,856 - [INFO] 0x154:S_VACON
2021-01-25 12:59:28,857 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:30,450 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:30,451 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:30,504 - [INFO] 0x100:S_Home
2021-01-25 12:59:31,624 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:31,625 - [INFO] 0x154:S_VACON
2021-01-25 12:59:31,627 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:31,951 - [INFO] new file cycle: 17322
2021-01-25 12:59:32,478 - [INFO] 0x154:S_VACON
2021-01-25 12:59:32,479 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:33,432 - [INFO] 0x154:S_VACON
2021-01-25 12:59:33,433 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:34,920 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:34,922 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:34,993 - [INFO] 0x100:S_Home
2021-01-25 12:59:36,321 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:36,325 - [INFO] 0x154:S_VACON
2021-01-25 12:59:36,327 - [INFO] 0x152:S_MQLOFF

2021-01-25 12:59:18,355 - [INFO] ADU identified SNR: 917175
2021-01-25 12:59:37,190 - [INFO] 0x154:S_VACON
2021-01-25 12:59:37,190 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:38,157 - [INFO] 0x154:S_VACON
2021-01-25 12:59:38,158 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:39,674 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:39,676 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:39,742 - [INFO] 0x100:S_Home
2021-01-25 12:59:40,902 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:40,904 - [INFO] 0x154:S_VACON
2021-01-25 12:59:40,906 - [INFO] 0x152:S_MQLOFF

In this case, this is the result I expect:

Error code 0x130: 4 occurrences for SNR 917173, 1 occurrence for SNR: 917175

Error code 0x154: 7 occurrences for SNR 917173, 3 occurrences for SNR: 917175

etc.

I use the transaction command in order to group the events:

transaction startswith="SNR"

But when I do this, the number of error codes is reduced to the number of transactions, which is not what I want. Do you have an idea how to reach the result I expect?

1 Solution

jacobpevans
Motivator

Greetings @nadeige1 ,

I have a few points to make.

First, avoid transaction when it is not necessary. Splunk explicitly recommends this, because transaction is less efficient than other commands that can often produce the same result; see here.

Second, I need to make two assumptions.

  1. The line 2021-01-25 12:59:18,355 - [INFO] ADU identified SNR: 917175 is a copy/paste from the first SNR line, and the timestamp should actually be more like 2021-01-25 12:59:36,777 so that your sample data is in _time sorted order.
  2. Error code 0x154: 7 occurrences for SNR 917173 is a miscalculation. It should really be 10.

If those assumptions are correct, here is a run-anywhere search that will get you what you're asking for:

| makeresults
| eval _raw =
"2021-01-25 12:59:18,355 - [INFO] SNR: 917173
2021-01-25 12:59:21,868 - [INFO] 0x100:S_Home
2021-01-25 12:59:22,312 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:22,314 - [INFO] 0x154:S_VACON
2021-01-25 12:59:22,316 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:22,629 - [INFO] new file cycle: 17320
2021-01-25 12:59:23,141 - [INFO] 0x154:S_VACON
2021-01-25 12:59:23,142 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:23,741 - [INFO] 0x154:S_VACON
2021-01-25 12:59:23,742 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:25,645 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:25,646 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:25,721 - [INFO] 0x100:S_Home
2021-01-25 12:59:27,095 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:27,102 - [INFO] 0x154:S_VACON
2021-01-25 12:59:27,104 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:27,425 - [INFO] new file cycle: 17321
2021-01-25 12:59:27,952 - [INFO] 0x154:S_VACON
2021-01-25 12:59:27,953 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:28,856 - [INFO] 0x154:S_VACON
2021-01-25 12:59:28,857 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:30,450 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:30,451 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:30,504 - [INFO] 0x100:S_Home
2021-01-25 12:59:31,624 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:31,625 - [INFO] 0x154:S_VACON
2021-01-25 12:59:31,627 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:31,951 - [INFO] new file cycle: 17322
2021-01-25 12:59:32,478 - [INFO] 0x154:S_VACON
2021-01-25 12:59:32,479 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:33,432 - [INFO] 0x154:S_VACON
2021-01-25 12:59:33,433 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:34,920 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:34,922 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:34,993 - [INFO] 0x100:S_Home
2021-01-25 12:59:36,321 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:36,325 - [INFO] 0x154:S_VACON
2021-01-25 12:59:36,327 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:36,777 - [INFO] ADU identified SNR: 917175
2021-01-25 12:59:37,190 - [INFO] 0x154:S_VACON
2021-01-25 12:59:37,190 - [INFO] 0x151:S_MQLON
2021-01-25 12:59:38,157 - [INFO] 0x154:S_VACON
2021-01-25 12:59:38,158 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:39,674 - [INFO] 0x152:S_MQLOFF
2021-01-25 12:59:39,676 - [INFO] 0x156:S_VACOFF
2021-01-25 12:59:39,742 - [INFO] 0x100:S_Home
2021-01-25 12:59:40,902 - [INFO] 0x130:S_Cycle
2021-01-25 12:59:40,904 - [INFO] 0x154:S_VACON
2021-01-25 12:59:40,906 - [INFO] 0x152:S_MQLOFF"
| rex max_match=0 "^(?<raw>.+)\n+"
| fields - _time _raw
| mvexpand raw
| rename raw as _raw
| eval _time = strptime(substr(_raw, 0, 23), "%Y-%m-%d %H:%M:%S,%N")

| sort 0 _time
| rex "SNR:\s+(?<snr_id>\d+)"
| rex "(?<error_code>0x\d+)"
| streamstats last(snr_id) as snr_id
| stats count by snr_id error_code
| sort 0 error_code snr_id

 

A few notes:

  • Everything above the double new line exists only to make the search "run-anywhere". Against your own data, you only need the part of the search below the double new line (starting with | sort 0 _time).
  • streamstats last(snr_id) as snr_id carries the most recently seen SNR value forward onto every subsequent event, which is exactly the grouping you were trying to get from transaction, without collapsing the events.
  • I had to extract snr_id and error_code with rex. Hopefully you already have these fields extracted in your data; if so, use your field names instead.
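Applied to your own indexed data, the working part of the search would look something like the following. Note that index=your_index and sourcetype=your_sourcetype are placeholders you would replace with your actual index and sourcetype, and the two rex extractions are unnecessary if the fields already exist:

index=your_index sourcetype=your_sourcetype
| sort 0 _time
| rex "SNR:\s+(?<snr_id>\d+)"
| rex "(?<error_code>0x\d+)"
| streamstats last(snr_id) as snr_id
| stats count by snr_id error_code
| sort 0 error_code snr_id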

This is what my output looks like:

snr_id	error_code	count
917173	0x100	4
917175	0x100	1
917173	0x130	4
917175	0x130	1
917173	0x151	3
917175	0x151	1
917173	0x152	10
917175	0x152	2
917173	0x154	10
917175	0x154	3
917173	0x156	3
917175	0x156	1
Cheers,
Jacob

If you feel this response answered your question, please do not forget to mark it as such. If it did not, but you do have the answer, feel free to answer your own post and accept that as the answer.

