Splunk Search

How to split comma-separated values in a single event and make each value one row of a table

Naga1
Loves-to-Learn Lots

I have a list of comma-separated numbers in a single Splunk event field.

I have many events like the one below. How can I split these comma-separated values and display them in the table format shown further down?

Any suggestions?

Sequence Numbers processed during this transaction : 00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514,00000000000000872515,00000000000000872516,00000000000000872517,00000000000000872518,00000000000000872519,00000000000000872520,00000000000000872521,00000000000000872522,00000000000000872523,00000000000000872524,00000000000000872525,00000000000000872526,00000000000000872527,00000000000000872528,00000000000000872529,00000000000000872530,00000000000000872531,00000000000000872532,00000000000000872533

 

How can I split these comma-separated values and display them individually in a table like:

00000000000000872510

00000000000000872511

00000000000000872512

00000000000000872513

00000000000000872514

00000000000000872515

00000000000000872516

... and so on, up to

00000000000000872533


Naga1
Loves-to-Learn Lots

Hi @gcusello  & @bowesmana ,

 

Thanks for the prompt responses.

 

You both gave answers for a static set of numbers. What if the field values change dynamically and are combined with text as well?

In that case, how can I separate this dynamic set of numbers from the surrounding text and display them individually, instead of comma-separated, in a table?

We are halfway to the solution: you have already shown how to split comma-separated values and display them in a table.

I need a solution for the following two cases:

1. What if this set of numbers is logged to Splunk dynamically in a single field (MessageText)?

2. The numbers are combined with text such as "Sequence Numbers processed during this transaction :" in the field (MessageText). How can I extract only the numbers and display them in the format mentioned above, like:

00000000000000872510

00000000000000872511

00000000000000872512

00000000000000872513

00000000000000872514

 

instead of 00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514 (this set of numbers is not static; it changes with each transaction and is logged to Splunk as an individual event in the MessageText field).


bowesmana
SplunkTrust

@Naga1 

The split function is not limited to a static set of numbers; it will split whatever 'dynamic' numbers you have, whether that is

00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514

or

00000000000000999995,00000000000000999996,00000000000000999997,00000000000000999998,00000000000000999999

so perhaps you can give a clearer example of what your data might look like so we can understand what you mean by dynamic.

Are you trying to say that you have a single field called MessageText that may have

ABC,12345,XYZ,98765,Hello,444444,Goodbye,777777

and you want to extract all numeric sequences from it?

If so, give some examples of what the data will look like, so we can work out a suitable matching/extraction pattern.
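
If it is, a minimal sketch along those lines (assuming the field is literally named MessageText and that every run of digits in it is a sequence number) could be:

| rex field=MessageText max_match=0 "(?<SequenceNumber>\d+)"  ``` capture every run of digits into a multivalue field ```
| mvexpand SequenceNumber  ``` one result row per captured number ```
| table SequenceNumber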

Naga2
Explorer

The pattern may be:

 

Sequence Numbers:00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514

I need to extract only the numbers, without commas, and display them in a table like:

00000000000000872510

00000000000000872511

00000000000000872512

00000000000000872513

00000000000000872514


Kingsly007
Loves-to-Learn Lots

Hello @bowesmana

Example: all the numbers are numeric only.

Message Text field from the 1st event:
Sequence numbers

00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514

Message Text field from the 2nd event:
Sequence numbers

00000000000000872515,00000000000000872516,00000000000000872518,00000000000000872519,00000000000000872520

In the logs, 00000000000000872517 is missing, so I need to check for missing numbers in the sequence.

That is the actual condition:

I just need to check, one by one, that each number follows the correct sequence, and throw an alert if it does not.

Please suggest a regex expression for this issue.

With the query below I can only take the first value from the logs (events) above:

| rex field= cip:  Audit Message . Message Text"\D+(?<Sequence Number>\d+)" 
| table Sequence Number

Output :
00000000000000872510
00000000000000872515

but I need all of the sequence numbers in a statistics table, one per row.
Hope you understand.
Thanks in advance


bowesmana
SplunkTrust

I also suspect that you did not post your actual message text field, as that rex statement would not produce the results you gave, due to the \D+.

Can you post your message text field completely?


bowesmana
SplunkTrust

I'm confused ... why have you not just done

| eval "Sequence Number"=split('Message Text', ",")
| table Sequence Number

as advised earlier? Substitute the actual field name for Message Text above


bowesmana
SplunkTrust

Are all these numbers in a single field, or part of a larger raw event? Assuming they are in a single field in the event, then simply

| eval numbers=split(your_big_long_numbers_field, ",")

which will make a new multivalue field called numbers containing all of your split numbers.

If you then want to make a new row for each of those numbers, use

| mvexpand numbers
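
Putting the two steps together (your_big_long_numbers_field is just a placeholder for whatever your field is actually called), the whole search would look roughly like:

| eval numbers=split(your_big_long_numbers_field, ",")  ``` multivalue field with one entry per number ```
| mvexpand numbers  ``` one result row per number ```
| table numbers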

Kingsly007
Loves-to-Learn Lots

Hello @gcusello, thanks for your approach.
I appreciate it, and I have another question:
if the numbers are dynamic, how can we split the comma-separated values and display them individually in a table?


gcusello
SplunkTrust

Hi @Kingsly007,

please, next time, open a new question, even if on the same topic: you'll have a faster and probably better answer to your question!

In addition, at the end of the analysis, you can accept the answer, to give more information to the other people in the Community.

Anyway, could you better describe what you mean by "Dynamic"?

If you have comma-separated values, how many of them there are isn't relevant.

Could you share a sample of your logs?

Ciao.

Giuseppe


Kingsly007
Loves-to-Learn Lots

Hello @gcusello,
Regarding the subject:

The sequence numbers differ in every transaction's logs,
so the values are all dynamic (not the same numbers in every transaction);
every transaction's log has different sequence numbers.

They run from 1 to N (the last number), and I need to know if any number between 1 and N was missed.
Can you help with this? Thanks in advance.


gcusello
SplunkTrust

Hi @Kingsly007,

Anyway, it's still not clear what you mean by "Dynamic".

If you have comma-separated values, how many of them there are isn't relevant.

Could you share a sample of your logs?

Ciao.

Giuseppe


Naga1
Loves-to-Learn Lots

SAMPLE LOGS:

<Create Timestamp>2023-08-31T04:45:02.212Z</Create Timestamp>

<Message Text>Sequence Numbers processed during this transaction : 00000000000000875119,00000000000000875120,00000000000000875121,00000000000000875122,00000000000000875123,00000000000000875124</Message Text>

<Create Timestamp>2023-08-31T03:45:02.083Z</Create Timestamp>

<Message Text>Sequence Numbers processed during this transaction : 00000000000000875115,00000000000000875116,00000000000000875117,00000000000000875118</Message Text>

 

<Create Timestamp>2023-08-31T02:45:01.909Z</Create Timestamp> <Message Text>Sequence Numbers processed during this transaction : 00000000000000875114</Message Text>

 

<Create Timestamp>2023-08-31T01:45:02.703Z</Create Timestamp> <Message Text>Sequence Numbers processed during this transaction : 00000000000000875112,00000000000000875113</Message Text>


yuanliu
SplunkTrust

Your logs look like compliant XML.  I am guessing that you already have the field "Message Text".  If so, you can apply @bowesmana's technique

 

| eval SequenceNumber = mvindex(split('Message Text', " : "), 1)
| eval SequenceNumber = split(SequenceNumber, ",")
| mvexpand SequenceNumber

 

(Replace 'Message Text' with MessageText if that's the field name.)  Your sample logs should give you something like

SequenceNumber         _time
00000000000000875119   2023-08-30 21:45:02.212
00000000000000875120   2023-08-30 21:45:02.212
00000000000000875121   2023-08-30 21:45:02.212
00000000000000875122   2023-08-30 21:45:02.212
00000000000000875123   2023-08-30 21:45:02.212
00000000000000875124   2023-08-30 21:45:02.212
00000000000000875115   2023-08-30 20:45:02.083
00000000000000875116   2023-08-30 20:45:02.083
00000000000000875117   2023-08-30 20:45:02.083
00000000000000875118   2023-08-30 20:45:02.083
00000000000000875114   2023-08-30 19:45:01.909
00000000000000875112   2023-08-30 18:45:02.703
00000000000000875113   2023-08-30 18:45:02.703

Here is an emulation that you can play with and compare with real data

 

| makeresults
| eval data = split("<Create Timestamp>2023-08-31T04:45:02.212Z</Create Timestamp>
<Message Text>Sequence Numbers processed during this transaction : 00000000000000875119,00000000000000875120,00000000000000875121,00000000000000875122,00000000000000875123,00000000000000875124</Message Text>

<Create Timestamp>2023-08-31T03:45:02.083Z</Create Timestamp>
<Message Text>Sequence Numbers processed during this transaction : 00000000000000875115,00000000000000875116,00000000000000875117,00000000000000875118</Message Text>

<Create Timestamp>2023-08-31T02:45:01.909Z</Create Timestamp>
<Message Text>Sequence Numbers processed during this transaction : 00000000000000875114</Message Text>

<Create Timestamp>2023-08-31T01:45:02.703Z</Create Timestamp>
<Message Text>Sequence Numbers processed during this transaction : 00000000000000875112,00000000000000875113</Message Text>", "

")
| mvexpand data
| rename data as _raw
| spath
| rename "Create./Create.Message" AS "Message Text"
| eval _time = strptime(Create, "%FT%H:%M:%S.%3Q%Z")
| rename Create AS "Create Timestamp"
``` data emulation above ```

 


Naga2
Explorer

Thanks

One more question: how can I add logic to find a missing number in this sequence of dynamically changing numbers?
The only rule is that the numbers increase by one. Is there a way to build logic around that increment of one, and trigger an alert when a number does not increase by one, which would indicate a missing number?

00000000000000875115,00000000000000875116,00000000000000875118

In this case 00000000000000875117 is missing.


ITWhisperer
SplunkTrust

Assuming you have already extracted Message Text into a field called MessageText, try this

| rex field=MessageText max_match=0 "(?<SequenceNumber>\d+)"
| mvexpand SequenceNumber
| table SequenceNumber


ITWhisperer
SplunkTrust

Try something like this

| rex field=MessageText max_match=0 "(?<SequenceNumber>\d+)"
| table SequenceNumber
| eval fullSequence = mvrange(tonumber(mvindex(SequenceNumber,0)),tonumber(mvindex(SequenceNumber,-1))+1)
| eval missing=mvmap(fullSequence,if(isnull(mvfind(SequenceNumber,fullSequence)),fullSequence,NULL()))
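
If the end goal is the alert, one possible extension of the above (keeping the same field names) is to keep only results that actually have a gap and use that as the alert condition; note that mvrange works on numbers, so the values in missing come back without the leading zeros:

| where mvcount(missing) > 0  ``` keep only results with at least one missing sequence number ```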

gcusello
SplunkTrust

Hi @Naga1,

please try the approach of my example:

| makeresults 
| eval my_field="00000000000000872510,00000000000000872511,00000000000000872512,00000000000000872513,00000000000000872514,00000000000000872515,00000000000000872516,00000000000000872517,00000000000000872518,00000000000000872519,00000000000000872520,00000000000000872521,00000000000000872522,00000000000000872523,00000000000000872524,00000000000000872525,00000000000000872526,00000000000000872527,00000000000000872528,00000000000000872529,00000000000000872530,00000000000000872531,00000000000000872532,00000000000000872533"
| makemv delim="," my_field
| fields - _time
| mvexpand my_field

Ciao.

Giuseppe
