Splunk Search

How to avoid breaking event order of a multi-value field

sharif_ahmmad
Explorer

Hi community,

I am wondering how I can keep the values of a multi-value field in the order they occurred when showing them in a search result with a command like stats. For example, stats list() keeps the event order intact, but stats values() sorts the values in lexicographical order.

Are there any other functions or ways that keep the event order intact, like stats list() does?
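
For illustration, here is a tiny run-anywhere search showing the difference I mean; the field name and letters are just placeholders:

| makeresults count=3
| streamstats count
| eval letter=case(count=1,"c", count=2,"a", count=3,"b")
| stats list(letter) as event_order values(letter) as sorted_order

list(letter) comes back as c, a, b (event order), while values(letter) comes back as a, b, c (lexicographically sorted).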

1 Solution

to4kawa
Ultra Champion
index=_internal
| streamstats count
| eval tmp=count."#"._raw
| fields tmp
| table tmp

Hi, @sharif_ahmmad
I often prepend a streamstats counter to the field, as in the example, and split it back off later.
If stats list() would return many duplicate values, create this temporary numbered field first and then use stats values() on it.

Maintaining the order of the values within a field is harder than maintaining the order of the events.
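
Roughly, the round trip looks like the sketch below. The zero-padding via printf() is an extra step I'd add so that the lexicographic sort done by values() matches the event order, mvmap() (Splunk 8.0+) strips the counter back off, and sourcetype is only an example group-by field:

index=_internal
| streamstats count
``` zero-pad the counter so the lexicographic sort of values() follows event order ```
| eval tmp=printf("%08d",count)."#"._raw
| stats values(tmp) as tmp by sourcetype
``` strip the counter back off; the multivalue field is now in event order ```
| eval ordered=mvmap(tmp, replace(tmp,"^\d+#",""))
| table sourcetype ordered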


to4kawa
Ultra Champion
Hi, @sharif_ahmmad
How about this?

| makeresults 
| eval _raw="1##121,5646435,2019-01-30 12:44:54,2019-01-30 12:56:58,jesse" 
| appendpipe 
    [ eval _raw="2##5654655|CommonText|121|45453235|Some Text|Text|Text|Text|Text|jesse|30-JAN-2019 12.58.51.00 PM||||"] 
| rex "^(?:1##)(?<Customer_ID>\d+),\d+,(?<Purchase_Time>\d{4}-\d\d-\d\d \d\d:\d\d:\d\d),.+,(?<Customer_Name>\w+$)"
| rex "^(?:2##)\d+\|\w+\|(?<Customer_ID>\d+)\|.*\|(?<Customer_Name>\w+)\|(?<Closing_Time>\d\d-\w+-\d{4} \d\d\.\d\d\.\d\d\.\d\d \w\w)\|*\|*\|*\|$"
| stats values(*) as * by Customer_ID
| table Customer_ID Customer_Name Purchase_Time Closing_Time

Regular expressions are hard.


ALT:

| makeresults 
| eval _raw="1##121,5646435,2019-01-30 12:44:54,2019-01-30 12:56:58,jesse" 
| appendpipe 
    [ eval _raw="2##5654655|CommonText|121|45453235|Some Text|Text|Text|Text|Text|jesse|30-JAN-2019 12.58.51.00 PM||||"] 
| eval msg=mvindex(split(_raw,"#"),2), No=mvindex(split(_raw,"#"),0) 
| eval split_tmp=if(No=1,split(msg,","),split(msg,"|")) 
| eval Customer_ID=case(No=1,mvindex(split_tmp,0),No=2,mvindex(split_tmp,2)) 
| eval Customer_Name=case(No=1,mvindex(split_tmp,-1),No=2,mvindex(split_tmp,9)) 
| eval Purchase_Time=case(No=1,mvindex(split_tmp,2),No=2,null()) 
| eval Closing_Time=case(No=1,null(),No=2,mvindex(split_tmp,10)) 
| stats values(*) as * by Customer_ID 
| table Customer_ID Customer_Name Purchase_Time Closing_Time

sharif_ahmmad
Explorer

@to4kawa Thanks a lot. Really appreciate your effort. 🙂



sharif_ahmmad
Explorer

Hi @to4kawa, here is what I get after running the commands:

1##121,5646435,2019-01-30 12:44:54,2019-01-30 12:56:58,jesse

2##5654655|CommonText|121|45453235|Some Text|Text|Text|Text|Text|jesse|30-JAN-2019 12.58.51.00 PM||||

Now, this may not be relevant to the original question, but how do I extract fields from these two lines of text? I'm having trouble with this and can't seem to extract the dates no matter what I try.

What I want is to extract 2019-01-30 12:44:54 from the first row and 30-JAN-2019 12.58.51.00 PM from the second row, and then show them on the same row of the output, like this:

Customer ID Customer Name Purchase Time Closing Time

121 jesse 2019-01-30 12:44:54 30-JAN-2019 12.58.51.00 PM

How do I do that?
