Getting Data In

How to split a single-line event into multiple events at search time?

romaindelmotte
Explorer

Hi,

I have this kind of event indexed:

11/26/15 15:05:11.000 retrievePending=0 mergePending=1823 sendPending=43 resendPending=2

The numbers above are the count of pending tasks in different queues of an application.
Unfortunately, I cannot change the way the logs are written to the log files; I wish they looked something like this:

11/26/15 15:05:11.000 queue=retrieve pending=0
11/26/15 15:05:11.000 queue=merge pending=1823
11/26/15 15:05:11.000 queue=send pending=43
11/26/15 15:05:11.000 queue=resend pending=2

So, is there a way, at search time, to split my data into multiple events so that I can use a by clause like the one below?

index=main sourcetype=queues host=web01 | timechart avg(pending) by queue

I've been looking into this for some time now, even playing with multivalue commands like mvzip, mvexpand, etc., but I can't crack this one.
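
For reference, mvzip and mvexpand can be combined along these lines (a sketch, assuming the key=value pairs such as retrievePending are already extracted at search time):

index=main sourcetype=queues host=web01
| eval pair=mvzip(split("retrieve,merge,send,resend", ","), mvappend(retrievePending, mergePending, sendPending, resendPending), ":")
| mvexpand pair
| eval queue=mvindex(split(pair, ":"), 0), pending=tonumber(mvindex(split(pair, ":"), 1))
| timechart avg(pending) by queue

mvzip pairs each queue name with its count, mvexpand turns every pair into its own result row, and the final eval splits the pair back into queue and pending fields.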

Any help would be appreciated.

Thanks,

Romain

romaindelmotte
Explorer

Just found a way myself, actually.

 index=main sourcetype=queues host=web01
| eval fields=mvappend("retrieve:".retrievePending,"merge:".mergePending,"send:".sendPending,"resend:".resendPending)
| mvexpand fields
| makemv delim=":" fields
| eval queue=mvindex(fields,0)
| eval count=mvindex(fields,1)
| eval ratio=round((count/500)*100, 2)
| timechart avg(ratio) by queue

Any way to make that a bit more efficient, though?
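
One way that might avoid the mvexpand entirely (a sketch, reusing the 500 figure from the ratio above and assuming the four *Pending fields are extracted at search time): aggregate each counter as its own series named after the queue, then apply the ratio per column with foreach.

index=main sourcetype=queues host=web01
| timechart avg(retrievePending) AS retrieve avg(mergePending) AS merge avg(sendPending) AS send avg(resendPending) AS resend
| foreach retrieve merge send resend [ eval <<FIELD>> = round(('<<FIELD>>'/500)*100, 2) ]

Because nothing is expanded per event, far fewer rows flow through the search pipeline; the trade-off is that the queue names end up hard-coded in the search string.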

fpavlovi
Explorer

It helped me as well, thank you for sharing!
