Getting Data In

How to split a single line event into multiple events at search time?

romaindelmotte
Explorer

Hi,

I have those kind of events indexed:

11/26/15 15:05:11.000 retrievePending=0 mergePending=1823 sendPending=43 resendPending=2

The numbers above are the count of pending tasks in different queues of an application.
Unfortunately, I cannot change the way the logs are written to the log files, and I wish they looked like this instead:

11/26/15 15:05:11.000 queue=retrieve pending=0
11/26/15 15:05:11.000 queue=merge pending=1823
11/26/15 15:05:11.000 queue=send pending=43
11/26/15 15:05:11.000 queue=resend pending=2

So, is there a way - at search time - to split my data into multiple events so I can use a by clause like below?

index=main sourcetype=queues host=web01 | timechart avg(pending) by queue

I've been looking for some time now, even playing with multi-value commands like mvzip and mvexpand, but I can't crack this one.

Any help would be appreciated.

Thanks,

Romain

1 Solution

romaindelmotte
Explorer

Just found a way myself, actually.

index=main sourcetype=queues host=web01
| eval fields=mvappend("retrieve:".retrievePending,"merge:".mergePending,"send:".sendPending,"resend:".resendPending)
| mvexpand fields
| makemv delim=":" fields
| eval queue=mvindex(fields,0)
| eval count=mvindex(fields,1)
| eval ratio=round((count/500)*100, 2)
| timechart avg(ratio) by queue

Any way to make that a bit more efficient, though?
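One leaner variant, sketched under the assumption that Splunk auto-extracts the four *Pending fields as key=value pairs at search time: skip the mvappend/mvexpand round-trip and let timechart aggregate each counter directly, renaming the series as it goes. The foreach loop and the 500 divisor only mirror the ratio calculation above and are illustrative:

index=main sourcetype=queues host=web01
| timechart avg(retrievePending) AS retrieve avg(mergePending) AS merge avg(sendPending) AS send avg(resendPending) AS resend
| foreach retrieve merge send resend [ eval <<FIELD>> = round(('<<FIELD>>'/500)*100, 2) ]

Because no rows are multiplied before aggregation, this avoids the memory cost mvexpand can incur on large result sets.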


fpavlovi
Explorer

It helped me as well, thank you for sharing!
