Splunk Search

Is it possible to split lines and re-use certain fields?

wvanloon
New Member

I have this event:

ID=FAKE_ID_NAME,TS=1570441680,F1=1380,F2=60,F3=60,F4=1500

For my analysis it would be very useful to put every field on its own line, with ID and TS repeated on each line, so the desired output is:

ID=FAKE_ID_NAME,TS=1570441680,F1=1380;
ID=FAKE_ID_NAME,TS=1570441680,F2=60;
ID=FAKE_ID_NAME,TS=1570441680,F3=60;
ID=FAKE_ID_NAME,TS=1570441680,F4=1500;

How can I achieve this?
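
A minimal way to recreate such an event for testing (a sketch; the pair and key=value delimiters are assumed from the sample line above) is:

| makeresults
| eval _raw="ID=FAKE_ID_NAME,TS=1570441680,F1=1380,F2=60,F3=60,F4=1500"
| extract pairdelim="," kvdelim="="

This produces a single result with the fields ID, TS, F1, F2, F3 and F4 extracted from the raw event.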


gcusello
SplunkTrust

Hi wvanloon,
try something like this:

| makeresults 
| eval ID="FAKE_ID_NAME", TS="1570441680", F1="1380", F2="60", F3="60", F4="1500"
| eval col=ID." ".TS
| stats values(F1) AS F1 values(F2) AS F2 values(F3) AS F3 values(F4) AS F4 BY col
| untable col field value
| rex field=col "^(?<ID>[^ ]*)\s+(?<TS>[^ ]*)"
| eval my_field=field."=".value
| table ID TS my_field
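
The idea is to pack ID and TS into one key (col), transpose the remaining fields into field/value rows with untable, and then split the key back out with rex. If you need the output formatted exactly as in the question, one possible final step (a sketch; the field name line is just an illustration) is:

| eval line="ID=".ID.",TS=".TS.",".my_field.";"
| table line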

Bye.
Giuseppe


wvanloon
New Member

Thanks!

Another problem is that I don't know which fields I have for each event.
So it can be F1, F2, F3, F4 or something completely different like S1, S2, S6, and so on. I still want ID and TS repeated on every line.

Can that also be solved?
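
One way to avoid hard-coding the field names (a sketch based on the answer above; the extract delimiters and the removed fields are assumptions taken from the sample event) is to drop everything except the dynamic fields before untable, so whatever remains gets transposed automatically:

| makeresults
| eval _raw="ID=FAKE_ID_NAME,TS=1570441680,F1=1380,F2=60,F3=60,F4=1500"
| extract pairdelim="," kvdelim="="
| eval col=ID." ".TS
| fields - _time _raw ID TS
| untable col field value
| rex field=col "^(?<ID>[^ ]*)\s+(?<TS>[^ ]*)"
| eval my_field=field."=".value
| table ID TS my_field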


adonio
Ultra Champion

What is the problem you are trying to solve?


wvanloon
New Member

I need to join the events based on two fields: ID and the name of the other field, for example F1.

So I have a lookup table with:

ID;INDEX;Value
FAKE_ID_NAME;F1;95

If you have any other ideas to solve this, that would be great!
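
Assuming the lookup is already defined in Splunk (the lookup name my_lookup and the output field expected_value below are only placeholders), the field/value rows produced by untable could be joined on ID and the field name, for example:

| lookup my_lookup ID, INDEX AS field OUTPUT Value AS expected_value
| table ID TS field value expected_value

Here field is the column created by untable that holds the original field name (F1, S1, and so on), so it can be matched against the INDEX column of the lookup.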
