I have specific events with rows and rows of multivalue data. They have header and footer data, but the bulk of the body is values for a given process on the host system. Analysts use these events to validate the host system's processes. Presently they copy/paste the raw event text off the page and manually generate CSVs from it. I would like to streamline this process for them by having Splunk produce a CSV of the needed values. The body looks something like:
fieldnameA|fieldnameB|fieldnameC
|valueA1|valueB1|valueC1|
|valueA2|ValueB2|valueC2|
|valueA3|Valueb3|valueC3|...
This goes on for thousands to tens of thousands of lines. I can rex the values out into multivalue fields and table those results, but everything lands in a single row. How would I create a separate row for each series of values so that the CSV export produces separate rows?
You could do something like this. It's a bit fiddly, but generally functional.
| makeresults
| fields - _time
| eval data="fieldnameA|fieldnameB|fieldnameC
|valueA1|valueB1|valueC1|
|valueA2|ValueB2|valueC2|
|valueA3|Valueb3|valueC3|"
| makemv tokenizer="([^\n]+)\n?" data
| mvexpand data
| eval data=trim(data, "|")
| rex field=data "(?<f_0>[^|]*)\|(?<f_1>[^|]*)\|(?<f_2>.*)"
| streamstats c
| eval d=split(data,"|")
| foreach 0 1 2 [ eval val=mvindex(d,<<FIELD>>), tmp_<<FIELD>>_{val}=if(c=1,val,null()) ]
| foreach tmp_*_* [ eval <<MATCHSEG2>>=f_<<MATCHSEG1>> ]
| where c>1
| fields - d data val f_* tmp_* c
Everything up to the rex statement just sets up your example body. The commands after it do the real work: streamstats numbers the rows, split breaks each row into a multivalue field, the first foreach captures the header row's values (c=1) into tmp_N_name fields, the second foreach uses those names to copy each f_X column into a field named after its header, and the final where drops the header row itself.
There are possibly other ways to achieve this, but this one seems to do what I believe you are after.
If you need more columns, extend the rex statement to capture additional f_X fields and add the extra field numbers (0 1 2 3 ...) to the first foreach statement.
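To make that concrete, here is a sketch of the four-column variant of the same search (only the rex pattern and the foreach index list change; everything else is as above):

```
| rex field=data "(?<f_0>[^|]*)\|(?<f_1>[^|]*)\|(?<f_2>[^|]*)\|(?<f_3>.*)"
| streamstats c
| eval d=split(data,"|")
| foreach 0 1 2 3 [ eval val=mvindex(d,<<FIELD>>), tmp_<<FIELD>>_{val}=if(c=1,val,null()) ]
| foreach tmp_*_* [ eval <<MATCHSEG2>>=f_<<MATCHSEG1>> ]
| where c>1
| fields - d data val f_* tmp_* c
```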
Have you tried using mvexpand?
If you can't get it to work, maybe you can share your search so it's easier to help.
mvexpand hits limits when I use it. Let me see if I can grab the search.
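For what it's worth, the mvexpand memory ceiling is configurable in limits.conf on the search head, so if you have admin access, raising it may get you past the truncation. A sketch (the 1000 value is just an example; the default is lower):

```
# limits.conf
[mvexpand]
# Maximum memory, in MB, a single mvexpand invocation may use
# before its results are truncated
max_mem_usage_mb = 1000
```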