Splunk Search

How to make table from key-value series?

fechnert
Explorer

Hi,

I have a table with one column and 3 rows (there could be more, as this is a search result, and there could also be more entries in a data set):

date_minute:34,host:h_a,index:prod
date_minute:39,host:h_b,index:prod
date_minute:44,host:h_c,index:prod

date_minute   host   index     <--- these are the table headers
34            h_a    prod
39            h_b    prod
44            h_c    prod

If there is a line like:
date_minute:44,host:h_c,index:prod,user:test
then user:test should be added as a new column (to have 4 columns).

What is the best way to do this?

 


fechnert
Explorer

To clarify: it is not a normal output with known fields and a known number of fields.
The fields are created dynamically by a different search. The result of that search is passed through a macro, which does some work on the fields. Now I have a table with one column and an unknown/dynamic number of fields and field names, so I cannot be sure about the number of fields or their names.
I tried mv operations, transpose and split operations, but got stuck.

 


Tom_Lundie
Contributor

Thanks for clarifying.


@fechnert wrote:

The result of that search is passed through a macro


What do you mean by this? I'm struggling to understand why there is no way to edit the upstream SPL to get the fields structured nicely. Is there any way you could share some sample SPL/data showing the whole problem end-to-end, please? I understand that you might need to obfuscate this.


fechnert
Explorer

Background: I would like to store the output of a table, encrypted, in another index.


Example search from user-1:
...
| table user, host, _time, eventcode, status
| `macro X`

Example search from user-2:
| table severity, proc, username, dest, source_ip
| `macro X`

Example search from user-3:
| table date_minute, host, index
| `macro X`

user-3 search creates this before the macro call:
date_minute   host   index  
34             h_a    prod
39             h_b    prod
44             h_c    prod



macro X:
| ....
``` now converting the output of the table into a JSON field (thanks to this forum!) ```
| appendpipe
    [
     | eventstats count as r_ev
     | foreach *
        [ | eval jsonmv_ = if("<<FIELD>>" == "r_ev", jsonmv_, mvappend(jsonmv_, "'<<MATCHSTR>>':'" + <<FIELD>> + "'")) ]
      | eval json_result = "{" + mvjoin(jsonmv_, ",") + "}"
      | fields - jsonmv_
      | stats values(json_result) AS table_result by r_ev
      ``` here the field "table_result" for user-3 is one long string: {date_minute:34,host:h_a,index:prod} {date_minute:39,host:h_b,index:prod} {date_minute:44,host:h_c,index:prod} ```
      ``` encrypting table_result -> output field is table_result_crypted ```
      | table some_fields, table_result_crypted
      | collect index=XXX
    ]
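For reference, here is a minimal run-anywhere sketch of just the JSON-building step above, with the user-3 values hard-coded and the encryption/collect steps left out (field names follow the macro; the sample data is synthetic):

| makeresults count=3
| streamstats count as row
``` build a synthetic copy of the user-3 table ```
| eval date_minute=case(row=1, 34, row=2, 39, row=3, 44),
       host=case(row=1, "h_a", row=2, "h_b", row=3, "h_c"),
       index="prod"
| fields - row _time
``` turn every row into a {'key':'value',...} string, exactly as macro X does ```
| eventstats count as r_ev
| foreach *
    [ | eval jsonmv_ = if("<<FIELD>>" == "r_ev", jsonmv_, mvappend(jsonmv_, "'<<MATCHSTR>>':'" + <<FIELD>> + "'")) ]
| eval json_result = "{" + mvjoin(jsonmv_, ",") + "}"
| fields - jsonmv_
| stats values(json_result) AS table_result by r_ev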
      
Now I have the result of the user search (from user-1 or user-2) stored as an encrypted JSON field in another index.
In a dashboard I would like to get the same table that user-1, user-2 or user-3 had at the beginning:

index=XXX
``` decrypting the field into the field "tab_res": now I have the same field content as table_result above ```
``` removing {} ```
| rex mode=sed field=tab_res "s/}//g"
| rex mode=sed field=tab_res "s/{//g"
| eval fields=split(tab_res," ")
| mvexpand fields
| table fields

---> Here I have a table with 3 rows, 1 column:
date_minute:34,host:h_a,index:prod
date_minute:39,host:h_b,index:prod
date_minute:44,host:h_c,index:prod
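For reference, this intermediate stage can be reproduced as a run-anywhere search, with the decrypted string simply hard-coded in place of the decryption step:

| makeresults
``` stand-in for the decrypted table_result string ```
| eval tab_res="{date_minute:34,host:h_a,index:prod} {date_minute:39,host:h_b,index:prod} {date_minute:44,host:h_c,index:prod}"
| rex mode=sed field=tab_res "s/[{}]//g"
| eval fields=split(tab_res," ")
| mvexpand fields
| table fields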

But I need to split the rows into columns dynamically, without knowing how many fields or lines I will have.
Something like this pseudo-code would be nice:
| split_into_columns delim="," "header:cell_content"

Output:
date_minute   host   index  
34             h_a    prod
39             h_b    prod
44             h_c    prod

I tried to split at the delimiter "," (it did not work) and to transpose the table, but could not get it working.

Tom_Lundie
Contributor

@richgalloway's answer will do what you need; you just need to rename the "fields" column to _raw first so that the extract command picks it up.

 

| rename fields as _raw
| extract kvdelim=":" pairdelim=","
| fields - _raw

 

This SPL, and the SPL you have shared above, will misbehave if the summary indexes feeding this search contain curly brackets, colons or commas inside the field values. It might be worth stripping those characters out in macro X.
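For example, a minimal sketch of that clean-up, assuming it is added to macro X just before the JSON string is built (the character list only covers the obvious offenders):

``` strip characters that would break the downstream key:value parsing ```
| foreach *
    [ | eval <<FIELD>> = replace(<<FIELD>>, "[{}:,]", "") ]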

Here is a run-anywhere example for future reference:

 

| makeresults
| eval sample = "date_minute:34,host:h_a,index:prod|date_minute:39,host:h_b,index:prod|date_minute:44,host:h_c,index:prod"
| makemv sample delim="|"
| mvexpand sample
| fields - _time
| rename sample as fields
| rename fields as _raw
| extract kvdelim=":" pairdelim=","
| fields - _raw 
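Running this should give you three rows with date_minute, host and index columns, matching the table in the original question; an extra key such as user:test in the source string would simply show up as one more extracted field.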

fechnert
Explorer

Thank you!

The key to solving this was the rename command to _raw!

But I have to remove the old fields completely, otherwise the table output (| table * is needed to display a statistics table) will display ALL fields: _time, host, sourcetype, the old index, etc.

I added this just before the split command:

| table tab_res
| fields tab_res
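For readers putting the pieces together, the full dashboard search might end up looking roughly like this sketch (the decryption step is only indicated as a comment, and tab_res is assumed to already hold the decrypted string):

index=XXX
``` ... decrypt the collected field into "tab_res" here ... ```
| fields tab_res
``` keeps only tab_res, so host, sourcetype, the old index and friends no longer pollute "table *" ```
| rex mode=sed field=tab_res "s/[{}]//g"
| eval fields=split(tab_res," ")
| mvexpand fields
| rename fields as _raw
| extract kvdelim=":" pairdelim=","
| fields - _raw tab_res
| table *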

 


Tom_Lundie
Contributor

I'm a little confused about what's going on with the tables you mentioned in your question. Let me know if I'm missing something here.

From what I'm reading, you're starting out with data structured like this:

date_minute:44,host:h_c,index:prod

i.e. key:value pairs, comma-delimited, and you want to be able to format this data as a table? If so:

I suggest that you set up field extractions for your sourcetype. Add the following to props.conf and transforms.conf (replacing my_sourcetype with the name of the sourcetype in question).

props.conf

[my_sourcetype]
REPORT-kv_comma_colon = kv_comma_colon

transforms.conf

[kv_comma_colon]
DELIMS = ",", ":"
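With two delimiter sets like this, the first (",") separates the key:value pairs and the second (":") separates each key from its value, so the column names come straight from the data.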

 
Then you will be able to make any table that you want with:

index=my_index sourcetype=my_sourcetype
| table date_minute, host, index

Note that host and index are default fields, so you might want to set up field aliases for these.


richgalloway
SplunkTrust

Use the extract command to convert those key:value pairs into fields and values.

<<your search>>
| extract pairdelim="," kvdelim=":"
| table date_minute host index *
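A quick run-anywhere sketch of this, using the sample line from the question with the extra user:test key included, to show that additional keys simply become additional columns:

| makeresults
| eval _raw="date_minute:44,host:h_c,index:prod,user:test"
| extract pairdelim="," kvdelim=":"
| table date_minute host index *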

 

---
If this reply helps you, Karma would be appreciated.