Splunk Search

How do I get 1 row per value in a basic search

HattrickNZ
Motivator

With the following search:

index=core host="hostname" elementType=ET1 | stats values(randomField)

my output looks something like:

values(randomField)
117440512
117440515
117440516
117440517
117440519
117440520
117440521
117440530
117440531
...

If I download this in CSV format, it all appears in 2 rows. I want to have values(randomField) on the 1st row and then 1 row per each subsequent value (e.g. 117440512 on row 2, 117440515 on row 3).

How can I alter my search to achieve this?

A similar question was asked here.

1 Solution

ramdaspr
Contributor

Is there any reason why you aren't using the table command with a dedup following it?

Using values forces Splunk to create a multivalue field with the distinct values, which puts the output on one line.
Using table with dedup gives the unique values in individual rows.
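For example, combining the base search from the question with table and dedup, it would look something like:

index=core host="hostname" elementType=ET1 | table randomField | dedup randomField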



emiller42
Motivator

++

If all you want is to isolate the values of randomField, then you simply need to pipe to table:

index=core host="hostname" elementType=ET1 | table randomField

HattrickNZ
Motivator

@ramdaspr thanks, ... | table randomField | dedup randomField worked for me.
