Splunk Search

How do I list values from multiple fields in one table, grouped by host?

dfalone
Engager

Hi, I'm pretty new to Splunk and I'm creating a dashboard for one of my environments. One thing I can't figure out is how to populate a table with entries from multiple fields, grouped by host. It should look like this:

HOST    VOLUME NAMES
A       ARC
B       ARC, LIV, FOR
C       LIV, FOR, FUN

The host and all of the volume names come from different fields.  Any help would be greatly appreciated.

 

1 Solution

ITWhisperer
SplunkTrust
| stats values(vol_combine) as vol_combine by host


ITWhisperer
SplunkTrust

Can you share some sample events to show what you are dealing with?


dfalone
Engager

I tried the search below, but I get lots of entries with empty volume names, and I'm not sure how to also list by host.

index=nasuni sourcetype=Nasuni_*
| eval vol_combine = ""
| fillnull value=""
| foreach volumeTableDescription* [eval vol_combine=vol_combine." ".'<<FIELD>>']
| makemv vol_combine
| table vol_combine

dfalone_0-1628024472151.png

 


ITWhisperer
SplunkTrust

Can you share the raw events that you have?


dfalone
Engager

Yes, here is an example.  Thanks!

 

timestamp=1628083691 volumeTableDescription.0 = "IND" volumeTableDescription.1 = "NMG" volumeTableProvider.0 = "Amazon S3" volumeTableProvider.1 = "Amazon S3" volumeTableProtocol.0 = "proto_CIFS" volumeTableProtocol.1 = "proto_CIFS" volumeTableStatus.0 = "available" volumeTableStatus.1 = "available" volumeTableAccessibleData.0 = "792058246622" volumeTableAccessibleData.1 = "431253442376050" volumeTableUnprotectedData.0 = "248054173696" volumeTableUnprotectedData.1 = "0" volumeTableLastSnapshotStart.0 = "2021/Aug/04/12.16.31UTC" volumeTableLastSnapshotStart.1 = "2021/Aug/04/13.08.44UTC" volumeTableLastSnapshotEnd.0 = "2021/Aug/04/12.54.49UTC" volumeTableLastSnapshotEnd.1 = "2021/Aug/04/13.08.56UTC" volumeTableLastSnapshotDuration.0 = "2298" volumeTableLastSnapshotDuration.1 = "12" volumeTableLastSnapshotVersion.0 = "318" volumeTableLastSnapshotVersion.1 = "653391" volumeTableIsActive.0 = "1" volumeTableIsActive.1 = "1" volumeTableIsShared.0 = "0" volumeTableIsShared.1 = "0" volumeTableIsReadOnly.0 = "0" volumeTableIsReadOnly.1 = "0" volumeTableIsPinned.0 = "0" volumeTableIsPinned.1 = "0" volumeTableIsRemote.0 = "1" volumeTableIsRemote.1 = "1" volumeTableAvEnabled.0 = "0" volumeTableAvEnabled.1 = "0" volumeTableRemoteAccessEnabled.0 = "0" volumeTableRemoteAccessEnabled.1 = "0" volumeTableQuota.0 = "0" volumeTableQuota.1 = "0" volumeTableNumAVViolations.0 = "0" volumeTableNumAVViolations.1 = "0" volumeTableNumFileAlerts.0 = "0" volumeTableNumFileAlerts.1 = "0" volumeTableNumExports.0 = "0" volumeTableNumExports.1 = "0" volumeTableNumShares.0 = "5" volumeTableNumShares.1 = "5" volumeTableNumFtpdirs.0 = "0" volumeTableNumFtpdirs.1 = "0"


ITWhisperer
SplunkTrust

Thanks. It looks like that event works with your processing, in that you get a multivalue field containing IND and NMG. Do you have an example of an event that doesn't work, or do you just want to ignore the events that don't work?


dfalone
Engager

Your question actually helped. I was using a * in my sourcetype; I changed it to look only at the Volume sourcetype, and now the empty values are gone. It now looks like the below. How do I add a Host field and show the volumes per host?

dfalone_0-1628086269756.png

 


ITWhisperer
SplunkTrust

The host field is normally added by the forwarders/indexers, so you should just be able to reference it:

| table host vol_combine

dfalone
Engager

That works, but now I have multiple entries for the same host reporting the same volumes. How do I get it to list each host only once, with its respective volumes?

dfalone_0-1628089230750.png

 


ITWhisperer
SplunkTrust
| stats values(vol_combine) as vol_combine by host
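Putting the pieces of this thread together, the complete search would look something like the sketch below. The sourcetype name is an assumption based on your earlier post ("I changed it to only look at Volume"); adjust it to whatever your Volume events actually use. The final mvjoin is optional and just turns the multivalue result into a comma-separated string, matching the layout in your original question.

index=nasuni sourcetype=Nasuni_Volume
| eval vol_combine=""
| foreach volumeTableDescription* [eval vol_combine=vol_combine." ".'<<FIELD>>']
| makemv vol_combine
| stats values(vol_combine) as vol_combine by host
| eval vol_combine=mvjoin(vol_combine, ", ")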

dfalone
Engager

Thank you very much!  This is exactly what I needed. 🙂
