Splunk Search

How to create multiple subevents out of a field?

sbsbb
Builder

In one log line, I have multiple XML events.
Example:
logtime bla bla bla

  • How can I display them in a table view like this?
    eventid1 value1 value2
    eventid2 value1 value2
    eventid3 value1 value2

  • How can I make a count of all events (all ids)?

I can only make changes in the web manager; I have no direct access to the server...

1 Solution

alacercogitatus
SplunkTrust

There is a command that can help you: it is called xpath. Along with xpath there is xmlkv.

main_search_for_xml | xpath outfield=event_id "//event/@id" | xmlkv | other_stuff
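To see what that xpath expression pulls out, here is a rough sketch in Python's standard library, run outside Splunk on a hypothetical log line (the `<event id="...">` fragments and values are made up for illustration):

```python
# Sketch of what `xpath outfield=event_id "//event/@id"` extracts,
# using Python's stdlib XML parser (illustration only; inside Splunk
# you would use the SPL pipeline above).
import xml.etree.ElementTree as ET

# Hypothetical log line containing several <event> fragments.
raw = '<event id="1"><val>a</val></event><event id="2"><val>b</val></event>'

# Wrap the fragments in a root element so the line parses as one document.
root = ET.fromstring(f"<events>{raw}</events>")

# Collect the id attribute of every <event> element, the way the
# xpath command fills the event_id field.
event_ids = [e.get("id") for e in root.findall(".//event")]
print(event_ids)  # → ['1', '2']
```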

Then your "other_stuff" can be your stats commands.

stats dc(event_id) as "NumberOfDistinctEventIDs" count(event_id) as "NumberOfEvents"
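As a rough Python analogue of that stats line (the sample ids are made up): dc() is a distinct count, while count() counts every non-null occurrence of the field.

```python
# Rough analogue of:
#   stats dc(event_id) as "NumberOfDistinctEventIDs"
#         count(event_id) as "NumberOfEvents"
event_ids = ["1", "2", "2", "3"]  # hypothetical ids extracted from the events

number_of_distinct = len(set(event_ids))  # dc(event_id)
number_of_events = len(event_ids)         # count(event_id)
print(number_of_distinct, number_of_events)  # → 3 4
```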

and

stats values(event) by event_id
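A rough analogue of that grouping in Python (the sample rows are made up): `values(event)` gathers the de-duplicated values of `event` seen for each `event_id`.

```python
# Rough analogue of `stats values(event) by event_id`: collect the
# unique event values per id (values() returns a de-duplicated,
# sorted multivalue field in Splunk).
from collections import defaultdict

rows = [("1", "a"), ("2", "b"), ("2", "b"), ("2", "c")]  # (event_id, event)

by_id = defaultdict(set)
for event_id, event in rows:
    by_id[event_id].add(event)

table = {k: sorted(v) for k, v in by_id.items()}
print(table)  # → {'1': ['a'], '2': ['b', 'c']}
```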

http://docs.splunk.com/Documentation/Splunk/5.0/SearchReference/Xpath
http://docs.splunk.com/Documentation/Splunk/5.0/SearchReference/Xmlkv


