Splunk Search

How to create a subsearch based on a parent search?

eitangabay
New Member

I want to create a subsearch based on fields from the parent search. I want to show only rows from cor_inbox_entry whose fullBodID contains keys.OrderID (keys.OrderID is a substring of fullBodID).

Example for fullBodID: infor-nid:infor:111:APRD00908_2022-09-06T12:01:26Z:?ProductionOrder&verb=Process&event=10545

Example for keys.OrderID: APRD00908

index=elbit_im sourcetype=cor_inbox_entry 
| spath input=C_XML output=bod path=ConfirmBOD.DataArea.BOD 
| xpath outfield=fullBodID field=bod "//NameValue[@name='MessageId']" 
| appendpipe
    [ search "metadata.Composite"=ReportOPMes2LN
    | search fullBodID = "*".keys.OrderID."*" ]
| table _time, fullBodID

Any idea?


bowesmana
SplunkTrust

Do you already have an extracted field keys.OrderID?

If you want to search for rows that contain a value for that field, then just add

| search keys.OrderID=*

or 

| where isnotnull('keys.OrderID')

after your xpath statement.
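
For example, keeping your original search, and assuming keys.OrderID is already extracted on the cor_inbox_entry events (your post does not confirm this), the filter would sit here:

index=elbit_im sourcetype=cor_inbox_entry
| spath input=C_XML output=bod path=ConfirmBOD.DataArea.BOD
| xpath outfield=fullBodID field=bod "//NameValue[@name='MessageId']"
| where isnotnull('keys.OrderID')
| table _time, fullBodID

Note the single quotes around keys.OrderID: in where and eval, a field name containing dots has to be single-quoted, otherwise the dot is parsed as the string concatenation operator.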

I am not sure you actually need a subsearch; what you may really want is a second filter applied to the current results in the pipeline.

You can add search or where clauses at any point in the pipeline to filter data as it travels through your search.
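
If the substring test itself is what you need, and both fields exist on the same event, eval's like() function can do it inline (a sketch; note that % is like()'s wildcard, not *, and that . concatenates strings in eval expressions):

| where like(fullBodID, "%" . 'keys.OrderID' . "%")

This only works when fullBodID and keys.OrderID are present on the same event; if keys.OrderID comes from different events, you need a subsearch or a join-style approach instead.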


jdunlea
Contributor

Is there a specific value for keys.OrderID that you are looking for? Or are you trying to extract all the various values for keys.OrderID and then filter the parent data to find all the events which contain those different values for keys.OrderID?
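
If it is the latter, one common pattern is to let a subsearch build the filter for the parent search (a sketch, assuming keys.OrderID is an extracted field on the "metadata.Composite"=ReportOPMes2LN events and that the number of distinct order IDs stays within the default subsearch limits):

index=elbit_im sourcetype=cor_inbox_entry
| spath input=C_XML output=bod path=ConfirmBOD.DataArea.BOD
| xpath outfield=fullBodID field=bod "//NameValue[@name='MessageId']"
| search
    [ search "metadata.Composite"=ReportOPMes2LN
    | dedup keys.OrderID
    | eval fullBodID="*" . 'keys.OrderID' . "*"
    | fields fullBodID
    | format ]
| table _time, fullBodID

The subsearch emits one fullBodID="*<OrderID>*" clause per distinct order ID, and format ORs them together, so the outer search keeps only events whose fullBodID contains one of those values.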
