Splunk Search

How to pass the result of query 1 as an input string for the second query?

kiran007
Explorer

I need to pass the result of query 1 as an input string for the second query.

From the first query I get an output field (x-correlation ID); I want to use that output field (x-correlation ID) as the input for the second query to find the errors.

Both queries are included below. Please help me out with this requirement.

Query 1:

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "bookAppointmentRequest" | fields data.req.headers.xcorrelationid

 

Query 2:

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" <co-relationid of query1> "Error"

 

Note: there can be more than one correlation ID, so I need to loop over all of those IDs, if any.


PickleRick
SplunkTrust

SPL is _not_ a procedural language. While sometimes you can "loop", you shouldn't do that unless you can't avoid it.

And in this case you probably can.

If I understand your searches correctly, you have many events containing the common field data.req.headers.xcorrelationid. Some of them might contain the string "bookAppointmentRequest", some (possibly other ones) might contain "Error". And you only want those errors for which other events with the same data.req.headers.xcorrelationid contain "bookAppointmentRequest".

There are several ways to do it.

The first and most obvious option, though often not the best due to how Splunk works and subsearch limitations, is indeed to use a subsearch to generate a list of the IDs and search only for those IDs. However, with a relatively small subset of events containing the "bookAppointmentRequest" string, it might be the most efficient solution. Be aware, however, that subsearches can fail silently if you exceed the execution time limit or the number of returned results.

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "Error" [ index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "bookAppointmentRequest" | table data.req.headers.xcorrelationid ]

The second option would be to use the transaction command to group all events with the same ID and then search only for the groups that contain errors. It should work, but transaction has its limitations as well and is a relatively "heavy" command.

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs"
| transaction data.req.headers.xcorrelationid
| search "bookAppointmentRequest" "Error"

Unfortunately, since you have to scan all events to find the correlation over xcorrelationid, the search will not be very efficient either way, but the preferable approaches usually involve stats or eventstats.

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs"
| stats list(_raw) by data.req.headers.xcorrelationid
| search "bookAppointmentRequest" "Error

It's a similar approach to the transaction command but it works slightly differently internally.
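
For completeness, here is a rough sketch of the eventstats variant mentioned above (the OR filter on the two strings and the exact field handling are assumptions based on the searches in this thread, so treat it as a starting point rather than a drop-in solution). It flags every correlation ID that has a "bookAppointmentRequest" event and then keeps only the "Error" events for those IDs, preserving the original events instead of grouped rows.

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" ("bookAppointmentRequest" OR "Error")
| eval is_booking=if(searchmatch("bookAppointmentRequest"), 1, 0)
| eventstats max(is_booking) AS has_booking BY data.req.headers.xcorrelationid
| where has_booking=1 AND searchmatch("Error")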

Which approach is best here depends on the actual data.


gcusello
SplunkTrust

Hi @kiran007,

let me check that I understand: you want to filter search 2 using the IDs from search 1, is that correct?

I suppose that the field "data.req.headers.xcorrelationid" is present in both searches and that search 1 (the subsearch) doesn't return more than 50,000 results.

In this case you could try something like this:

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "Error" [ search index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "bookAppointmentRequest" | fields data.req.headers.xcorrelationid ]

 if "data.req.headers.xcorrelationid" is named in a different way in the main search, you have to rename in the subsearch to be sure that the field name is the same in both searches.

If you could have more than 50,000 results from the subsearch, tell me because the solution is completely different.

Ciao.

Giuseppe

isoutamo
SplunkTrust
@gcusello should you add "| format" to the end of the subsearch?
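
(For reference, a sketch of what that would look like, reusing the subsearch from above; Splunk applies the format command to subsearch results implicitly, so adding it explicitly mainly matters if you want to control how the results are turned into a search expression.)

index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "Error" [ search index=cloud_ecp sourcetype="prod-ecp-aks-cluster-logs" "bookAppointmentRequest" | fields data.req.headers.xcorrelationid | format ]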