
How to combine two queries?

kranthimutyala
Path Finder

Hi All,

I have 2 different queries and I want to combine their results. Each query returns a single value, and I want both values in the same search result. Thanks for any help.

index="abc" (TYPE="Run bot finished" OR TYPE="Run bot Deployed") | search STATUS=Successful TYPE="Run bot finished" | stats count | rename count as Success_Count

index="abc" RPAEnvironment="prd" ProcessName="*" LogType="*" TaskName="*Main*" (LogLevel=ERROR OR LogLevel=FATAL)
| eval Time = strftime(_time, "%Y-%m-%d %H:%M:%S")
| eval LogDescription = trim(replace(LogDescription, "'", ""))
| eval LogMessage = trim(replace(LogMessage, "'", ""))
| eval TaskName = trim(replace(TaskName, "'", ""))
| eval host = substr(host,12,4)
| eval Account = if(User!="", User, LoginUser)
| table Time, LogNo, host, Account, LogType, LogMessage, TaskName, ProcessName
| rename LogMessage as "Log Message", TaskName as "Task Name", host as "VDI"
| sort - Time
| stats count
| rename count as Failure_Count

gcusello
SplunkTrust

Hi @kranthimutyala,

Did you already explore the "append" command?

index="abc" (TYPE="Run bot finished" OR TYPE="Run bot Deployed") STATUS=Successful TYPE="Run bot finished"
| stats  count 
| rename count as Success_Count
| append [ search 
     index="abc" RPAEnvironment="prd" ProcessName="*" LogType="*" TaskName="*Main*" (LogLevel=ERROR OR LogLevel=FATAL)
     | eval 
          Time = strftime(_time, "%Y-%m-%d %H:%M:%S"),
          LogDescription = trim(replace(LogDescription, "'", "")),
          LogMessage = trim(replace(LogMessage, "'", "")),
          TaskName = trim(replace(TaskName, "'", "")),
          host=substr(host,12,4),
          Account=if(User!= "" ,User,LoginUser)
     | stats count
     | rename count as Failure_Count ]

In this way you'll have the results of the two searches in two different rows of the same table; if you want to have them in a single row you can use "transpose".
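
For reference, a standalone toy example of what "transpose" does with two appended single-value rows (this uses makeresults so it can be pasted into any search bar; the counts 10 and 3 are just made-up values, not from the searches above):

| makeresults
| eval Success_Count=10
| table Success_Count
| append
    [| makeresults
     | eval Failure_Count=3
     | table Failure_Count]
| transpose column_name=Metric

This gives one row per metric and one column per original row; if you want both counts as columns on a single row instead, the stats values() variant shown further down in this thread does that directly.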

Ciao.

Giuseppe

MrIncredible
Explorer

@gcusello @PickleRick And if there are 3 or more queries whose results need to be combined, will append still work?


gcusello
SplunkTrust

Hi @MrIncredible,

Yes, but only if the results of each subsearch are fewer than 50,000.
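
For example, the pattern just repeats, with one append per extra query (a rough sketch only; the index name and the filter placeholders below are not from the original post):

index="your_index" <filters for first query>
| stats count
| rename count as Count_1
| append [ search index="your_index" <filters for second query> | stats count | rename count as Count_2 ]
| append [ search index="your_index" <filters for third query> | stats count | rename count as Count_3 ]

Each append adds one more row, and each bracketed subsearch is subject to the subsearch limits, so every one of them needs to stay under the 50,000-result cap.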

If one answer solves your need, please accept it for the other people in the Community, or tell us how we can help you further.

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated by all the Contributors;-)

pavi1611
Engager

What if it has more than 50k events?

index="aaam_devops_elasticsearch_idx"
| search project = Einstein360_TicketsCreated_ElasticSearch_20210419
| search source.TransactionName ="ITGTicketCreated"
| dedup id
| timechart span=1d count as ITGTicketCreated
| join max=0 _time [| search index="aaam_devops_elasticsearch_idx"
| search project = Einstein360_TruckRollCreated_ElasticSearch_20210420
| search source.TransactionName = "Truck_Roll_Create_Result"
| timechart span=1d dc(id) as TruckRollCreated]

| stats values(ITGTicketCreated) as ITGTicketCreated values(TruckRollCreated) as TruckRollCreated by _time

 


PickleRick
SplunkTrust
SplunkTrust

By default (unless the limit is reconfigured) join uses only 50000 results from the subsearch to join with the outer search and runs for only 60 seconds to generate those results. If your subsearch exceeds 60 seconds of execution time or generates more than 50k results, it's silently finalized and only the results returned so far (up to 50k) are used for join.

With other subsearch uses, the limits can be lower - even down to 10k results.

That's one of the reasons subsearches are best avoided - since they are silently finalized, you get no feedback that the subsearch was cut short, so you may not be aware that your final results are incomplete or plain wrong.
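
For reference, those limits live in limits.conf; the stanzas below show the commonly cited defaults, but they can be tuned per deployment, so treat this as a sketch and check your own instance rather than relying on it:

# limits.conf (typical defaults; verify on your own deployment)
[join]
subsearch_maxout = 50000
subsearch_maxtime = 60

[subsearch]
maxout = 10000
maxtime = 60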


dtburrows3
Builder

If a subsearch returns more than 50,000 events or takes longer than 1 minute (I think) to run, it will auto-finalize. Either of these scenarios will cause the data returned from the subsearch to be truncated and incomplete.

BTW, I don't think the search you shared needs a join/subsearch; something like this will probably do the same thing.

index="aaam_devops_elasticsearch_idx" ((project="Einstein360_TicketsCreated_ElasticSearch_20210419" AND "source.TransactionName"="ITGTicketCreated") OR (project="Einstein360_TruckRollCreated_ElasticSearch_20210420" AND "source.TransactionName"="Truck_Roll_Create_Result"))
    | timechart span=1d
        dc(eval(case('project'=="Einstein360_TicketsCreated_ElasticSearch_20210419" AND 'source.TransactionName'=="ITGTicketCreated", id))) as ITGTicketCreated,
        dc(eval(case('project'=="Einstein360_TruckRollCreated_ElasticSearch_20210420" AND 'source.TransactionName'=="Truck_Roll_Create_Result", id))) as TruckRollCreated



pavi1611
Engager

Thanks a lot... It's working 🙂 🙂


MrIncredible
Explorer

Thanks @gcusello, append works more than once as well. My results appear as in the attached screenshot (3 rows * 3 columns). I tried the transpose command to get this into a better format, but somehow I'm not able to remove the last 2 columns. What I'm looking for is the values shown as 3 rows & 2 columns. Is that possible with the transpose command? Can you help please?

MrIncredible_0-1672990843829.png

 


gcusello
SplunkTrust
SplunkTrust

Hi @MrIncredible,

Please share your search, and next time open a new question instead of appending your request to another question.

Ciao.

Giuseppe



kranthimutyala
Path Finder

@gcusello I want to do an eval calculation like Success_Count - Failure_Count but it's not working.


gcusello
SplunkTrust

@kranthimutyala,

In this case, you have to put both results on the same row, and then you can use eval, something like this:

index="abc" (TYPE="Run bot finished" OR TYPE="Run bot Deployed") STATUS=Successful TYPE="Run bot finished"
| stats  count 
| rename count as Success_Count
| append [ search 
     index="abc" RPAEnvironment="prd" ProcessName="*" LogType="*" TaskName="*Main*" (LogLevel=ERROR OR LogLevel=FATAL)
     | eval 
          Time = strftime(_time, "%Y-%m-%d %H:%M:%S"),
          LogDescription = trim(replace(LogDescription, "'", "")),
          LogMessage = trim(replace(LogMessage, "'", "")),
          TaskName = trim(replace(TaskName, "'", "")),
          host=substr(host,12,4),
          Account=if(User!= "" ,User,LoginUser)
     | stats count
     | rename count as Failure_Count ]
| stats values(Failure_Count) AS Failure_Count values(Success_Count) AS Success_Count
| eval diff=Success_Count-Failure_Count

Ciao.

Giuseppe


gcusello
SplunkTrust

Hi @kranthimutyala,

Sorry, I forgot to add one piece of information: as @PickleRick said, you can use a subsearch only if you have fewer than 50,000 results.

Ciao.

Giuseppe


PickleRick
SplunkTrust

I would be very cautious with appending such a wildcarded subsearch. Especially over a longer time period, the subsearch has a decent chance of timing out.

Since both searches pull from the same index, I'd try some conditional evaluation.

But the other, more "high-level", question is: if one of the searches gives the Success count and the other gives the Failure count, is there any additional category of events? Or can they be split more easily between those two categories? (I don't suppose an event can be both Success and Failure.)
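
A rough sketch of that conditional-evaluation idea, reusing the field names from the two original searches (the filters and comparisons are assumptions to illustrate the pattern, not a tested rewrite):

index="abc" ((TYPE="Run bot finished" STATUS=Successful) OR (RPAEnvironment="prd" TaskName="*Main*" (LogLevel=ERROR OR LogLevel=FATAL)))
| stats
    count(eval(case('TYPE'=="Run bot finished" AND 'STATUS'=="Successful", 1))) as Success_Count,
    count(eval(case('LogLevel'=="ERROR" OR 'LogLevel'=="FATAL", 1))) as Failure_Count
| eval diff=Success_Count-Failure_Count

Note that the eval comparisons are case-sensitive, unlike the search-time filters, so the literal values may need adjusting to match the actual events.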


kranthimutyala
Path Finder

@PickleRick Yes, the two events are different. And at the end I'm looking to take the difference between the Success count and the Failure count, but it's not working with the query shared by @gcusello.
