I am just a beginner at writing Splunk search queries and I really need your help with the case below.
Let's say one of the batches triggers a request "123" at 11 AM, and this is captured in the log abc.log (source=abc.log) as "request 123 successfully sent". After this message is generated, some validation takes place; if any issues are found during validation (let's assume "transaction not completed and it is being rolled back"), that is logged in the error log xyz.log (source=xyz.log).
Now I want to trigger an alert when both of the conditions below are met (time period 10:50 AM to 11:15 AM):
1) request 123 successfully sent (source=abc.log)
2) transaction not completed and it is being rolled back (source=xyz.log)
If I am understanding your question correctly, there are several ways to do this. The easiest is most likely the transaction command, which correlates multiple events from the pool of events returned by a search. It also lets you specify which event a transaction should start and end with, and how much time may pass before events are no longer considered part of the same "group".
Without any additional specifics, I would imagine your search would look something like this:
index=<some_index> (source=abc.log OR source=xyz.log)
| transaction user startswith="successfully sent" endswith="transaction not completed"
| where eventcount>1
If your startswith and endswith criteria are contained in a field, you can make the search more specific by matching on that field, e.g. startswith=(some_field="successfully sent"). You can also add the maxspan or maxpause options if you need to bound the transaction in time.
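For example, assuming a hypothetical request_id field shared by both logs and a hypothetical message field holding the log text, a tightened version of the search might look like the sketch below (your 10:50-11:15 window would be set via the alert's schedule and time range picker rather than in the search itself):

index=<some_index> (source=abc.log OR source=xyz.log)
| transaction request_id maxspan=25m startswith=(message="*successfully sent*") endswith=(message="*transaction not completed*")
| where eventcount>1

Here maxspan=25m discards any grouping where the two events are more than 25 minutes apart, which matches the width of your alert window.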
The "user" field here is an assumption; replace it with whatever field the two events have in common that you want to group on.
The where clause keeps only transactions containing more than one event, i.e. those where both conditions were met.
Beware that transaction can be resource intensive. If you are piping a large number of events into it, you may want to group similar events with the stats command first to reduce the number of rows. If you take this approach, you must restore descending time order with the sort command before piping the results into transaction, because transaction expects events in reverse time order (most recent first).
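As a rough sketch of that approach (again assuming a hypothetical request_id field), you could condense the events per request and source with stats, restore descending time order, and only then run transaction:

index=<some_index> (source=abc.log OR source=xyz.log)
| stats latest(_time) as _time latest(_raw) as _raw by request_id, source
| sort - _time
| transaction request_id startswith="successfully sent" endswith="transaction not completed"
| where eventcount>1

The latest(_raw) as _raw step keeps the raw message text so that the plain-string startswith/endswith filters still have something to match against.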