Hello,
I need to report on a set of, let's say, 4 different jobs, regardless of whether there are event results for each one. If no events are found for one of the jobs, I need to evaluate its status as "not complete".
For example, here is my search:
index= source=
jobName=A OR
jobName=B OR
jobName=C OR
jobName=D
| stats latest(statusText) as latestStatus latest(filename) as latestFilename latest(timestamp) as latestTimestamp by jobName
| eval feedStatus=if(latestStatus!="SUCCESS", "Filewatch Not complete - Feed may not have been received", "Filewatch Complete")
| table jobName latestFilename latestStatus feedStatus latestTimestamp
Now I may get results for A, B, and C, and they will display in my table. But there may be no events found for job D, and nothing will be displayed for that job, when in fact it should appear with a message evaluating it as incomplete. How can I detect that there were no search results for this particular job and display it in the same table?
I tried taking a count of the search results for each job to evaluate the message when count=0, but nothing displays if no events for that particular jobName were found.
Hi,
this is a common problem, since Splunk does not return any event if "nothing" happened, which in your case is a failure.
You can implement a workaround if you have a fixed, static set of jobNames to report on, as it looks like in your search.
For example, at the end of your search, append a dummy event for every jobName, e.g. | append [| makeresults | eval jobName="A" | eval latestStatus="Dummy"]. Then count with streamstats by jobName. If there is a result from the index, the count will be 2 (the real event plus the dummy event). If there is no event in the index, the count is 1 (only the dummy event is left).
From there, you can manipulate the results with a little eval and if logic to get your desired result.
The exact solution depends on your events and results, but it is a way to work around the problem.
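To make this concrete, here is a sketch of how the workaround could look with your search (the index/source filters are left as placeholders from your post; the single makeresults subsearch with mvexpand is just one way to generate one dummy row per jobName):

```
index= source=
    jobName=A OR jobName=B OR jobName=C OR jobName=D
| stats latest(statusText) as latestStatus latest(filename) as latestFilename latest(timestamp) as latestTimestamp by jobName
| append
    [| makeresults
     | eval jobName="A,B,C,D", latestStatus="Dummy"
     | makemv delim="," jobName
     | mvexpand jobName
     | fields jobName latestStatus]
| streamstats count by jobName
| where count=1
| eval feedStatus=if(latestStatus=="SUCCESS", "Filewatch Complete", "Filewatch Not complete - Feed may not have been received")
| table jobName latestFilename latestStatus feedStatus latestTimestamp
```

The real rows from stats come first, so each gets count=1 from streamstats; a dummy row gets count=2 when a real result exists for that jobName (and is dropped by the where), or count=1 when the job had no events (and is kept). Since the kept dummy rows carry latestStatus="Dummy", which is not "SUCCESS", the existing if() already marks those jobs as not complete.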
Greetings
Tom
Thank you, this was very helpful.