I have seven jobs that run at regular intervals, and I can see them in Splunk. However, when I use the search string below to tell me when fewer than 7 jobs have run, I get nothing. I was hoping to have this trigger when one or more of the jobs didn't run.
Here is the search string...
index="qlepiqure2_prod_admin*" source="/opt/hybris/hybris/log/tomcat/console*" qantasD*job | stats count by jobname | eval flag=case(jobname=="qantasDhlAsnConfirmationImportJob",1,jobname=="qantasDhlInventoryAdjustmentImportJob",1, jobname=="qantasDhlItemMasterExportJob",1,jobname=="qantasDhlSalesOrderAckImportJob",1,jobname=="qantasDhlSalesOrderConfImportJob",1,jobname=="qantasDhlSalesOrderExportJob",1,jobname=="qantasDhlStockOnHandImportJob",1)|eventstats sum(flag) as jobtotal | search jobtotal < 7
Try this. Trigger your alert when flag (the distinct job count) is less than 7:
index="qlepiqure2_prod_admin*" source="/opt/hybris/hybris/log/tomcat/console*" qantasD*job | stats dc(jobname) as flag
Hi William,
Can you please let me know how you set up the integration between Hybris and Splunk? Did you use a third-party connector or some other way? We have a similar requirement, so your answer would be helpful.
Thanks
Ravi