Splunk Search

Why is loadjob returning null?

dezmadi
Path Finder

I have the below query, which is returning null

 

<search id="dfLatencyOverallProcessingDelayBaseSearch">
<query>index="deng03-cis-dev-audit"
| eval serviceName = mvindex(split(index, "-"), 1)."-".mvindex(split(host, "-"), 2)
| search "data.labels.activity_type_name"="ViolationOpenEventv1"
| spath PATH=data.labels.verbose_message output=verbose_message
| where verbose_message like "%overall_processing_delay%Dataflow Job labels%"
| eval error=case(like(verbose_message,"%is above the threshold of 60.000%"), "warning", like(verbose_message,"%is above the threshold of 300.000%"), "failure")</query>
<earliest>$time.earliest$</earliest>
<latest>$time.latest$</latest>
<sampleRatio>1</sampleRatio>
<done>
<condition>
<set token="dfLatencyOverallProcessingDelay_sid">$job.sid$</set>
</condition>
</done>
</search>

Then

SomeQuery
| append [ loadjob $dfLatencyOverallProcessingDelay_sid$
  | eval alertName = "Dataflow-Latency-Overall processing high delay"
  | stats values(alertName) as AlertName values(serviceName) as serviceName count(eval(error=="failure")) as failureCount count(eval(error=="warning")) as warningCount]

If the results from dfLatencyOverallProcessingDelay_sid are null, then AlertName also comes back blank. I want it to still be "Dataflow-Latency-Overall processing high delay".
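
As a minimal sketch of the symptom (makeresults and where here just stand in for a loadjob that returns no events):

| makeresults | where 1==2
| eval alertName = "Dataflow-Latency-Overall processing high delay"
| stats values(alertName) as AlertName count(eval(error=="failure")) as failureCount count(eval(error=="warning")) as warningCount

Because the eval and stats never see any events, AlertName ends up empty.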

1 Solution

ITWhisperer
SplunkTrust

Try something like this

append [ loadjob $dfLatencyOverallProcessingDelay_sid$
| eval alertName = "Dataflow-Latency-Overall processing high delay"
| stats values(alertName) as AlertName values(serviceName) as serviceName count(eval(error=="failure")) as failureCount count(eval(error=="warning")) as warningCount
| appendpipe [stats count as nullcount | where nullcount = 0 | eval AlertName = "Dataflow-Latency-Overall processing high delay"]]
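
To see the fallback pattern on its own, here is a minimal sketch, with makeresults and where simulating a pipeline that returns nothing (the field name is just for illustration):

| makeresults | where 1==2
| appendpipe [stats count as nullcount | where nullcount = 0 | eval AlertName = "Dataflow-Latency-Overall processing high delay"]
| fields - nullcount

The stats inside the appendpipe counts the rows flowing into it; only when that count is 0 does the where let the placeholder row through. So the default AlertName row appears when the main results are empty, and nothing extra is appended when they are not.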
