Splunk Search

Running a loop on a search that takes one Id at a time

Rajyalakshmi
Explorer

Hi,

I have a lookup file that contains multiple Ids, and a search that takes one Id at a time and returns results. My requirement is to run this search for every Id in the lookup file and write a field from the results back into the lookup file.

I've tried "map" command but its not working, do we have any limitation for this map command? suggest me any approach for implementing my requirement.

Thank you!

Rajyalakshmi Alluri

Rajyalakshmi
Explorer

Here is the problem I'm facing with the "predict" command: I used the map command as per your suggestion above, and it dynamically predicts values for different IDs. However, the "predict" command is not retaining the ID value; it just gives the predicted value without the ID.

Is there a way to get the predicted outcome along with its corresponding ID? Please suggest... I need to set alerts for different IDs when the predicted outcome value is >= 0.5.

Awaiting your response.

Thank you!
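
One possible approach for the alert side (an untested sketch, assuming the map pipeline writes the predicted value back to the lookup as an Outage column, as in the searches later in this thread): once the lookup is enriched, the alert can be a simple search over it.

| inputlookup AirId_List.csv
| where tonumber(Outage) >= 0.5
| table AirId Outage

Saved as an alert that triggers when the result count is greater than zero, this would fire for every Id whose predicted value is at or above 0.5. Keeping the Id next to the prediction inside the subsearch is a separate problem; because the subsearch runs under map, the $ID$ token can be evaluated back onto the results after predict (for example | eval AirId="$ID$").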

Rajyalakshmi
Explorer

Hi,

I'm awaiting your response to the above problem... kindly help me with a solution.

ericjorgensenjr
Path Finder

Something like the below should work.

| inputlookup yourlookup append=f
| map
    [ search yoursearch id=$field_from_lookup$
    | stats values(field_you_want_to_add) as name_of_field_in_lookup by id ]
    maxsearches=10000
| lookup yourlookup id
| outputlookup yourlookup append=f

Rajyalakshmi
Explorer

Do "map" command has any limitation? my search query has multiple tables joined having sub-searches under it. Its giving me below error:

Error in 'table' command: Invalid argument: 'AirId=$ID$'
The search job has failed due to an error. You may be able to view the job in the Job Inspector.
 
ericjorgensenjr
Path Finder

Can you paste the SPL here? It will be easier to pinpoint the issue.

Rajyalakshmi
Explorer

As requested, please find the SPL below:

| inputlookup AirId_List.csv append=f | table AirId | rename AirId as ID
| map
[ search `itsi_event_management_group_index`
| eval id=mvindex(split(itsi_group_title,"_"),1)
| eval AirId=if(id=="KPI",mvindex(split(itsi_group_title,"_"),2),id)
| rex field=itsi_group_title "_\d+_(?<KPI>.*)"
| search AirId="$ID$" KPI IN ("*Appdynamics CPU/Memory*")
| stats values(itsi_first_event_id) as itsi_first_event_id values(itsi_first_event_time) as itsi_first_event_time latest(event_id) as event_id earliest(itsi_group_severity) as Severity count by AirId itsi_group_id itsi_group_title KPI
| eval Severity=case(Severity=1,"Information",Severity=2,"Normal",Severity=3,"Low",Severity=4,"Medium",Severity=5,"High",Severity=6,"Critical")
| rename itsi_group_title AS title
| where isnum(AirId)
| join type=outer event_id
[ search index="itsi_tracked_alerts"
| table event_id search CIName HealthRuleName PublisherEventSubType]
| eval Event_Created_Time =strftime(itsi_first_event_time, "%Y-%m-%d %H:%M:%S")
| table Event_Created_Time title AirId KPI search CIName HealthRuleName PublisherEventSubType episode_severity Severity itsi_group_id
| eval _raw=replace(search,"CIName=\"","Details=\" ###CIName: ")
| eval _raw=replace(_raw,"\n","###")
| extract pairdelim="" kvdelim="="
| fields - PublisherEventSubType
| eval _raw=replace(Details,"=",":")
| extract pairdelim="###" kvdelim=":"
| fields - _raw,Details,Event_Details
| table Event_Created_Time title AirId KPI search CIName HealthRuleName PublisherEventSubType episode_severity Severity itsi_group_id
| rename itsi_group_id as event_id
| join type=outer
[ search index=itsi_notable_audit "*severity*" sourcetype=itsi_notable:audit
| eval episode_severity=mvindex(split(mvindex(split(activity," "),2),"="),1)
| stats latest(episode_severity) as episode_severity by event_id ]
| eval episode_severity = if(isnull(episode_severity),Severity,episode_severity )
| rename Severity as Initial_Severity , episode_severity as Latest_Severity
| table Event_Created_Time title AirId KPI Initial_Severity Latest_Severity
| eval eventdate_secs=strptime(Event_Created_Time,"%Y-%m-%d")
| eval eventdate=strftime(eventdate_secs,"%Y-%m-%d")
| stats count as Episode_Count by AirId KPI eventdate
| sort -Episode_Count
| eval _time=strptime(eventdate,"%Y-%m-%d")
| timechart span=1d count(Episode_Count) as Episode
| fillnull
| predict "Episode" as Prediction algorithm=LLP5 holdback=0 future_timespan=2 upper95=upper95 lower95=lower95
| `forecastviz(10, 0, "Episode", 95)`
| eval Prediction=if(Prediction<0,0.0,round(Prediction,1))
| table _time Episode Prediction
AirId=$ID$
| stats values(Prediction) as Outage by AirId] maxsearches=10000
| lookup AirId_List.csv AirId
| outputlookup AirId_List.csv append=f

ericjorgensenjr
Path Finder

Your error is here, towards the bottom:

| table _time Episode Prediction
AirId=$ID$

I think you should change it to:

| table _time Episode Prediction AirId
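
One more thing worth checking (untested): timechart keeps only _time and the charted series, so AirId has most likely already been dropped by that point, and adding it to the table command may just produce an empty column. Because the subsearch runs under map, the $ID$ token can be put back explicitly after predict. A sketch of the subsearch tail, where only the eval AirId="$ID$" line is new (the forecastviz macro and the final table are left out here, since the stats output is what feeds the lookup):

| timechart span=1d count(Episode_Count) as Episode
| fillnull
| predict "Episode" as Prediction algorithm=LLP5 holdback=0 future_timespan=2 upper95=upper95 lower95=lower95
| eval Prediction=if(Prediction<0,0.0,round(Prediction,1))
| eval AirId="$ID$"
| stats values(Prediction) as Outage by AirId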

Rajyalakshmi
Explorer

No luck... it didn't work. Here is the error I'm facing even after following the suggestion:

[subsearch]: command="predict", No data

This is the latest query I'm using, based on your suggestion:

| inputlookup AirId_List.csv append=f | table AirId | rename AirId as ID
| map
[ search `itsi_event_management_group_index`
| eval id=mvindex(split(itsi_group_title,"_"),1)
| eval AirId=if(id=="KPI",mvindex(split(itsi_group_title,"_"),2),id)
| rex field=itsi_group_title "_\d+_(?<KPI>.*)"
| search AirId="$ID$" KPI IN ("*Appdynamics CPU/Memory*")
| stats values(itsi_first_event_id) as itsi_first_event_id values(itsi_first_event_time) as itsi_first_event_time latest(event_id) as event_id earliest(itsi_group_severity) as Severity count by AirId itsi_group_id itsi_group_title KPI
| eval Severity=case(Severity=1,"Information",Severity=2,"Normal",Severity=3,"Low",Severity=4,"Medium",Severity=5,"High",Severity=6,"Critical")
| rename itsi_group_title AS title
| where isnum(AirId)
| join type=outer event_id
[ search index="itsi_tracked_alerts"
| table event_id search CIName HealthRuleName PublisherEventSubType]
| eval Event_Created_Time =strftime(itsi_first_event_time, "%Y-%m-%d %H:%M:%S")
| table Event_Created_Time title AirId KPI search CIName HealthRuleName PublisherEventSubType episode_severity Severity itsi_group_id
| eval _raw=replace(search,"CIName=\"","Details=\" ###CIName: ")
| eval _raw=replace(_raw,"\n","###")
| extract pairdelim="" kvdelim="="
| fields - PublisherEventSubType
| eval _raw=replace(Details,"=",":")
| extract pairdelim="###" kvdelim=":"
| fields - _raw,Details,Event_Details
| table Event_Created_Time title AirId KPI search CIName HealthRuleName PublisherEventSubType episode_severity Severity itsi_group_id
| rename itsi_group_id as event_id
| join type=outer
[ search index=itsi_notable_audit "*severity*" sourcetype=itsi_notable:audit
| eval episode_severity=mvindex(split(mvindex(split(activity," "),2),"="),1)
| stats latest(episode_severity) as episode_severity by event_id ]
| eval episode_severity = if(isnull(episode_severity),Severity,episode_severity )
| rename Severity as Initial_Severity , episode_severity as Latest_Severity
| table Event_Created_Time title AirId KPI Initial_Severity Latest_Severity
| eval eventdate_secs=strptime(Event_Created_Time,"%Y-%m-%d")
| eval eventdate=strftime(eventdate_secs,"%Y-%m-%d")
| stats count as Episode_Count by AirId KPI eventdate
| sort -Episode_Count
| eval _time=strptime(eventdate,"%Y-%m-%d")
| timechart span=1d count(Episode_Count) as Episode
| fillnull
| predict "Episode" as Prediction algorithm=LLP5 holdback=0 future_timespan=2 upper95=upper95 lower95=lower95
| `forecastviz(10, 0, "Episode", 99)`
| eval Prediction=if(Prediction<0,0.0,round(Prediction,1))
| table _time Episode Prediction AirId
| stats values(Prediction) as Outage by AirId] maxsearches=10000
| lookup AirId_List.csv AirId
| outputlookup AirId_List.csv append=f

Kindly help me with the solution.

Thank you!
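
One possible cause of the "No data" error (an assumption, not confirmed in this thread): if some AirIds in the lookup have no matching events in the search time range, the timechart feeding predict returns no rows for those map iterations, and predict fails. A sketch of a pre-filter that keeps only the Ids that actually appear in the event data before map runs, reusing the same AirId derivation as the subsearch:

| inputlookup AirId_List.csv append=f
| table AirId
| search
    [ search `itsi_event_management_group_index`
    | eval id=mvindex(split(itsi_group_title,"_"),1)
    | eval AirId=if(id=="KPI",mvindex(split(itsi_group_title,"_"),2),id)
    | where isnum(AirId)
    | stats count by AirId
    | fields AirId ]
| rename AirId as ID

The existing map / lookup / outputlookup part of the search would then follow unchanged.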
