Deployment Architecture

How to add additional columns?

Miky
Explorer

Hi folks,

I just started using Splunk recently and I'm stuck on an alert I want to create. I've been told to add a priority column (P1, P2, P3) and a job alias from the pw_job_mopping lookup to an existing alert as additional columns.

Any help will be appreciated 🙏.

 


gcusello
SplunkTrust

Hi,

Your request is a little vague; could you share your search and the fields of your lookup?

Anyway, if your lookup has at least two columns, one for "priority" and one for "job", you could create something like @martinpu hinted:

your_search
| lookup pw_job_mopping.csv job OUTPUT priority
| table _time job priority

Ciao.

Giuseppe



Miky
Explorer

@gcusello it works, but the problem is that I'm getting only priority 1, not priority 2 and priority 3.

To be specific, the priority field contains the values 1, 2, and 3, but I'm only getting priority 1.
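If the lookup holds one row per priority for the same job, the lookup command returns priority as a multivalue field, and a plain table may appear to show only the first value. One possible sketch (field names assumed from this thread, not verified against the actual lookup) that expands each matched priority into its own row:

your_search
| lookup pw_job_mopping.csv job OUTPUT priority
| mvexpand priority
| table _time job priority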

 


gcusello
SplunkTrust

Hi @Miky,

good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated by all the contributors 😉


Miky
Explorer

Thanks,


martinpu
Communicator
index=A sourcetype=B source=C
| rename queue as Queue
| lookup local=true pw_map.csv Queue
| where queue_disabled!="Y"
| eval queue_depth_threshold=mvdedup(queue_depth_threshold)
| eval over_threshold=if(curdepth>queue_depth_threshold,1,0)
| stats latest(curdepth) as curdepth
    first(curdepth) as firstCur
    count as event_count
    sum(over_threshold) as over_threshold
    latest(avg_queue_depth) as avg_queue_depth
    latest(Priority) as Priority
    latest(Owner) as queue_owner
    latest(queue_depth_threshold) as queue_depth_threshold
    latest(Team) as Team
    latest(mt_impact) as mt_impact
    latest(mt_impact_desc) as mt_impact_desc
    latest(mt_impact_priority) as mt_impact_priority
    latest(functional_impact) as functional_impact
    by Queue hostname
| where over_threshold>=3 AND curdepth>queue_depth_threshold
| eval Status=case((curdepth-firstCur)>0, "Queue Depth Increasing",
    (curdepth-firstCur)<0, "Queue Depth Decreasing",
    (curdepth-firstCur)=0, "Queue Depth Stagnant")
| where Status!="Queue Depth Decreasing"
| lookup local=true pw_map.csv Queue

 

If you add that lookup to the end, does that solve your issue?

It matches the Queue field in the lookup against Queue in the results and outputs all fields from the lookup.
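If only a few columns from the lookup are needed rather than every field, the output can be limited with an explicit OUTPUT clause. A sketch under the assumption that the lookup contains Priority and Owner columns (names not confirmed in this thread):

| lookup local=true pw_map.csv Queue OUTPUT Priority Owner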

 

 


Miky
Explorer

It gives: Error in lookup command: can not find the source field 'job' in the lookup table 'pw_job_map.csv'.


martinpu
Communicator

Are there any fields in your lookup that match a field in the result table?


martinpu
Communicator

Add this to your query:

| lookup your_lookup_file.csv pw_job_mopping

This command assumes pw_job_mopping exists as a field in both the lookup and your query; it matches identical values and enriches the results with all fields from the lookup.

If you only want the priority field, add OUTPUTNEW priority_field AS priority to the end of that line.
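Putting those two steps together, a hedged sketch (your_lookup_file.csv, pw_job_mopping, and priority_field are placeholders from this thread, not confirmed names from the poster's environment):

your_search
| lookup your_lookup_file.csv pw_job_mopping OUTPUTNEW priority_field AS priority
| table _time pw_job_mopping priority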


This page has additional documentation on the lookup command:
https://docs.splunk.com/Documentation/Splunk/9.0.0/SearchReference/Lookup
