Dashboards & Visualizations

SPL

uagraw01
Motivator

Hello Splunkers!!

 

I want to compare 30, 60 and 90 days of data in Splunk on the basis of APP_ID. I have written the SPL below for 30 days. Please let me know the quickest correct approach to write the SPL for comparing the three periods of data.

1. Should I use the join command and repeat the same kind of search for 60 & 90 days?

 

index="ito_snow" sourcetype=csv source="/opt/splunk/etc/apps/splunk_snow_tickets/bin/open_inc.sh" assignment_group="*" cmdb_ci=*
earliest=-30d
| rename cmdb_ci as Hostname state as incident_state number as Incident_Number
| join type=outer Hostname
[| inputlookup LocationMapping.csv
| search Type=MFG OR Type=Mfg ]
| join type=outer Hostname
[| inputlookup abc.csv
| search status="decom"
| eval Hostname=lower(trim(target,"*"))]
| join type=outer Hostname
[| inputlookup yzN.csv
| search SERVERS=*
| rename SERVERS as Hostname]
| where isnull(status) AND incident_state!="Resolved"
| search OneSourceCode="*"
| eval APP_ID=if(isnull(APP_ID),"Not Mapped",APP_ID), APP_NAME=if(isnull(APP_NAME),"Not Mapped",APP_NAME) , BU=if(isnull(BU),"Not Mapped",BU)
| eval APP_DETAILS = APP_NAME."(".APP_ID.")"
| table Incident_Number, BU, APP_DETAILS, short_description, assignment_group, incident_state, Hostname, Location, OneSourceCode, Environment, Source, Type, opened_at
| stats dc(Incident_Number) as "incident Count" by APP_DETAILS
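For reference, one common alternative to three joined searches is to pull the full 90 days once, label each event with its period using eval, and aggregate by that label. A minimal sketch, assuming non-overlapping 30-day buckets and reusing the field names from the search above (the base search is abbreviated as <<your_search>>):

```
<<your_search>> earliest=-90d
| eval age_days=floor((now()-_time)/86400)
| eval period=case(age_days<30, "0-30d", age_days<60, "30-60d", age_days<90, "60-90d")
| stats dc(Incident_Number) as "incident Count" by APP_DETAILS period
```

If the comparison should instead be cumulative (last 30 vs. last 60 vs. last 90 days, overlapping), the period labels would need a different construction, e.g. counting each event once per window it falls into.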


PickleRick
SplunkTrust

I think the easiest way (though I'm not sure it's the most efficient one) to obtain stats from various points in time is to add an incremental stats value using streamstats and then just filter out three points in time. Something like:

<<your_search>> | streamstats dc(Incident_Number) as "incident Count" by APP_DETAILS

Then you can do

| timechart span=30d latest("incident Count") by APP_DETAILS

This should be the easiest one because you don't need to run three different searches to calculate three distinct sets of values and then append them together (and it would be append, not join, that you'd want anyway).
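Putting the two snippets above together, a sketch of the full pipeline might look like this (an assumption on my part: streamstats needs events in ascending time order for the running count to make sense, hence the sort; <<your_search>> stands for the poster's base search over 90 days):

```
<<your_search>> earliest=-90d
| sort 0 _time
| streamstats dc(Incident_Number) as "incident Count" by APP_DETAILS
| timechart span=30d latest("incident Count") by APP_DETAILS
```

Each 30-day column then holds the running distinct incident count as of the end of that window, per APP_DETAILS.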

 


andrew_m_streic
New Member

There are a couple of ways of doing this. I would recommend bringing in all 90 days and then aggregating. You could use something like bin (also known as bucket). You could also use something like rangemap to break the events into 30-day groups that are easier to find. Another option would be an eval based on the event's age, something like | eval period=floor((now()-_time)/2592000) (2,592,000 seconds = 30 days).
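The bin suggestion above could be sketched like this (not tested against the poster's data; <<your_search>> stands for the base search, and the field names follow the original question):

```
<<your_search>> earliest=-90d
| bin _time span=30d
| stats dc(Incident_Number) as "incident Count" by APP_DETAILS _time
```

bin rounds each event's _time down to the start of its 30-day bucket, so the stats output has one row per APP_ID/period combination.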


uagraw01
Motivator

@Andrew I think rangemap will not work here.
