uagraw01
Builder

Hello Splunkers!!

 

I want to compare 30 days, 60 days and 90 days of data in Splunk on the basis of APP_ID. I have written the below SPL for 30 days. Please let me know the quick and correct approach to write the SPL for comparing the three periods of data.

1. Should I use the join command and repeat the same kind of search for 60 & 90 days?

 

index="ito_snow" sourcetype=csv source="/opt/splunk/etc/apps/splunk_snow_tickets/bin/open_inc.sh" assignment_group="*" cmdb_ci=*
earliest=-30d
| rename cmdb_ci as Hostname state as incident_state number as Incident_Number
| join type=outer Hostname
[| inputlookup LocationMapping.csv
| search Type=MFG OR Type=Mfg ]
| join type=outer Hostname
[| inputlookup abc.csv
| search status="decom"
| eval Hostname=lower(trim(target,"*"))]
| join type=outer Hostname
[| inputlookup yzN.csv
| search SERVERS=*
| rename SERVERS as Hostname]
| where isnull(status) AND incident_state!="Resolved"
| search OneSourceCode="*"
| eval APP_ID=if(isnull(APP_ID),"Not Mapped",APP_ID), APP_NAME=if(isnull(APP_NAME),"Not Mapped",APP_NAME) , BU=if(isnull(BU),"Not Mapped",BU)
| eval APP_DETAILS = APP_NAME."(".APP_ID.")"
| table Incident_Number, BU, APP_DETAILS, short_description, assignment_group, incident_state, Hostname, Location, OneSourceCode, Environment, Source, Type, opened_at
| stats dc(Incident_Number) as "incident Count" by APP_DETAILS


PickleRick
SplunkTrust

I think the easiest way (though I'm not sure it's the most efficient one) to obtain stats from various points in time is to add an incremental stats value using streamstats and then just filter out three points in time. Something like:

<<your_search>> | streamstats dc(Incident_Number) as "incident Count" by APP_DETAILS

Then you can do

| timechart span=30d latest("incident Count") by APP_DETAILS

This should be the easiest approach because you don't need to run three different searches to calculate three distinct sets of values and then append them together (and it's append, not join, that you would want for that anyway).
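Putting the two commands together, a rough sketch (untested — <<your_search>> stands for the original base search, widened to earliest=-90d; note that streamstats processes events in the order the search returns them, which is newest-first by default, hence the sort):

<<your_search>> earliest=-90d
| sort 0 _time
| streamstats dc(Incident_Number) as "incident Count" by APP_DETAILS
| timechart span=30d latest("incident Count") by APP_DETAILS

This gives one row per 30-day span with the running distinct count per APP_ID at the end of that span, so the three periods line up side by side without any join.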

 


andrew_m_streic
New Member

There are a couple of ways of doing this. I would recommend bringing in all 90 days and then aggregating. You could use something like bin (also known as bucket). You could also use something like rangemap to break the data into 30-day groups so they are easier to find. Another thing you could do would be an eval based on the event's age, something like eval period=floor((now()-_time)/86400/30)
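For illustration, a minimal sketch of the bin approach (untested — it reuses the field names from the original search, abbreviates the lookup/join chain, and assumes the base search is widened to 90 days):

index="ito_snow" sourcetype=csv source="/opt/splunk/etc/apps/splunk_snow_tickets/bin/open_inc.sh" earliest=-90d
| bin _time span=30d
| stats dc(Incident_Number) as "incident Count" by APP_DETAILS, _time

Here bin snaps each event's _time to the start of its 30-day span, so the stats by APP_DETAILS, _time produces one distinct count per app per period in a single pass.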


uagraw01
Builder

@Andrew I think rangemap will not work here.
