How to delete data in an outputlookup file that is more than 28 days old?

Amit79
Loves-to-Learn Everything

Hello All,

I have some dashboards that use reports for calculations, and those reports rely on lookup files. The problem is that once the CSV file reaches the configured size limit, the graphs stop showing on the dashboard, and I have to create a new lookup file every time and update the dashboards. I don't want to keep doing that. Is there any way to avoid it? I want the outputlookup file to keep only the last 28 days of data and delete the rest. I am trying the Splunk search below, but I am not sure I am doing it correctly. I have also tried the max option, but that just stops the query from appending records to the CSV file once the set value is reached.

index="idx_rwmsna" sourcetype=st_rwmsna_printactivity source="E:\\Busapps\\rwms\\mna1\\geodev12\\Edition\\logs\\DEFAULT_activity_1.log"
| transaction host, JobID, Report, Site startswith="Print request execution start."
| eval duration2=strftime(duration, "%Mm %Ss %3Nms") | fields *
| rex field=_raw "The request was (?<PrintState>\w*) printed."
| rex field=_raw "The print request ended with an (?<PrintState>\w*)"
| rex field=_raw ".*Dest : (?<Dest>\w+).*"
| search PrintState=successfully Dest=Printer
| table _time, host, Client, Site, JobID, Report, duration, duration2
| stats count as valid_events count(eval(duration<180)) as good_events avg(duration) as averageDuration
| eval sli=round((good_events/valid_events) * 100, 2)
| eval slo=99, timestamp=now()
| eval burnrate=(100-sli)/(100-slo), date=strftime(timestamp,"%Y-%m-%d"), desc="WMS Global print perf"
| eval time=now()
| sort 0 - time
| fields date, desc, sli, slo, burnrate, timestamp, averageDuration
| outputlookup lkp_wms_print_slislo1.csv append=true override_if_empty=true
| where time > relative_time(now(), "-2d@d") OR isnull(time)


bowesmana
SplunkTrust

There is something wrong with the later part of the SPL:

...
| eval time=now()
| sort 0 - time
| fields date, desc, sli, slo, burnrate, timestamp, averageDuration
| outputlookup lkp_wms_print_slislo1.csv append=true override_if_empty=true
| where time > relative_time(now(), "-2d@d") OR isnull(time)

You are setting the new field 'time' to the current time of the search (now()), so every event gets the same value in that field, and sorting on it produces no meaningful order.

Then you discard that new 'time' field with the fields statement, and afterwards you use the time field - which no longer exists - to check whether it is more than 2 days old. It never would be anyway, since you had just set it to now() before throwing it away.

So the where clause 'isnull(time)' will ALWAYS be true.

You do have date and timestamp in your data, both of which are also derived from 'now()'.

If you want to do the 28-day trimming here, you could do something like this:

| fields date, desc, sli, slo, burnrate, timestamp, averageDuration
| inputlookup lkp_wms_print_slislo1.csv append=true 
| where timestamp > relative_time(now(), "-28d@d")
| outputlookup lkp_wms_print_slislo1.csv 

So instead of appending to the existing lookup, you load the entire lookup, filter out the timestamps older than 28 days, and then write the entire dataset back.

Note that this does not handle duplicates if you run the search more than once, so you'd have to handle that if you need to.
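
For example, a sketch that extends the snippet above, assuming the combination of date and desc uniquely identifies a row (adjust the dedup field list to whatever is actually unique in your data). Because the fresh results arrive before the rows appended by inputlookup, dedup keeps the newest copy of each:

| fields date, desc, sli, slo, burnrate, timestamp, averageDuration
| inputlookup lkp_wms_print_slislo1.csv append=true
| dedup date desc
| where timestamp > relative_time(now(), "-28d@d")
| outputlookup lkp_wms_print_slislo1.csv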


Amit79
Loves-to-Learn Everything

Thanks for responding. Do you have any examples using a summary index?


ITWhisperer
SplunkTrust

Try removing the append=true from the outputlookup command - however, if you do this, you will have to modify the search so that it produces all the information you want to keep, i.e. the last 28 days' worth.
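
A minimal sketch of that approach, assuming you run the search over the last 28 days and break the stats out by day so each day keeps its own row (the earliest modifier, the bin command, and the by _time clause are additions to your original search):

index="idx_rwmsna" sourcetype=st_rwmsna_printactivity earliest=-28d@d
... (transaction / rex / search pipeline as in the original search) ...
| bin _time span=1d
| stats count as valid_events count(eval(duration<180)) as good_events avg(duration) as averageDuration by _time
| eval sli=round((good_events/valid_events) * 100, 2), slo=99
| eval burnrate=(100-sli)/(100-slo), date=strftime(_time,"%Y-%m-%d"), desc="WMS Global print perf"
| fields date, desc, sli, slo, burnrate, averageDuration
| outputlookup lkp_wms_print_slislo1.csv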

Another possibility is to use a summary index and set the retention period for the index to 28 days.
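
For example, a sketch of the summary-index route (the index name summary_wms is a placeholder - you would have to create it first): schedule the calculation and write each run's row to the summary index with collect instead of outputlookup:

... (same calculation as in the original search) ...
| fields date, desc, sli, slo, burnrate, timestamp, averageDuration
| collect index=summary_wms

The dashboard panels then search the summary index instead of the lookup:

index=summary_wms desc="WMS Global print perf" earliest=-28d@d
| table date, desc, sli, slo, burnrate, averageDuration

Retention is then set on the index in indexes.conf, e.g.:

[summary_wms]
frozenTimePeriodInSecs = 2419200

where 2419200 seconds = 28 days, so Splunk ages out the old rows for you.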

Amit79
Loves-to-Learn Everything

Thank you for your input. I found a workaround; here is the code.

Luckily, I have a timestamp field in my lookup file, so I am making use of that.

If you have any ideas to make it better, please let me know.

| inputlookup lkp_sds_wms_trw_slislo.csv
| eval start_date = strftime(relative_time(now(),"-60d@d"), "%Y-%m-%d")
| eval Endtimestamp = strptime(start_date, "%Y-%m-%d")
| where timestamp > Endtimestamp
| outputlookup lkp_sds_wms_trw_slislo.csv
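
I think, going by bowesmana's example above, the two eval lines could probably be collapsed into a single where, assuming timestamp is stored as epoch seconds:

| inputlookup lkp_sds_wms_trw_slislo.csv
| where timestamp > relative_time(now(), "-60d@d")
| outputlookup lkp_sds_wms_trw_slislo.csv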
 
Thanks Again
 
Regards
Amit