Splunk Search

Fetching data for a limited time range into a CSV for lookup

nirmalya2006
Path Finder

Hi All

I am trying to schedule a job that runs every day to pull the last 30 days of data into a CSV file for lookup.
I wrote the query below for the job.
But the query always fetches data from the entire index, including events older than 30 days, which makes the CSV extremely large.
Can someone help me identify what is wrong with the query?
ResponseTime and page are extracted fields.

| addinfo
| eval info_max_time="now" 
| eval info_min_time="-30d@d"
| where _time >= info_min_time AND _time < info_max_time 
| stats values(ResponseTime) as "resp_time" by page,_time  
| table resp_time, page, _time
| outputlookup average_resp_time.csv

Thanks


lakshman239
Influencer

I am just trying to understand: if you need to pull data from a particular index, you need to include that index in the search, right?

If you are only interested in the last 30 days of events, you can start with

index=main earliest=-30d@d

and then restrict the search to what you need for your report. Does this help?
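Putting this together with the original query, the scheduled search could look roughly like the sketch below. This assumes the data lives in an index named main (substitute your actual index). The addinfo/eval/where pipeline is dropped: earliest and latest in the base search restrict the time range before events are retrieved, whereas the original where clause compared the numeric _time field against literal strings like "-30d@d", so it never filtered anything.

index=main earliest=-30d@d latest=now
| stats values(ResponseTime) as resp_time by page, _time
| table resp_time, page, _time
| outputlookup average_resp_time.csv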



nirmalya2006
Path Finder

Hmmm, I didn't think along those lines.
This helped.
Thank you very much.
