Here are my requirements.
I need to search the last 7 days of logs to get the unique users for each day, and then search that same day's logs again for each of those users to get their login status.
Based on the login status (fail, success), I need to generate a timechart.
The issue I'm facing is that the main search uses the time picker to search across all 7 days of logs, and the subsearch also searches the login status across all 7 days; instead, the subsearch should only check the same day's logs. Please help me with this.
If your number of unique users is fairly low, you may be able to use the map command to dynamically search over the current day for each record. Assuming field names of _time, user, and login_status, something like the following should work in place of your subsearch:
| eval earliest=relative_time(_time,"@d"), latest=relative_time(earliest,"+24h")
| map search="search index=* user=$user$ earliest=$earliest$ latest=$latest$ | eval day=strftime(_time,\"%Y-%m-%d\")| fields day user login_status" maxsearches=50
Of course, you'll probably need to add an index name and other arguments to narrow the map search down to the event type you're looking for, and make sure maxsearches is set high enough to iterate over every unique user. You should then be able to use the day field to bin/timechart the data as needed.
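For context, a sketch of the full pipeline might look like this (a rough sketch, assuming a hypothetical index name auth_logs and the field names above; the dedup produces one row per user per day to seed the map):

index=auth_logs earliest=-7d@d latest=now
| bin _time span=1d
| dedup _time user
| eval earliest=relative_time(_time,"@d"), latest=relative_time(earliest,"+24h")
| map search="search index=auth_logs user=$user$ earliest=$earliest$ latest=$latest$ | eval day=strftime(_time,\"%Y-%m-%d\") | fields day user login_status" maxsearches=50
| stats count by day, login_status

Keep in mind that map runs one search per input row, so this can get expensive as the user count grows.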
@sankar_kasala, you can pass earliest and latest arguments to the main search and subsearch directly in the SPL rather than from the time picker. Refer to this Splunk blog: https://www.splunk.com/blog/2012/02/19/compare-two-time-ranges-in-one-report.html
Following is a run-anywhere search:
index="_internal" sourcetype="splunkd" log_level="ERROR" earliest=-0d@d latest=@s
| fields log_level
| eval ReportKey="today"
| append
[ search index="_internal" sourcetype="splunkd" log_level="ERROR" earliest=-7d@d latest=-6d@s
| fields log_level
| eval ReportKey="same day last week"
| eval _time=_time+60*60*24*7]
| timechart span=1h count by ReportKey
Also check out the Splunk timewrap command, which lets you feed in the last 7 days of data and break the series up by day. Following is a run-anywhere search based on Splunk's _internal index:
index="_internal" sourcetype="splunkd" log_level="ERROR" earliest=-7d@d latest=@s
| timechart span=1h count
| timewrap 1d
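To adapt this to the login use case, you could split by login status before wrapping (a sketch, again assuming a hypothetical index auth_logs and a login_status field; note that timewrap after a split-by timechart multiplies the number of series, one per status per day):

index=auth_logs earliest=-7d@d latest=@s
| timechart span=1d count by login_status
| timewrap 1d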
Thanks a lot for your help.
Thanks a lot for your help framing the query.