Getting Data In

Return yesterday's day number in: | eval filename=strftime(now(), "xyz_%d.csv")

knitz
Explorer

Hello,

I am running the search below daily (over the last 24h). It returns results and uses "outputlookup" to write them into a CSV whose name is based on the day number ("xyz_NO_of_day").

It runs fine if I run the search on the same day (i.e. close to midnight), but the source still receives input after midnight, so I miss data and have to run the search the next day, e.g. at 04:30 AM the following day.
Running the same search the next day returns a file name based on that day (the next day).

So I want to run the search on the next day, e.g. on day 09 at 04:30 AM, searching the day before (day 08). In that case, in | eval filename=strftime(now(), "Application-license-usage-per_day_%d.csv"), %d must resolve to the day before (08), not 09.

I tried the following, without results:

| outputlookup [ | stats count | eval filename=strftime(now(), "-1d", "Application-license-usage-per_day_%d.csv") | return $filename]

Do you have any idea how to fix it?

Below is the initial search:
index="application-license" sourcetype=application License_User_device=* License_feature_status="OUT" License_user=*
| eval License_feature_status=(License_feature_status)
| eval License_User_device=split(License_User_device,",")
| eval License_user=split(License_user,",")
| makemv delim="," License_user
| mvexpand License_user
| sort License_user
| dedup License_user
| stats list(License_user) as "User" list(License_User_device) as "Computer" count(License_feature_status) as "LicenseTaken" by _time

| outputlookup [ | stats count | eval filename=strftime(now(), "Application-license-usage-per_day_%d.csv") | return $filename]

Thanks in advance

1 Solution

manjunathmeti
Champion

hi @knitz,

Subtract 86400 (seconds for 1 day) from now().

| eval filename=strftime(now()-86400, "Application-license-usage-per_day_%d.csv")
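For completeness, the same fix can be dropped into the subsearch-based outputlookup from the question. This is a sketch using the file name pattern from the original post, with only the time argument changed:

| outputlookup [ | stats count | eval filename=strftime(now()-86400, "Application-license-usage-per_day_%d.csv") | return $filename ]

Alternatively, relative_time(now(), "-1d") can replace the manual subtraction and makes the intent clearer:

| eval filename=strftime(relative_time(now(), "-1d"), "Application-license-usage-per_day_%d.csv")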
