Hi all,
I need to create a report with a row counter, with a variable number of rows.
My report must run twice a month, and the counter in each report must start from the last value of the previous report.
Does anyone have a hint on how to do this?
Bye.
Giuseppe
How many rows are we talking about? Maintaining the last row number in a lookup might work (which involves updating the row-number lookup after the search has run), but it may cause problems if the report has to be run more than twice. Another option would be to store the report result in a summary index along with the row number. The second run of the report would then check the last row number for that month and start from the next one.
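For the lookup variant, something roughly like this might work (the lookup name `last_row_counter.csv` and field `last_row` are made up, and this is untested; note the subsearch returns nothing if the lookup is empty, so seed it with a 0 row first):

```spl
your_report_search
| streamstats count as row_in_report
| eval row_number = row_in_report + [ | inputlookup last_row_counter.csv | return $last_row ]
| fields - row_in_report
```

You'd then need a second step that writes the new highest row_number back to last_row_counter.csv with outputlookup after the report runs, which is the "updating the lookup" part mentioned above.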
Very vague pseudo-SPL:
mysearch earliest=@y latest=now | streamstats count as MyCounter | where date_month=strftime(now(), "%m") | ... other stuff
So that you can interpret my (metaphorically speaking) chicken-scratching: the idea is to run the search over the whole year and use streamstats to create a new MyCounter field that is just a running counter, like what you want. Then the where clause keeps ONLY the events that occur in the current month, stripping out all the stuff you don't want.
You'll need to verify a few bits of syntax - I don't have a working Splunk install available to me right now, so I sort of guessed some things. Specifically, I think date_month is a number, and I think the strftime will do what you need. Either way, run each piece, build from the last, and make sure you understand what it's doing, and I think you'll get your answer.
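For the record, date_month actually holds the lowercase month name (e.g. "january"), not a number, so comparing it to strftime(now(), "%m") won't match. Comparing strftime values of _time is one way to make the filter work; something like this (still untested):

```spl
mysearch earliest=@y latest=now
| streamstats count as MyCounter
| where strftime(_time, "%m") = strftime(now(), "%m")
| table _time MyCounter
```

That keeps the year-to-date numbering but only shows the current month's rows.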
Hi somesoni2,
we're talking about 25-30 rows each time, but the problem is that this number is variable, depending on the results (if a row would be empty, it isn't there).
It's a scheduled report, so I don't have performance problems.
Could I run the report with fixed dates (from the beginning of the year to now), inserting the row numbers into a field, and then filter the results using the time picker range for my report?
Thank You.
Bye.
Giuseppe
I didn't get this part: "after filter results using the time picker range for my report". Are you saying your row number never resets to 0 (like every month)? In any case, yes, you can run it for a fixed period and store the result in a summary index, OR you can do a summary index backfill for the prior period. After that, the starting row number should be picked up from the summary index. Kind of like this:
index=foo sourcetype=bar | some aggregation command | eval serial_no=[search index=summary_index_foo source="YourSummaryIndexSearch" earliest=-30d | stats count latest(serial_no) as serial_no | eval search=coalesce(serial_no+1,1) | table search ] | accum serial_number
Spell check on serial_number/serial_no - the accum at the end uses serial_number, but the eval creates serial_no.
I'm not following the use of accum at the end of that. The subsearch has gone to the summary_index_foo and returned the latest serial_no, +1.
Is accum acting like "streamstats count" on top of the original value returned by the search?
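Not quite - accum keeps a running total of the field you give it, so if the eval puts the same starting value on every row, accum would add that value repeatedly rather than count up from it. One way around that (untested, using the same made-up names as the snippet above) is to use streamstats for the per-row counter and add the offset from the subsearch:

```spl
index=foo sourcetype=bar
| some aggregation command
| streamstats count as row_in_report
| eval serial_no = row_in_report - 1 + [ search index=summary_index_foo source="YourSummaryIndexSearch" earliest=-30d | stats latest(serial_no) as serial_no | eval search=coalesce(serial_no+1, 1) | table search ]
```

Here the subsearch returns the starting number (last stored serial_no plus one, or 1 if the summary index is empty), and streamstats supplies the 1..N position within the current report.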