Hello,
I have an alert that outputs a CSV file that looks like this:
Person | Number_of_login | Login_fail |
Person A | 1 | |
Person B | 6 | 2 |
Person C | 4 | 1 |
I run this search every day and append to the CSV file. After a few days it looks like this:
Person | Number_of_login | Login_fail |
Person A | 1 | |
Person B | 6 | 2 |
Person C | 4 | 1 |
Person A | 2 | 1 |
Person B | 7 | 1 |
Person C | 4 | |
..... |
Now at the end of the month, I want to calculate the total number of logins and the number of login failures for each person, and also the percentage of login failures for each one. As far as I know, addcoltotals doesn't have a "by" clause like stats does (e.g. stats count by Person). I want my final table to look like this:
Person | Number_of_login | Login_fail |
Person A | 3 | 1 |
Person B | 13 | 3 |
Person C | 8 | 1 |
So how do I go about this?
Hi @phamxuantung,
First, you should share the search you're using to generate the daily results so I can update it.
Anyway, you can run the same search over a month instead of a day and add an eval at the end to calculate the percentage.
So if, e.g., you're speaking of wineventlog, the condition is EventCode=4624 or EventCode=4625; in that case you could run something like this:
index=wineventlog EventCode IN ("4624","4625")
| stats count AS Number_of_login count(eval(EventCode="4625")) AS Login_fail BY user
| eval perc=Login_fail/Number_of_login*100
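If instead you want to total the appended CSV file itself, and assuming it is saved as a lookup (monthly_logins.csv is a hypothetical name), a sketch could be:
| inputlookup monthly_logins.csv
| stats sum(Number_of_login) AS Number_of_login sum(Login_fail) AS Login_fail BY Person
| eval perc=round(Login_fail/Number_of_login*100,2)
The empty Login_fail cells are skipped by sum(), so they effectively count as zero.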
Ciao.
Giuseppe
Hello @gcusello ,
My search query is pretty simple, like the one you posted. The data I provided is for example purposes only, but the output is similar to it. The problem I encounter is that our data retention policy is very short (around 2 weeks), and the number of events each day can reach a few million. Calculating on that data set (for reporting purposes):
1. Will not be correct (we don't have enough data for a full month)
2. Involves too large a dataset, which will tank our system, and ours is strained enough already
So my solution is to accumulate the results each day to lessen the stress on our system and also to save the data that would otherwise be deleted.
My exact search goes something like this:
index=main sourcetype=dbx2 RESPCODE=0
| stats count as Total by ACQUIRER
| join type=left ACQUIRER
    [search index=main sourcetype=dbx2 RESPCODE=0
    | eval length = len(POS_ENTRY_CODE)
    | search POS_ENTRY_CODE=8* AND length=2
    | stats count as fallback by ACQUIRER]
Pretty simple, and it runs over Yesterday.
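For reference, the daily append to the CSV is done with a final command like this (a sketch; monthly_fallback.csv is a hypothetical lookup name):
| outputlookup append=true monthly_fallback.csv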
Hi @phamxuantung,
the easiest way to proceed is to save the results of the daily execution in a summary index and use it for the monthly report.
You can do this by creating a summary index and running your search every day, adding this command at the end:
| collect index=your_summary_index
Then you can run the monthly search on the summary index, which will contain the fields from the generating scheduled search.
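As a sketch, the monthly report on the summary index could then look like this (your_summary_index is the placeholder above; Total, fallback, and ACQUIRER are the fields produced by your daily search):
index=your_summary_index earliest=-1mon@mon latest=@mon
| stats sum(Total) AS Total sum(fallback) AS fallback BY ACQUIRER
| eval perc=round(fallback/Total*100,2)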
Ciao.
Giuseppe