Currently, about 80 to 90 percent of the errors logged in a specific index I'm monitoring come from the same top 10 to 15 errors. These errors are logged daily and are pretty much noise. I have created a search that excludes each of these errors with the != operator and then uses the top limit=10 command to show me the next top 10. As you can imagine, this isn't very efficient. Is there another way to run a search that shows the top 10 errors after excluding the current top 10 or 15, without having to manually define each error with != ?
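For illustration, the manual exclusion approach described above might look roughly like this; the index, field name, and error strings here are hypothetical placeholders, not the real ones:

index=my_errors error_message!="noisy error 1" error_message!="noisy error 2" error_message!="noisy error 3"
| top limit=10 error_message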
Try this ....
your search
| stats count as errorcount by myerror
| sort 25 - errorcount
| tail 15
| reverse
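If the goal is specifically the next 10 after skipping the top 15, a minimal adjustment of the same pipeline (myerror is still a placeholder field name) is to keep the top 25 and then tail 10, which leaves ranks 16 through 25:

your search
| stats count as errorcount by myerror
| sort 25 - errorcount
| tail 10
| reverse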
Why not try working with rare instead of top?
The rare command is designed to return the least common values of a field.
I believe (and testing seems to confirm) that you can combine the two.
... | top 15 myerror | rare 5 myerror
Seems to do the right thing in that it first takes the top 15 myerror values, then displays the "least top" 5 of those (i.e. the rarest of the top results). It does mess up the counts, since rare recounts the rows that top has already aggregated (each value appears only once at that point), but the list of values appears correct.
This didn't work for me. I tried
| top 100 my-field-name | rare 99 my-field-name
on a field with only five distinct values, but got all of them back instead of having the first one (100 - 99 = 1) removed.
I used a high number (100) because five is not the maximum number of values; there may be as many as 150.
regards
Altin
I'm sorry, I should have said to use head and tail. You'll need to sort, but here's a run-anywhere example:
index=_internal | stats count by component | sort - count | head 50 | tail 5
That should show you the 46th through 50th items: head 50 keeps the first 50 rows, and tail 5 keeps the last 5 of those.
Does that work better?
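For what it's worth, adapted to the original error-monitoring question the same head/tail pattern might look like this (index and field name are hypothetical placeholders); head 25 followed by tail 10 keeps the errors ranked 16 through 25:

index=my_errors
| stats count by error_message
| sort - count
| head 25
| tail 10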
Yes, it worked this way.
thanks and regards
Altin