I guess I'm not alone with this issue.
Has anyone encountered high CPU usage when the UF is monitoring over 10k files?
Each file is very small, but they all need to be collected.
As far as I know, the UF keeps the full list of monitored files in memory, and traversing that list seems to consume a lot of CPU time.
This stays the same even if we specify ignoreOlderThan.
And I can't reorganize the customer's files.
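For context, our monitor stanza looks roughly like this (the path, index name, and ignoreOlderThan value are illustrative, not our real config):

```ini
# inputs.conf on the UF -- illustrative values only
[monitor:///data/customer/xml]
index = customer_xml
ignoreOlderThan = 7d
```

My understanding is that ignoreOlderThan only skips files whose modtime is too old; the monitor still has to stat every file in the directory on each scan, which would explain why it didn't reduce CPU here.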
So now I'm considering a scheduled script that adds the files one by one, e.g. using "add oneshot".
But keeping track of which files have already been captured is a pain.
I'd appreciate any smarter suggestions.
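For the record, my scheduled-script idea would look roughly like the sketch below. Everything environment-specific here is an assumption: the source directory, ledger file, index name, and the splunk binary path would all need adjusting. The point is the ledger bookkeeping, i.e. only "add oneshot" files that aren't recorded yet, and only record them after a successful send:

```python
#!/usr/bin/env python3
"""Scheduled one-shot loader sketch: send each new XML file to Splunk once.

Hypothetical paths/names -- adjust for your environment.
"""
import subprocess
from pathlib import Path

SOURCE_DIR = Path("/data/customer/xml")      # assumed drop directory
LEDGER = Path("/opt/uf/oneshot_ledger.txt")  # records files already sent
SPLUNK = "/opt/splunkforwarder/bin/splunk"   # assumed UF install path

def new_files(candidates, seen):
    """Return candidate paths not yet recorded in the ledger, sorted."""
    return sorted(p for p in candidates if str(p) not in seen)

def run_once():
    # Load the set of already-captured files (empty on first run)
    seen = set(LEDGER.read_text().splitlines()) if LEDGER.exists() else set()
    pending = new_files(SOURCE_DIR.glob("*.xml"), seen)
    with LEDGER.open("a") as ledger:
        for path in pending:
            # "add oneshot" indexes the file once; no ongoing monitor entry
            subprocess.run([SPLUNK, "add", "oneshot", str(path),
                            "-index", "customer_xml"], check=True)
            # Record only after a successful send, so failures get retried
            ledger.write(f"{path}\n")

if __name__ == "__main__":
    run_once()
```

Run from cron every few minutes. It's crude (no handling of files deleted upstream, and the ledger grows forever), which is exactly the bookkeeping pain I mentioned.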
@philip_w - keep in mind that when the forwarder comes up, it has to build this file list, which is costly. Once the initial scan is over, the forwarder should be stable and consume less CPU. So I suggest that in your testing, you allow time to reach the stabilized period...
Do you want to "monitor" over 10k files, or just "upload" more than 10k files regularly/often?
These links can give you some more info..
I meant monitoring 10k. In fact, we only need to index each file once: they're all XMLs and they won't be updated.
As I said, I can't rotate or reorganize the customer's files; they're there for other business reasons.
From this post, it seems setting ulimit -n to unlimited may not be the best choice. We currently use unlimited; let me check whether a smaller number works.
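For my own testing, I was planning to check and lower the limit per shell like this (4096 is just a placeholder value, not a recommendation; the soft limit can never exceed the hard one):

```shell
# Show the current soft and hard open-file limits for this shell
soft="$(ulimit -S -n)"
hard="$(ulimit -H -n)"
echo "soft=$soft hard=$hard"

# Lower the soft limit for this shell only (placeholder value)
ulimit -S -n 4096 2>/dev/null || echo "4096 exceeds hard limit $hard"
ulimit -S -n   # confirm what the soft limit is now
```

For the UF itself the limit would of course have to be set in its start-up environment (e.g. limits.conf or the service unit), not in an interactive shell.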