<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: input monitor scanning too many files and causing Splunk indexing troubles in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122089#M32858</link>
    <description>&lt;P&gt;I had a similar issue: essentially there were hundreds of thousands of files along the way (within the directory structures) to the actual files I was interested in indexing, which were at the very bottom of the structure. I used a UF, but it just couldn't handle it; there were too many files to scan.&lt;/P&gt;

&lt;P&gt;I finally gave up and just changed the directory structure so that the files I was interested in were saved to another location.&lt;/P&gt;</description>
    <pubDate>Fri, 27 Jun 2014 12:45:24 GMT</pubDate>
    <dc:creator>mic1024</dc:creator>
    <dc:date>2014-06-27T12:45:24Z</dc:date>
    <item>
      <title>input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122086#M32855</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;

&lt;P&gt;I have to monitor specific files on an NFS share that itself contains thousands of files. This causes trouble for Splunk, which seems to scan every file in the share and stops indexing other inputs.&lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;The files that I want to monitor can be accessed by:&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;/mnt/MYMOUNT/logs/*/*/exploit/prod/RQ_TB_*.res
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;where the first wildcard represents the day (YYYY_MM_DD) and the second represents hostnames (alphanumeric).&lt;/P&gt;

&lt;P&gt;The files to monitor are named like RQ_TB_XXXXX.res (where XXXXX is the server hostname).&lt;/P&gt;

&lt;P&gt;If I set the monitor like this, Splunk starts scanning the share and reports thousands and thousands of files, which causes Splunk to stop indexing other monitors... (some file descriptor limit, I guess?)&lt;/P&gt;

&lt;P&gt;&lt;STRONG&gt;I've tried setting a whitelist regex like:&lt;/STRONG&gt;&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;whitelist = RQ_TB_[a-zA-Z0-9]+\.res$
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;OR:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;whitelist = \.(res)$
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;But Splunk still reports thousands of files in the manager, where I should have around 700 files.&lt;/P&gt;

&lt;P&gt;How can I prevent Splunk from scanning all the files, when only a few should match?&lt;/P&gt;

&lt;P&gt;This also seems to generate useless system load...&lt;/P&gt;

&lt;P&gt;Thank you very much for any help!&lt;/P&gt;</description>
      <pubDate>Mon, 28 Sep 2020 16:56:41 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122086#M32855</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2020-09-28T16:56:41Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122087#M32856</link>
      <description>&lt;P&gt;Hi guilmxm,&lt;/P&gt;

&lt;P&gt;No chance of using a universal forwarder here?&lt;/P&gt;

&lt;P&gt;When you specify wildcards in a file input path, Splunk creates an implicit whitelist for that stanza. The longest fully qualified path becomes the monitor stanza, and the wildcards are translated into regular expressions. &lt;BR /&gt;
This means your whitelist is being clobbered by your use of &lt;CODE&gt;*&lt;/CODE&gt; expressions in the monitor stanza.&lt;BR /&gt;
Try using &lt;CODE&gt;...&lt;/CODE&gt; instead and see the &lt;A href="http://docs.splunk.com/Documentation/Splunk/6.1.1/Data/Specifyinputpathswithwildcards"&gt;docs&lt;/A&gt; for more detailed information.&lt;/P&gt;
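
&lt;P&gt;For example, following this advice with the path from your question, the stanza plus an explicit whitelist could look like this (just a sketch, not a tested config):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[monitor:///mnt/MYMOUNT/logs/.../exploit/prod]
whitelist = RQ_TB_[a-zA-Z0-9]+\.res$
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;With the filename wildcard gone from the monitor path, Splunk should no longer build an implicit whitelist that overrides yours.&lt;/P&gt;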

&lt;P&gt;cheers, MuS&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 09:09:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122087#M32856</guid>
      <dc:creator>MuS</dc:creator>
      <dc:date>2014-06-27T09:09:40Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122088#M32857</link>
      <description>&lt;P&gt;Hi MuS,&lt;/P&gt;

&lt;P&gt;Thank you for answering. Changing from * to ... did not change the number of files being scanned by Splunk (or the time it takes to find all the required files).&lt;/P&gt;

&lt;P&gt;This is a dev environment running a few standalone Splunk instances. I can't use a UF because I'm also indexing data retrieved from various DBs using DB Connect, but I can concentrate the inputs on a Heavy Forwarder.&lt;/P&gt;

&lt;P&gt;I thought part of the answer might have to do with the whitelist / blacklist...&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 10:00:34 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122088#M32857</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2014-06-27T10:00:34Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122089#M32858</link>
      <description>&lt;P&gt;I had a similar issue: essentially there were hundreds of thousands of files along the way (within the directory structures) to the actual files I was interested in indexing, which were at the very bottom of the structure. I used a UF, but it just couldn't handle it; there were too many files to scan.&lt;/P&gt;

&lt;P&gt;I finally gave up and just changed the directory structure so that the files I was interested in were saved to another location.&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 12:45:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122089#M32858</guid>
      <dc:creator>mic1024</dc:creator>
      <dc:date>2014-06-27T12:45:24Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122090#M32859</link>
      <description>&lt;P&gt;Thanks for sharing your experience.&lt;BR /&gt;
In my case, I have to adapt my own configuration to the customer; changing the file structure won't be possible...&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 15:45:31 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122090#M32859</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2014-06-27T15:45:31Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122091#M32860</link>
      <description>&lt;P&gt;I've installed and configured a new instance running as a Heavy Forwarder, where I will concentrate all inputs.&lt;/P&gt;

&lt;P&gt;Of course I see the same behavior, with many files being scanned. As a workaround of sorts, I've changed some scripted inputs that were generating files (which Splunk was monitoring) to stream to stdout instead.&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 15:47:40 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122091#M32860</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2014-06-27T15:47:40Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122092#M32861</link>
      <description>&lt;P&gt;First - do &lt;EM&gt;not&lt;/EM&gt; ask a Splunk indexer to monitor this many files, even if you can't install a Universal Forwarder on the remote system. The indexer is already busy doing the actual indexing of data and responding to searches.&lt;/P&gt;

&lt;P&gt;Second - if you &lt;EM&gt;can&lt;/EM&gt;, use a separate system to collect and forward the data. This separate system could even be a virtual machine. Its only job will be to scan the remote filesystem and forward the correct files to the indexer(s). This system will need very good network access and a fair amount of CPU and memory - but it won't need great disk I/O. On this machine, install and configure the Universal Forwarder.&lt;/P&gt;

&lt;P&gt;Third - If you can't use a separate system to collect and forward the data, you could still run a separate Universal Forwarder (on the same machine as the indexer) and use it to collect the data. I am not sure how much this will help, but it should improve things somewhat, especially if you follow the "Fourth" suggestion below.&lt;/P&gt;

&lt;P&gt;Fourth - Run a regular script to remove older files. Otherwise, Splunk will continue to monitor files that will never be written to again. (After all, how does Splunk know they won't be updated?) This is a complete waste of resources. If you cannot do this, you can instead add this setting to your &lt;CODE&gt;inputs.conf&lt;/CODE&gt;:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;ignoreOlderThan = 14d
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Note that once a file becomes "ignored", it will never be examined again, even if it is subsequently updated! So be sure to pick a reasonable age threshold. This setting alone might solve your problem.&lt;/P&gt;
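
&lt;P&gt;The cleanup script in the first option could be as simple as a scheduled &lt;CODE&gt;find&lt;/CODE&gt; job. This is only a sketch - the path comes from your question, and the 14-day retention is an assumption you should adjust to your environment:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# crontab entry: once a day at 03:00, delete matching files older than 14 days
0 3 * * * find /mnt/MYMOUNT/logs -name 'RQ_TB_*.res' -mtime +14 -delete
&lt;/CODE&gt;&lt;/PRE&gt;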

&lt;P&gt;Fifth - If needed, you can run multiple Universal Forwarders, and have each forwarder monitor a section of the directory structure. So the first Universal Forwarder could have inputs for&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[monitor:///mnt/MYMOUNT/logs/*/A*/exploit/prod/RQ_TB_*.res]   
[monitor:///mnt/MYMOUNT/logs/*/B*/exploit/prod/RQ_TB_*.res]
etc.
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Anything that you do to keep Splunk from traversing unnecessary files and directories will help.&lt;/P&gt;</description>
      <pubDate>Fri, 27 Jun 2014 23:42:13 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122092#M32861</guid>
      <dc:creator>lguinn2</dc:creator>
      <dc:date>2014-06-27T23:42:13Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122093#M32862</link>
      <description>&lt;P&gt;Hi, thanks for your answer and suggestions.&lt;/P&gt;

&lt;P&gt;As previously mentioned, we're in a dev environment for now, but in production we will dedicate Splunk instances to this kind of job.&lt;/P&gt;

&lt;P&gt;Because we need to retrieve data from various DBs, we plan to build an instance as a heavy forwarder.&lt;/P&gt;

&lt;P&gt;I think your suggestion to ignore files older than a threshold may help; I will test this and report back. (After a first full run, once the files have been indexed, it is useless to keep monitoring every file anyway.)&lt;/P&gt;

&lt;P&gt;Note that files in this NFS share are purged periodically, but it concentrates many logs from numerous systems.&lt;/P&gt;</description>
      <pubDate>Mon, 30 Jun 2014 19:17:27 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122093#M32862</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2014-06-30T19:17:27Z</dc:date>
    </item>
    <item>
      <title>Re: input monitor scanning too many files and causing Splunk indexing troubles</title>
      <link>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122094#M32863</link>
      <description>&lt;P&gt;For those who may be interested in such a case: I could not find a proper pure-Splunk answer.&lt;/P&gt;

&lt;P&gt;I ended up creating an rsync mirror workflow that syncs the files I wanted to monitor from my NFS share to a local directory, and then created the required Splunk inputs.&lt;/P&gt;
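
&lt;P&gt;For anyone who wants to try the same approach, the mirror step can be sketched roughly like this (the local mirror path here is made up, and the filter rules follow the naming from my original question):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# mirror only the RQ_TB_*.res files, keeping the directory layout
rsync -a --prune-empty-dirs \
      --include='*/' --include='RQ_TB_*.res' --exclude='*' \
      /mnt/MYMOUNT/logs/ /opt/splunk_mirror/logs/
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;Splunk then monitors the local mirror directory, which contains only the ~700 files of interest.&lt;/P&gt;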

&lt;P&gt;Works perfectly.&lt;/P&gt;</description>
      <pubDate>Wed, 06 Aug 2014 12:55:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/input-monitor-scanning-too-much-files-and-causes-Splunk-indexing/m-p/122094#M32863</guid>
      <dc:creator>guilmxm</dc:creator>
      <dc:date>2014-08-06T12:55:24Z</dc:date>
    </item>
  </channel>
</rss>

