<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Does Splunk have the ability to batch ingest large .csv files? in Getting Data In</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487362#M83448</link>
    <description>&lt;P&gt;up to 250MB&lt;/P&gt;</description>
    <pubDate>Tue, 14 Jan 2020 21:23:20 GMT</pubDate>
    <dc:creator>nick405060</dc:creator>
    <dc:date>2020-01-14T21:23:20Z</dc:date>
    <item>
      <title>Does Splunk have the ability to batch ingest large .csv files?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487360#M83446</link>
      <description>&lt;P&gt;Is Splunk capable of batch ingesting large .csv files? It does not seem like it.&lt;/P&gt;

&lt;P&gt;For example, the following &lt;CODE&gt;monitor&lt;/CODE&gt; stanza works:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[monitor:///opt/splunk/var/run/splunk/csv/tenable_reports/*/*.csv]
disabled=false
index=security
sourcetype=csv
ignoreOlderThan=30d
&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;but when you change &lt;CODE&gt;monitor&lt;/CODE&gt; to &lt;CODE&gt;batch&lt;/CODE&gt;, add &lt;CODE&gt;move_policy=sinkhole&lt;/CODE&gt;, and delete &lt;CODE&gt;ignoreOlderThan&lt;/CODE&gt;, it breaks: no ingestion, no purging. Adding &lt;CODE&gt;initCrcLen=1000000000&lt;/CODE&gt; does nothing. If I quickly &lt;CODE&gt;vim&lt;/CODE&gt; a much smaller test .csv, it works. Other users have had similar issues:&lt;/P&gt;
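
&lt;P&gt;Concretely, the failing stanza looks like this (reconstructed from the changes described above, applied to the working &lt;CODE&gt;monitor&lt;/CODE&gt; example):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;[batch:///opt/splunk/var/run/splunk/csv/tenable_reports/*/*.csv]
move_policy = sinkhole
disabled = false
index = security
sourcetype = csv
&lt;/CODE&gt;&lt;/PRE&gt;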

&lt;P&gt;&lt;A href="https://answers.splunk.com/answers/660982/why-is-the-batch-input-not-indexing-certain-files.html"&gt;https://answers.splunk.com/answers/660982/why-is-the-batch-input-not-indexing-certain-files.html&lt;/A&gt; &lt;/P&gt;

&lt;P&gt;I am batch inputting from my search head; the index resides on the indexer. There is no problem doing this with &lt;CODE&gt;monitor&lt;/CODE&gt; or with smaller csvs.&lt;/P&gt;

&lt;P&gt;It's a shame Splunk is not a robust enough SIEM to be able to handle batch ingestion of a CSV file &lt;span class="lia-unicode-emoji" title=":confused_face:"&gt;😕&lt;/span&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2020 20:27:57 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487360#M83446</guid>
      <dc:creator>nick405060</dc:creator>
      <dc:date>2020-01-14T20:27:57Z</dc:date>
    </item>
    <item>
      <title>Re: Does Splunk have the ability to batch ingest large .csv files?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487361#M83447</link>
      <description>&lt;P&gt;How large are the files?&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2020 21:20:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487361#M83447</guid>
      <dc:creator>richgalloway</dc:creator>
      <dc:date>2020-01-14T21:20:04Z</dc:date>
    </item>
    <item>
      <title>Re: Does Splunk have the ability to batch ingest large .csv files?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487362#M83448</link>
      <description>&lt;P&gt;up to 250MB&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2020 21:23:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487362#M83448</guid>
      <dc:creator>nick405060</dc:creator>
      <dc:date>2020-01-14T21:23:20Z</dc:date>
    </item>
    <item>
      <title>Re: Does Splunk have the ability to batch ingest large .csv files?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487363#M83449</link>
      <description>&lt;P&gt;This is only a partial answer. 15 hours and 100+ reboots later. Sigh.&lt;/P&gt;

&lt;P&gt;I had to stop Splunk, delete the app, the fishbucket, and the batch directory all at once, restart, and then move the files back in. If I did not follow this sequence exactly, it did not work. Even then, it indexed and deleted only about 500MB of .csv files at a time before it stopped indexing or deleting ANYTHING, so I had to repeat the process with my remaining ~800MB of .csv files. And again, it stopped and broke halfway through. The third pass through this process finally finished the batch ingestion.&lt;/P&gt;
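
&lt;P&gt;For anyone else hitting this, the sequence was roughly the following (a sketch of what I did, not an official procedure; the batch directory path is from my config above, the fishbucket location is the standard one, and &lt;CODE&gt;YOUR_APP&lt;/CODE&gt; is a placeholder for whatever app defines your input):&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;$SPLUNK_HOME/bin/splunk stop
rm -rf $SPLUNK_HOME/etc/apps/YOUR_APP                     # the app containing the batch input
rm -rf $SPLUNK_HOME/var/lib/splunk/fishbucket             # reset Splunk's file-tracking state
rm -rf /opt/splunk/var/run/splunk/csv/tenable_reports/*   # clear the batch directory
$SPLUNK_HOME/bin/splunk start
# ...then move the .csv files back into the batch directory
&lt;/CODE&gt;&lt;/PRE&gt;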

&lt;P&gt;Again, this doesn't help much, because next time I go to deposit a large number of .csvs, I'm screwed unless I want to go through this process every time.&lt;/P&gt;</description>
      <pubDate>Tue, 14 Jan 2020 23:06:03 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/Does-Splunk-have-the-ability-to-batch-ingest-large-csv-files/m-p/487363#M83449</guid>
      <dc:creator>nick405060</dc:creator>
      <dc:date>2020-01-14T23:06:03Z</dc:date>
    </item>
  </channel>
</rss>

