I have a CSV file with nearly 50,000 rows. When I try to fetch all the rows using the inputlookup command, I am not able to retrieve all 50,000 rows. Only around 42,000 rows are returned.
Also, when I use this CSV for a lookup, the lookup does not match any row that appears after the 5,000th row. However, if I take one of those rows and move it up within the first 5,000 rows, the lookup matches it successfully.
Can anyone explain this strange behavior? Please let me know what changes I should make in the conf files to enable a successful lookup.
I checked the max_memtable_bytes value in my setup, and my CSV file size is well below that limit.
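For reference, this is the kind of stanza I mean. A minimal sketch of the relevant limits.conf section, assuming the standard [lookup] stanza (the default value shown is an assumption and varies by Splunk version, so check your own effective config with btool):

```
# limits.conf -- sketch, not a recommendation
[lookup]
# Lookup files smaller than this are read fully into memory;
# larger files are indexed on disk instead.
max_memtable_bytes = 26214400
```

You can verify the effective value on your instance with `splunk btool limits list lookup`.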
In case this is related to the stats command, here is the text from the Splunk docs:
Memory and maximum results
In the limits.conf file, the maxresultrows setting in the [searchresults] stanza specifies the maximum number of results to return. The default value is 50,000. Increasing this limit can result in more memory usage.
The maxmemusage_mb setting in the [default] stanza is used to limit how much memory the stats command uses to keep track of information. If the stats command reaches this limit, the command stops adding the requested fields to the search results. You can increase the limit, contingent on the available system memory.
If you are using Splunk Cloud and want to change either of these limits, file a Support ticket.
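To make the two settings from the quoted docs concrete, here is a sketch of where they live in limits.conf (the values shown are the documented defaults as I understand them, not tuning advice; confirm against your version's limits.conf.spec before changing anything):

```
# limits.conf -- sketch of the settings the docs describe
[searchresults]
# Maximum number of results a search returns (default 50000).
maxresultrows = 50000

[default]
# Cap on memory the stats command may use to track fields;
# past this limit, requested fields stop being added.
maxmemusage_mb = 200
```

On Splunk Cloud you cannot edit these directly; as the docs say, file a Support ticket.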
Read the post carefully. They have 50,000 rows but were getting only 42,000.
As per your comment, wouldn't they then be getting all 50,000 results?
Moreover, the question is about a CSV. If the CSV has any unbalanced quotes, the lookup works up to that point and fails for everything after it.
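One quick way to check the unbalanced-quotes theory is a simple quote-parity scan over the file. This is a rough heuristic sketch, not how Splunk itself parses CSVs: it toggles state on every double quote, so properly escaped quotes ("") stay balanced, and it reports the first line that ends inside an open quoted field.

```python
def find_unbalanced_quote_line(text):
    """Return the 1-based line number where quote parity first
    breaks (a line ends inside an open quoted field), or None."""
    open_quote = False
    for lineno, line in enumerate(text.splitlines(), start=1):
        for ch in line:
            if ch == '"':
                # Each quote either opens or closes a quoted field;
                # escaped quotes ("") toggle twice and cancel out.
                open_quote = not open_quote
        if open_quote:
            return lineno
    return None

good = 'a,b\n1,"ok"\n2,"fine"\n'
bad = 'a,b\n1,"ok"\n2,"broken\n3,never_matched\n'
print(find_unbalanced_quote_line(good))  # -> None
print(find_unbalanced_quote_line(bad))   # -> 3
```

If this reports a line number near row 5,000 in the lookup file, a stray quote there would explain why everything after that row fails to match.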