May I know if there is any size limit on the CSV file when performing a lookup?
I'm doing a lookup against a CSV with around 300k records and encounter the error below.
Encountered an error while reading file 'D:\Splunk\var\run\splunk\dispatchtmp\subsearch_admin__admin__search
Thanks
Upgrading Splunk to 4.3.4 should fix the issue.
Is that the exact error? It looks similar to a known Splunk bug in 4.3.3 that occurs when you have a subsearch in your search string. CSVs can contain far more than 300k records, so it may be the aforementioned bug you are hitting instead.
Update:
Subsearches can fail with the error "Encountered an error while reading file '/opt/splunk/var/run/splunk/dispatchtmp/subsearch_/prereport_.csv.gz'." The workaround is to format the fields with the fields command instead of table at the end of the subsearch. (SPL-52862)
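For example, a subsearch that ends with table (here the lookup file blacklist.csv and the field src_ip are just made-up names for illustration):

    index=web [ | inputlookup blacklist.csv | table src_ip ]

can be rewritten to end with fields instead, which avoids the bug:

    index=web [ | inputlookup blacklist.csv | fields src_ip ]

Both forms return the same field from the subsearch; only the final formatting command differs.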
Yeah, this sounds like the bug. Is there a subsearch included? It may also pop up elsewhere. Bear in mind that the .csv.gz at the end of that path is unrelated to the fact that you may be using a CSV in your search. I've updated my answer with the bug details; I believe a fix is due in the next maintenance release.
Hi Drainy,
The full error message is:
Encountered an error while reading file 'D:\Splunk\var\run\splunk\dispatchtmp\subsearch_admin_admin_search_TWFsaWNpb3VzIElQIHNlYXJjaCBieSBkc3Q_1344931019.717_1344931019.1\collapse-132809093_0.csv.gz'.
I was able to perform the same search with a smaller CSV file, though.
Thanks.