Splunk Search

Why am I getting "SyntaxError: Unexpected token <." trying to upload a large lookup file?

pepper_seattle
Path Finder

Attempting to upload a "large" lookup file (2 columns, 190k rows) presents the error "Your entry was not saved. The following error was reported: SyntaxError: Unexpected token <.". I have confirmed that the file does not contain the listed token, or any other stray token, and is well under the 500MB limit (2.7MB unzipped). I even tried gzipping it (1.2MB zipped), which did not resolve the issue.

Has anyone else run into this issue, and if so, were you able to resolve it?

0 Karma
1 Solution

pepper_seattle
Path Finder

We found the issue to be a size limit on the proxy host, which blocked us from uploading 'large' files. Unfortunately this was difficult to track down because of the error message given. Either way, thanks for weighing in on this issue!

m_efremov
Explorer

Great, it works. In our case there was an error message in the proxy (nginx) log describing an inability to write the request body to /etc/nginx/client_body_temp/. A chmod 777 on that directory solved the problem. Now the proxy can pass large requests to the upstream, and there are no more errors like the one in the subject.
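
For anyone hitting the same wall with an nginx proxy in front of Splunk Web: besides write permissions on the request-body temp directory, nginx's client_max_body_size directive (default 1m) is a common cause of rejected multi-MB uploads. A minimal, purely illustrative sketch; the server name, port, and 50m value are placeholders, not details from this thread:

    # Illustrative nginx reverse-proxy snippet
    server {
        listen 80;
        server_name splunk.example.com;   # placeholder

        location / {
            # The default client_max_body_size of 1m is small enough to reject
            # a 1.2-2.7 MB lookup upload; raise it so Splunk Web uploads get through.
            client_max_body_size 50m;
            proxy_pass http://127.0.0.1:8000;   # Splunk Web behind the proxy
        }
    }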

0 Karma

woodcock
Esteemed Legend

What was imposing this limit? Was it Splunk or a Firewall or what? If it was Splunk, what configuration did you change?

0 Karma

pepper_seattle
Path Finder

Unfortunately I cannot say, as I didn't make the change and don't have access to do so. If I do find out, however, I will update you.

0 Karma

woodcock
Esteemed Legend

If your file does contain a character that is unparseable in context, the easiest way to identify it is with a binary search. Cut the file in half and upload it. If it fails, cut it in half again, and keep going until it doesn't. At that point, add back in half of what you cut out last. Continue like this and you will quickly narrow it down to the one line that is causing the problem, and you can go from there.
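
A rough command-line sketch of that halving process (lookup.csv and the row counts here are placeholders for a 190k-row file with a header):

    # Keep the header, then take roughly the first half of the data rows.
    head -n 1 lookup.csv > first_half.csv
    tail -n +2 lookup.csv | head -n 95000 >> first_half.csv

    # Build the other half for the next round.
    head -n 1 lookup.csv > second_half.csv
    tail -n +95002 lookup.csv >> second_half.csv

    # Upload first_half.csv: if it fails, halve it again; if it succeeds,
    # repeat on second_half.csv until one offending line remains.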

0 Karma

woodcock
Esteemed Legend

I, too, have seen problems like this with large files (both lookup CSVs and app TGZs), and the easiest way around it is to skip the GUI: transfer the file yourself over the CLI (e.g. ftp) and make the associated configuration file changes manually (i.e. add the lookup stanza to transforms.conf). This always works.
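
A minimal sketch of that manual route, assuming the lookup is called large_lookup.csv and lives in an app named my_app (both names are placeholders):

    # 1. Copy the CSV onto the search head outside the GUI (e.g. ftp) into:
    #      $SPLUNK_HOME/etc/apps/my_app/lookups/large_lookup.csv
    #
    # 2. Register it in $SPLUNK_HOME/etc/apps/my_app/local/transforms.conf:
    [large_lookup]
    filename = large_lookup.csv

After a restart (or a configuration reload), "| inputlookup large_lookup" should return the rows, and the lookup behaves like one uploaded through the GUI.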

0 Karma

pepper_seattle
Path Finder

I checked the file for unparseable characters and could not find any. I should have mentioned that I've run into this error before with a "large" lookup, and the fix then was to gzip the file. Unfortunately that did not work this time around.

0 Karma