Developing for Splunk Enterprise

Splunk Java - result JSON is huge (~1.9 MB) and is not being sent to the JavaScript client

Path Finder

Hi,
We are using the Splunk Java SDK, which returns results in JSON format. We also need the _raw details, so our Splunk query includes the _raw data.
The problem we are facing is that the resulting output is huge and breaks beyond a certain point. We found that JSON data has a limit of 1,000,000 characters in length.
Are there any suggestions on how we can circumvent this?
We are using Struts2 and jQuery.

Tags (3)

Ultra Champion

Do you have an example of the actual error/exception stack trace your code is throwing?

0 Karma

Splunk Employee

This isn't at the SDK level, but in the underlying JSON libraries, which we don't control. This isn't rare. So here's how to do this in general, so you don't have to deal with giant chunks of data being sent over the pipe. It will also let you resume more straightforwardly if the connection gets dropped for some reason.

When you call getResults on the Job, you can pass two arguments: count and offset. offset is the number of records at the beginning to skip, and count is the number to return after skipping. So you call getResults with offset 0 and count 100, parse those, then call again with offset 100 and count 100, then offset 200 and count 100, and so on.

Here's some code, starting from your example where you've defined jss and waited for it to complete:

int nEventsPerRequest = 100;
JobResultsArgs oparg = new JobResultsArgs(); // This has convenience methods for setting result options
oparg.setOutputMode(JobResultsArgs.OutputMode.JSON);
oparg.setCount(nEventsPerRequest);
for (int offset = 0; offset < jss.getResultCount(); offset += nEventsPerRequest) {
    oparg.setOffset(offset);
    InputStream res = jss.getResults(oparg);
    ResultsReaderJson resultsReader = new ResultsReaderJson(res);
    // ...process the results in this batch...
    resultsReader.close();
}

I haven't tested that, so it might have fencepost errors, but that will take care of having too much data in the stream.
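To check the offset arithmetic independently of Splunk, the (offset, count) pairs the loop would request can be factored into a plain helper. This is a minimal, self-contained sketch; the class and method names (Paging, batches) are made up for illustration, and the totals are arbitrary:

```java
import java.util.ArrayList;
import java.util.List;

public class Paging {
    // Compute the (offset, count) pairs for fetching `total` results
    // in batches of at most `batchSize`, mirroring the loop above.
    // The count is clamped on the last batch so we never ask past the end.
    static List<int[]> batches(int total, int batchSize) {
        List<int[]> out = new ArrayList<>();
        for (int offset = 0; offset < total; offset += batchSize) {
            int count = Math.min(batchSize, total - offset);
            out.add(new int[] {offset, count});
        }
        return out;
    }

    public static void main(String[] args) {
        // 250 results in batches of 100 -> (0,100), (100,100), (200,50)
        for (int[] b : batches(250, 100)) {
            System.out.println(b[0] + "," + b[1]);
        }
    }
}
```

In the SDK loop itself the clamping isn't strictly necessary, since the server just returns fewer results on the final request, but it makes the intent explicit.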

Path Finder

The Splunk query also returns the raw (_raw) details, which are sent in JSON form. The length of this raw data is not constant; this is where I have the challenge.

0 Karma

Explorer

That code looks about right to me. Where did you see the 1,000,000 character limit that you mention?

Path Finder

I feel I'm not doing streaming right. I'm trimming the code because of the character restriction. Is streaming done correctly here on the Splunk side, or should I take care of it from the UI?

ServiceArgs loginArgs = new ServiceArgs();
// populate loginArgs
Service service = Service.connect(loginArgs);
// Retrieve the saved search
SavedSearch savedSearch = service.getSavedSearches().get("mysearch");
Job jss = savedSearch.dispatch();
while (!jss.isDone()) {
    Thread.sleep(500);
}
Args oparg = new Args();
oparg.put("output_mode", "json");
InputStream res = jss.getResults(oparg);
ResultsReaderJson resultsReader = new ResultsReaderJson(res);

0 Karma

Explorer

The Java result readers (including JSON) stream back results, so they aren't usually constrained by data size limits.

"JSON data has a limit of 1,000,000 characters in length."

Did you find this limit with a single row, a single _raw value, the entire JSON stream, or something else?