All Apps and Add-ons

Splunk DB Connect: How to resolve error "ERROR org.easybatch.core.job.BatchJob - Unable to open record reader java.io.IOException: Push back buffer is full"?

carlkennedy
Path Finder

I have recently installed Splunk DB Connect 3.0.2 and I am trying to work with a MySQL database. I am able to use SQL Explorer in the Data Lab tab and can view rows from my table. The identity is not set for Read Only. I am running a job where I output search results into my table. When the job runs, it throws this error:

2017-04-28 12:37:22.884 -0400  [QuartzScheduler_Worker-3] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
java.io.IOException: Push back buffer is full
    at java.io.PushbackInputStream.unread(PushbackInputStream.java:232)
    at java.io.PushbackInputStream.unread(PushbackInputStream.java:252)
    at com.splunk.InsertRootElementFilterInputStream.<init>(InsertRootElementFilterInputStream.java:85)
    at com.splunk.ResultsReaderXml.<init>(ResultsReaderXml.java:82)
    at com.splunk.ResultsReaderXml.<init>(ResultsReaderXml.java:59)
    at com.splunk.dbx.server.dboutput.recordreader.DbOutputRecordReader.export(DbOutputRecordReader.java:82)
    at com.splunk.dbx.server.dboutput.recordreader.DbOutputRecordReader.open(DbOutputRecordReader.java:75)
    at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
    at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
    at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
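For what it's worth, the message itself comes straight from the JDK: `java.io.PushbackInputStream` throws an `IOException` with exactly this text when `unread()` is asked to push back more bytes than its internal buffer can hold. A minimal standalone sketch (the class and method names here are just for illustration, not part of DB Connect) that reproduces the message:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.PushbackInputStream;

public class PushbackDemo {
    // Returns the message of the IOException raised when the push-back
    // buffer overflows.
    static String overflowMessage() throws IOException {
        // The single-argument constructor gives a 1-byte push-back buffer.
        PushbackInputStream in = new PushbackInputStream(
                new ByteArrayInputStream("abc".getBytes()));
        int first = in.read();  // consume 'a'
        in.unread(first);       // push 'a' back; the 1-byte buffer is now full
        try {
            in.unread(new byte[] { 'x', 'y' });  // no room left -> IOException
            return null;
        } catch (IOException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(overflowMessage()); // prints: Push back buffer is full
    }
}
```

In the stack trace above the `unread()` calls happen inside the Splunk SDK's `InsertRootElementFilterInputStream` constructor (which peeks at the start of the XML results stream), so the overflow occurs in the SDK's handling of the search output rather than anywhere in the job configuration itself.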

I do have Splunk DB Connect 3.0.2 running on a non-prod box with the exact same MySQL database, table, and credentials, and it can write to the table, which tells me this is not a permissions issue.


hexxamillion
Explorer

Were you able to resolve this? What did you find out?


mhoogcarspel_sp
Splunk Employee

In Splunk support I've seen a few "Push back buffer is full" cases where reducing the amount of data returned (by limiting the timeframe of the events in the search) resolved the issue.


hexxamillion
Explorer

Can you elaborate on this a bit more? I am getting the FAILED error as well when trying to output to a table. The same job worked fine for the past few days, so it seems to be intermittent. Does anyone know a cause for this? I have the latest version of DB Connect and am running Splunk 7.2.


jan_lukasz
Explorer

Hello,

Did you ever solve your issue?
