All Apps and Add-ons

Why are DB Connect 3 inputs unable to write records, giving an HTTP 400: Bad Request error?

bkoehler4070
Explorer

I have DB Connect 3.1.3 running on a Splunk 7.0.1 instance with three DB inputs. Two of them work perfectly, but the third emits the error below and then fails. Two of the connections are nearly identical except that they point to different databases, yet one works and one doesn't. I have tried restarting and reconfiguring the input.

2018-04-12 18:48:43.627 +0000 [QuartzScheduler_Worker-29] ERROR c.s.d.s.task.listeners.RecordWriterMetricsListener - action=unable_to_write_batch
java.io.IOException: HTTP Error 400: Bad Request
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:112)
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:89)
at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)

1 Solution

bkoehler4070
Explorer

Found the problem: HEC fails to write when a single event exceeds roughly 800,000 characters. One log message was ~800x the size of a normal message, and that was what caused the failure. When that message is skipped (by changing the rising column), everything goes back to working as intended.
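A likely explanation (an assumption, not confirmed in this thread) is that the oversized event exceeded the HEC payload size limit on the receiving Splunk instance, which is controlled by `max_content_length` under the `[http_input]` stanza in limits.conf. If skipping the row is not acceptable, raising that limit is an alternative workaround; the exact value below is illustrative, not a recommendation:

```ini
# limits.conf on the Splunk instance that receives the HEC traffic
# (e.g. $SPLUNK_HOME/etc/system/local/limits.conf)
[http_input]
# Maximum HEC request payload size in bytes. Older releases defaulted
# to a value around 1 MB, which a ~800,000-character event plus JSON
# envelope could plausibly exceed. Raise with care and restart Splunk.
max_content_length = 5000000
```

Note that very large single events can still cause indexing and search-time issues, so truncating or filtering the offending source message may be the better long-term fix.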

