
Data input using DB Connect 3.1.4

hketer
Path Finder

Hello,

I'm using DB Connect 3.1.4.

Here are the errors we get:

2020-10-21 11:55:00.165 +0100 [QuartzScheduler_Worker-9] ERROR org.easybatch.core.job.BatchJob - Unable to write records
java.io.IOException: HTTP Error 403, HEC response body: {"text":"Invalid token","code":4}, trace: HttpResponseProxy{HTTP/1.1 403 Forbidden [Date: Wed, 21 Oct 2020 10:55:00 GMT, Content-Type: application/json; charset=UTF-8, X-Content-Type-Options: nosniff, Content-Length: 33, Vary: Authorization, Connection: Keep-Alive, X-Frame-Options: SAMEORIGIN, Server: Splunkd] ResponseEntityProxy{[Content-Type: application/json; charset=UTF-8,Content-Length: 33,Chunked: false]}}
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:132)
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:96)
at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2020-10-21 11:55:00.165 +0100 [QuartzScheduler_Worker-9] INFO org.easybatch.core.job.BatchJob - Job 'XXXXXXXXXXXX' finished with status: FAILED

 

Execute SQL works, so the connection to the database is good.

The problem is that I don't see any new data coming in.

I've already checked the timestamp,

and tried adding the following to db_inputs.conf:

token = [My_Token_Number]
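
For reference, the token can also be tested independently of DB Connect by posting an event straight to HEC with curl (a sketch, assuming HEC listens on the default port 8088; <your-token> stands in for the real value):

# Send a minimal test event to HEC; -k skips TLS verification for a self-signed cert
curl -k https://localhost:8088/services/collector/event \
  -H "Authorization: Splunk <your-token>" \
  -d '{"event": "hec token test"}'

A working token returns {"text":"Success","code":0}; getting {"text":"Invalid token","code":4} here, like in the stack trace above, would confirm HEC does not recognize the token value it was sent.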

 

Please assist.

Thanks,

Hen


kennetkline
Path Finder

Looking at the stack trace:

This appears to be the biggest clue: it is the response from HEC that is preventing the data write, not Java, DB Connect, or the SQL query.

HEC response body: {"text":"Invalid token","code":4},

Digging back, I found this article:

https://community.splunk.com/t5/Getting-Data-In/Why-am-I-getting-the-following-Http-Event-Collector-...

Maybe multiple HTTP inputs exist with the same (duplicate) name?
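
One way to check is to list the configured HEC tokens over the management REST API (a sketch, assuming admin credentials and the default management port 8089):

# List all HEC token stanzas on this instance
curl -k -u admin:<password> \
  "https://localhost:8089/services/data/inputs/http?output_mode=json"

Each entry in the output shows the input's name and its token GUID, so duplicate names or reused token values stand out.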

More about HEC and indexer acknowledgment:
https://docs.splunk.com/Documentation/Splunk/8.0.6/Data/AboutHECIDXAck

I was also able to find the HEC reply codes:

Reply  HttpInputReply             Status               Event message
0      Success                    OK                   Success
1      TokenDisabled              FORBIDDEN            Token disabled
2      NoAuthorization            UNAUTHORIZED         Token is required
3      InvalidAuthorization       UNAUTHORIZED         Invalid authorization
4      TokenNotFound              FORBIDDEN            Invalid token
5      NoData                     BAD_REQUEST          No data
6      InvalidData                BAD_REQUEST          Invalid data format
7      IncorrectIndex             BAD_REQUEST          Incorrect index
8      ServerError                (removed, not used anywhere)
9      ServerBusy                 SERVICE_UNAVAILABLE  Server is busy
10     NoChannel                  BAD_REQUEST          Data channel is missing
11     InvalidChannel             BAD_REQUEST          Invalid data channel
12     NoEvent                    BAD_REQUEST          Event field is required
13     BlankEvent                 BAD_REQUEST          Event field cannot be blank
14     AckDisabled                BAD_REQUEST          ACK is disabled
15     UnsupportedType            BAD_REQUEST          Error in handling indexed fields
16     QueryStringAuthNotEnabled  BAD_REQUEST          Query string authorization is not enabled
17     HECHealthy                 OK                   HEC is healthy
18     QueuesFull                 SERVICE_UNAVAILABLE  HEC is unhealthy, queues are full
19     AckUnavailable             SERVICE_UNAVAILABLE  HEC is unhealthy, ack service unavailable
20     QueuesFullAckUnavailable   SERVICE_UNAVAILABLE  HEC is unhealthy, queues are full, ack service unavailable
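
Code 4 (TokenNotFound) means the token DB Connect sends does not match any enabled HEC token on the receiving instance. For reference, HEC tokens live in inputs.conf stanzas like the sketch below (the stanza name, token GUID, and index are placeholders, and the exact app directory can vary; splunk_httpinput is where UI-created tokens typically land):

# $SPLUNK_HOME/etc/apps/splunk_httpinput/local/inputs.conf
[http://dbx_hec]
disabled = 0
token = 01234567-89ab-cdef-0123-456789abcdef
index = main

The token value configured in DB Connect's HEC settings has to match the token in one of these enabled stanzas, or HEC answers with code 4.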


hketer
Path Finder

I have 2 inputs with 2 different names.

 

I also tried upgrading DB Connect to 3.4.0, and I still get the same errors.
