
Data input using DB Connect 3.1.4

hketer
Path Finder

Hello,

I'm using DB Connect 3.1.4.

Here are the errors we get:

2020-10-21 11:55:00.165 +0100 [QuartzScheduler_Worker-9] ERROR org.easybatch.core.job.BatchJob - Unable to write records
java.io.IOException: HTTP Error 403, HEC response body: {"text":"Invalid token","code":4}, trace: HttpResponseProxy{HTTP/1.1 403 Forbidden [Date: Wed, 21 Oct 2020 10:55:00 GMT, Content-Type: application/json; charset=UTF-8, X-Content-Type-Options: nosniff, Content-Length: 33, Vary: Authorization, Connection: Keep-Alive, X-Frame-Options: SAMEORIGIN, Server: Splunkd] ResponseEntityProxy{[Content-Type: application/json; charset=UTF-8,Content-Length: 33,Chunked: false]}}
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:132)
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:96)
at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2020-10-21 11:55:00.165 +0100 [QuartzScheduler_Worker-9] INFO org.easybatch.core.job.BatchJob - Job 'XXXXXXXXXXXX' finished with status: FAILED

 

Execute SQL works, and the connection to the DB is good.

The problem is that I don't see any new data coming in.

I already checked the timestamp,

and tried adding the following to db_inputs.conf:

token = [My_Token_Number]
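For reference, a direct test of the token against HEC (a sketch with placeholders, assuming HEC listens on the default port 8088) would show whether HEC accepts the token at all:

curl -k https://<hec-host>:8088/services/collector/event -H "Authorization: Splunk [My_Token_Number]" -d '{"event": "hec token test"}'

A valid token returns {"text":"Success","code":0}; an invalid one returns the same 403 / {"text":"Invalid token","code":4} as above.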

 

Please assist.

Thanks,

Hen


kennetkline
Path Finder

Looking at the stack trace:

This appears to be the biggest clue; it's the response from HEC that is preventing the write, not Java/DB Connect/the SQL query.

HEC response body: {"text":"Invalid token","code":4}

Digging back, I found this article:

https://community.splunk.com/t5/Getting-Data-In/Why-am-I-getting-the-following-Http-Event-Collector-...

Maybe multiple HTTP inputs exist with a duplicate (same) name?
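
One way to check for duplicate or stale HEC token stanzas on the host running DB Connect (a sketch, assuming a default $SPLUNK_HOME and shell access) is btool:

$SPLUNK_HOME/bin/splunk btool inputs list http --debug

The --debug flag shows which .conf file each [http://<name>] stanza and its token setting come from, which makes duplicates easier to spot.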

More about the HEC collector:
https://docs.splunk.com/Documentation/Splunk/8.0.6/Data/AboutHECIDXAck

I was also able to find the HEC codes:

HEC reply codes:

reply  HttpInputReply              status               event_message
0      Success                     OK                   Success
1      TokenDisabled               FORBIDDEN            Token disabled
2      NoAuthorization             UNAUTHORIZED         Token is required
3      InvalidAuthorization        UNAUTHORIZED         Invalid authorization
4      TokenNotFound               FORBIDDEN            Invalid token
5      NoData                      BAD_REQUEST          No data
6      InvalidData                 BAD_REQUEST          Invalid data format
7      IncorrectIndex              BAD_REQUEST          Incorrect index
8      ServerError                 (removed; not used anywhere)
9      ServerBusy                  SERVICE_UNAVAILABLE  Server is busy
10     NoChannel                   BAD_REQUEST          Data channel is missing
11     InvalidChannel              BAD_REQUEST          Invalid data channel
12     NoEvent                     BAD_REQUEST          Event field is required
13     BlankEvent                  BAD_REQUEST          Event field cannot be blank
14     AckDisabled                 BAD_REQUEST          ACK is disabled
15     UnsupportedType             BAD_REQUEST          Error in handling indexed fields
16     QueryStringAuthNotEnabled   BAD_REQUEST          Query string authorization is not enabled
17     HECHealthy                  OK                   HEC is healthy
18     QueuesFull                  SERVICE_UNAVAILABLE  HEC is unhealthy, queues are full
19     AckUnavailable              SERVICE_UNAVAILABLE  HEC is unhealthy, ack service unavailable
20     QueuesFullAckUnavailable    SERVICE_UNAVAILABLE  HEC is unhealthy, queues are full, ack service unavailable
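
Code 17 in that table can also be checked directly; assuming HEC is listening on its default port 8088, the health endpoint is a quick sanity check:

curl -k https://<hec-host>:8088/services/collector/health

A healthy instance replies with {"text":"HEC is healthy","code":17}.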


hketer
Path Finder

I have 2 inputs with 2 different names.

 

I also tried upgrading DB Connect to 3.4.0, and I still get the same errors.
