
Splunk DB Connect Doesn't index data

L1_marrera
Explorer

Hello world,

I'm running Splunk 6.4.0 build f2c836328108 and I'm trying to install Splunk DB Connect v3.1.3. When I configure an input, I get results from the query, but the data is never indexed into Splunk. These are the things I have done:

  • I went to "splunk_app_db_connect_server.log", checked for errors, and found this one:

    2019-04-17 07:54:10.403 -0400  [QuartzScheduler_Worker-31] ERROR c.s.d.s.task.listeners.RecordWriterMetricsListener - action=unable_to_write_batch
        java.io.IOException: HTTP Error 403: Forbidden
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:112)
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:89)
            at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
            at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
            at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
            at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
            at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
            at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
        2019-04-17 07:54:10.403 -0400  [QuartzScheduler_Worker-31] ERROR c.s.d.s.dbinput.recordwriter.CheckpointUpdater - action=skip_checkpoint_update_batch_writing_failed
        java.io.IOException: HTTP Error 403: Forbidden
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:112)
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:89)
            at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
            at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
            at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
            at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
            at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
            at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
        2019-04-17 07:54:10.403 -0400  [QuartzScheduler_Worker-31] ERROR org.easybatch.core.job.BatchJob - Unable to write records
        java.io.IOException: HTTP Error 403: Forbidden
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:112)
            at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:89)
            at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
            at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
            at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
            at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
            at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
            at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
        2019-04-17 07:54:10.403 -0400  [QuartzScheduler_Worker-31] INFO  org.easybatch.core.job.BatchJob - Job '[Job_Name]' finished with status: FAILED
    

Then I went to the DB Connect troubleshooting documentation (https://docs.splunk.com/Documentation/DBX/3.1.3/DeployDBX/Troubleshooting#Debug_HTTP_Event_Collector...) and followed the steps there. When I changed the port, I got another error:

2019-04-17 14:39:28.656 -0400  [QuartzScheduler_Worker-1] ERROR org.easybatch.core.job.BatchJob - Unable to write records
org.apache.http.conn.HttpHostConnectException: Connect to 127.0.0.1:8090 [/127.0.0.1] failed: Connection refused: connect
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:159)
    at org.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:359)
    at org.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:381)
    at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
    at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
    at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
    at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
    at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
    at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:109)
    at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:89)
    at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
    at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
    at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
    at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.net.ConnectException: Connection refused: connect
    at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
    at java.net.DualStackPlainSocketImpl.socketConnect(Unknown Source)
    at java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
    at java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
    at java.net.AbstractPlainSocketImpl.connect(Unknown Source)
    at java.net.PlainSocketImpl.connect(Unknown Source)
    at java.net.SocksSocketImpl.connect(Unknown Source)
    at java.net.Socket.connect(Unknown Source)
    at org.apache.http.conn.socket.PlainConnectionSocketFactory.connectSocket(PlainConnectionSocketFactory.java:75)
    at org.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
    ... 18 common frames omitted
2019-04-17 14:39:28.656 -0400  [QuartzScheduler_Worker-1] INFO  org.easybatch.core.job.BatchJob - Job '[Job_Name]' finished with status: FAILED

I'm running this on Windows Server 2012. I don't know what else I can do at this point.
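In case it helps narrow things down, this is roughly the kind of direct test against HEC that can be run outside of DB Connect (a minimal sketch; the port, token value, and SSL verification setting are placeholders for whatever your HEC configuration actually uses):

# Minimal sketch: send one test event straight to the HTTP Event Collector,
# bypassing DB Connect, to tell token problems (403) apart from port problems
# (connection refused). URL, token, and verify=False are placeholders.
import requests

hec_url = "https://127.0.0.1:8088/services/collector"   # default HEC port; yours may differ
hec_token = "00000000-0000-0000-0000-000000000000"       # hypothetical token value

resp = requests.post(
    hec_url,
    headers={"Authorization": "Splunk " + hec_token},
    json={"event": "dbx hec connectivity test", "sourcetype": "manual_test"},
    verify=False,  # only if HEC uses a self-signed certificate
)
print(resp.status_code, resp.text)
# 200 means HEC and the token are fine; 403 points at the token;
# a connection error points at the port or HEC not being enabled at all.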

0 Karma
1 Solution

L1_marrera
Explorer

I solved it!

The token was missing from the $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/inputs.conf file.
I added the following lines and restarted Splunk:

[http://[My_Input_Name]]
disabled = 0
token = [My_Token_Number]

The docs (https://docs.splunk.com/Documentation/DBX/3.1.4/DeployDBX/inputsspec) say it should be in the db_inputs.conf file, but it wasn't there either.
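To cross-check that the token value placed in inputs.conf matches one that HEC actually knows about, something like this against the management port works (a minimal sketch; the host, management port 8089, and admin credentials are placeholders for your own environment):

# Minimal sketch: list the HEC tokens Splunk has configured, via the REST API
# on the management port (8089 by default). Credentials are placeholders.
import requests

resp = requests.get(
    "https://127.0.0.1:8089/services/data/inputs/http",
    params={"output_mode": "json"},
    auth=("admin", "changeme"),   # hypothetical admin credentials
    verify=False,                 # self-signed management certificate
)
for entry in resp.json().get("entry", []):
    content = entry.get("content", {})
    print(entry.get("name"), "token:", content.get("token"), "disabled:", content.get("disabled"))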


0 Karma


shivakarnati
New Member

Hi Marrera,

Could you please explain how to get the token number?

0 Karma

L1_marrera
Explorer

Go to Settings > Data Inputs > HTTP Event Collector. There should be a token already created for DB Connect; if not, that is where you can create it.
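If you prefer doing it outside the UI, the same thing can be done through the management REST API. This is a rough sketch (the endpoint path, credentials, and input name are assumptions; the response should contain the generated token value):

# Minimal sketch: create a new HEC token via the REST API instead of the UI.
# Host, credentials, and the input name are placeholders.
import requests

resp = requests.post(
    "https://127.0.0.1:8089/services/data/inputs/http",
    data={"name": "dbconnect_hec", "output_mode": "json"},  # hypothetical input name
    auth=("admin", "changeme"),   # hypothetical admin credentials
    verify=False,
)
entry = resp.json()["entry"][0]
print("created:", entry["name"], "token:", entry["content"]["token"])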

0 Karma