
Troubleshoot Splunk DB Connect error

ips_mandar
Builder

Hi,
I am using Splunk DB Connect version 3.1.4 and I am getting the error below for sourcetype=dbx_server:

[QuartzScheduler_Worker-32] ERROR c.s.d.s.d.r.columnprocessor.RowProcessor - action=fail_to_transform_column row_id=N/A
java.sql.SQLException: Value '0000-00-00 00:00:00' can not be represented as java.sql.Timestamp
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:897)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:886)
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:860)
    at com.mysql.jdbc.ResultSetImpl.getTimestampFromString(ResultSetImpl.java:5673)
    at com.mysql.jdbc.ResultSetImpl.getStringInternal(ResultSetImpl.java:5306)
    at com.mysql.jdbc.ResultSetImpl.getString(ResultSetImpl.java:5135)
    at com.zaxxer.hikari.pool.HikariProxyResultSet.getString(HikariProxyResultSet.java)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.TimezoneAwareProcessor.transform(TimezoneAwareProcessor.java:30)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.RowProcessor.transformColumn(RowProcessor.java:67)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.RowProcessor.apply(RowProcessor.java:56)
    at com.splunk.dbx.server.dbinput.task.processors.EventPayloadProcessor.processRecord(EventPayloadProcessor.java:59)
    at org.easybatch.core.processor.CompositeRecordProcessor.processRecord(CompositeRecordProcessor.java:38)
    at org.easybatch.core.job.BatchJob.processRecord(BatchJob.java:179)
    at org.easybatch.core.job.BatchJob.readAndProcessBatch(BatchJob.java:152)
    at org.easybatch.core.job.BatchJob.call(BatchJob.java:78)
    at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)

It seems to be a timestamp issue, but I am not sure which input it is coming from. Any idea how I can start troubleshooting this?


treywebb
Explorer

I actually ran into this last night and found a fix that I thought I would share. The way I understand it, the database has a DATETIME column containing the zero value '0000-00-00 00:00:00', and the MySQL JDBC driver cannot represent that as a java.sql.Timestamp. You can add a parameter to the JDBC URL to tell the driver to convert those values to NULL, like so:

jdbcUrlFormat = jdbc:mysql://<host>:<port>/<database>?zeroDateTimeBehavior=convertToNull

This goes in your local/db_connection_types.conf file (in my case, under the mysql stanza).
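For reference, here is a minimal sketch of what that might look like, assuming your connection uses the stock mysql connection type; the only change from the default is the zeroDateTimeBehavior parameter appended to jdbcUrlFormat, so keep whatever other settings the default/db_connection_types.conf stanza already defines for that connection type:

[mysql]
jdbcUrlFormat = jdbc:mysql://<host>:<port>/<database>?zeroDateTimeBehavior=convertToNull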

I found this information here: https://stackoverflow.com/questions/11133759/0000-00-00-000000-can-not-be-represented-as-java-sql-ti...


harsmarvania57
Ultra Champion

Search for QuartzScheduler_Worker-32 in $SPLUNK_HOME/var/log/splunk/splunk_app_db_connect_server.log. As far as I know, the Quartz scheduler worker number stays the same across all steps of a single DB input execution at a particular time. (Note: the same worker number will later be assigned to other jobs; it starts at 1 and increments by one for each new job up to 32, then resets to 1 again. This is only my observation.)
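For example, a search along these lines (just a sketch; the sourcetype is the one already mentioned in this thread and the worker thread name is taken from your error) should pull up the other lines logged by that same worker around the time of the error, which should identify which DB input was running:

index=_internal sourcetype=dbx_server "QuartzScheduler_Worker-32"
| reverse

The reverse at the end just lists the events oldest first so you can read the job's steps in order.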


skalliger
Motivator

Look through the _internal logs and see this doc: Troubleshooting DB Connect. See if that helps you find the root cause.

Skalli


ips_mandar
Builder

Thanks @skalliger. The error above is already coming from the internal logs (index=_internal sourcetype=dbx_server). The provided link won't help, because I can't tell which DB input the error is happening for.
