
DB Connect MySQL can't parse timestamp

rapmancz
Explorer

Hello,

I am using DB Connect 3.1.1 to get data from MySQL. I have 15 working inputs that use a rising id column and a timestamp (DATETIME) column to index the data. The server and table structure are the same for all inputs.
But for the next 4 inputs, I get this message in db_connect_server.log:

2018-01-23 00:39:47.459 +0100  [QuartzScheduler_Worker-28] ERROR org.easybatch.core.job.BatchJob - Unable to open record reader
java.time.format.DateTimeParseException: Text '2017-12-04 10:58:57.0' could not be parsed, unparsed text found at index 19
    at java.time.format.DateTimeFormatter.parseResolved0(Unknown Source)
    at java.time.format.DateTimeFormatter.parse(Unknown Source)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.RowWithTimeProcessor.extractTimestampFromString(RowWithTimeProcessor.java:88)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.RowWithTimeProcessor.apply(RowWithTimeProcessor.java:70)
    at com.splunk.dbx.server.dbinput.recordreader.columnprocessor.RowWithTimeProcessor.apply(RowWithTimeProcessor.java:28)
    at com.google.common.collect.Iterators$7.transform(Iterators.java:750)
    at com.google.common.collect.TransformedIterator.next(TransformedIterator.java:47)
    at com.google.common.collect.Iterators$6.computeNext(Iterators.java:616)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140)
    at com.splunk.dbx.server.dbinput.recordreader.iterator.EventPayloadRecordIterator.hasNext(EventPayloadRecordIterator.java:76)
    at com.splunk.dbx.server.dbinput.recordreader.DbInputRecordReader.open(DbInputRecordReader.java:92)
    at org.easybatch.core.job.BatchJob.openReader(BatchJob.java:117)
    at org.easybatch.core.job.BatchJob.call(BatchJob.java:74)
    at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
    at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
    at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2018-01-23 00:39:47.459 +0100  [QuartzScheduler_Worker-28] INFO  org.easybatch.core.job.BatchJob - Job 'psa-wfk-lc_04307600_xx' finished with status: FAILED

The timestamp column is also DATETIME, and everything looks fine in the GUI wizard when I select the column for the timestamp. Do you have any idea what could be wrong?
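
For reference, "unparsed text found at index 19" is the message java.time produces when a formatter with no fractional-seconds field is handed a value that ends in ".0" (index 19 is the "."). A minimal standalone sketch of that behavior, assuming the parser uses a fixed "yyyy-MM-dd HH:mm:ss"-style pattern; this is not DB Connect's actual code:

    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;
    import java.time.format.DateTimeParseException;

    public class TimestampParseDemo {
        public static void main(String[] args) {
            String value = "2017-12-04 10:58:57.0";

            // A pattern with no fractional-seconds field stops at the "." (index 19)
            DateTimeFormatter noFraction = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
            try {
                LocalDateTime.parse(value, noFraction);
            } catch (DateTimeParseException e) {
                System.out.println(e.getMessage()); // ... unparsed text found at index 19
            }

            // An optional fractional-seconds section accepts both forms
            DateTimeFormatter withFraction = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss[.S]");
            System.out.println(LocalDateTime.parse(value, withFraction));               // 2017-12-04T10:58:57
            System.out.println(LocalDateTime.parse("2017-12-04 10:58:57", withFraction));
        }
    }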


EmEdwards
Path Finder

Did you resolve this? I have the exact same problem at the moment. The same query worked in version 2.4; we upgraded to 3.1.1, and now the same query with the same data input fails with the same time-format error you had.


Richfez
SplunkTrust

"Index 19" is, if I count correctly, the "." separating seconds from the "0" for tenths of a second. That brings up a couple of possibilities. First, double check against a known-good one (one that works) exactly what type of data the data and time is coming from. Then check exactly what the query is you are running. I'll bet either the column types are different, or there's a cast/convert (or whatever MySQL uses for that) in the ones that work. For that matter, what's the timestamp time specifier you put into the GUI? (Do you still have to do that?)


p_gurav
Champion

Could you try converting the timestamp field in the SQL query?
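
For example, something along these lines (placeholder table and column names; DATE_FORMAT returns a plain "yyyy-MM-dd HH:mm:ss" string, so the trailing ".0" never reaches the timestamp parser). This assumes a rising-column input where DB Connect substitutes the checkpoint value for the ?:

    -- Placeholder names: my_table, event_time, id. DATE_FORMAT drops the fractional
    -- part, so the value arrives as "2017-12-04 10:58:57" instead of "...:57.0".
    SELECT id,
           DATE_FORMAT(event_time, '%Y-%m-%d %H:%i:%s') AS event_time
    FROM my_table
    WHERE id > ?
    ORDER BY id ASC;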
