Getting Data In

Can I use epoch as timestamp in DBConnect?

wmuselle
Path Finder

Hi,

I've been struggling with this for a while.

I have an epoch time value (a 10-digit NUMBER) that I want to use as both the rising column and the timestamp for the event.

The first works fine, but I can't get it to use this value for the event timestamp.

I know about the workaround of selecting timestamp '1970-01-01 00:00:00' + ("<my_field>" / 86400) as eventTime in the query,

but I'd prefer not to ingest an extra field if it isn't really necessary.
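For reference, a minimal sketch of that workaround as an input query (assuming an Oracle source, which the oracle.jdbc driver in the stack trace below suggests; JOURNEY_EVENTS and EVENT_EPOCH are placeholder names, not my real ones):

SELECT
    t.*,
    -- Oracle date arithmetic works in days, hence seconds / 86400
    TIMESTAMP '1970-01-01 00:00:00' + (t.EVENT_EPOCH / 86400) AS eventTime
    -- alternative: DATE '1970-01-01' + NUMTODSINTERVAL(t.EVENT_EPOCH, 'SECOND')
FROM JOURNEY_EVENTS t

DB Connect can then be pointed at eventTime as the timestamp column, but that means indexing an extra field.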

 

There is no config I can get to work for the timestamp itself.

From the Java DateTimeFormatter documentation it appears not to be possible, but I just wanted to ask here whether anyone has found something.
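For what it's worth, the reason seems to be that the DateTimeFormatter pattern language has no letter for epoch seconds; with java.time they can only be parsed programmatically, via DateTimeFormatterBuilder and ChronoField.INSTANT_SECONDS, and as far as I can tell the DB Connect UI only takes a pattern string. A minimal sketch of my own (not DB Connect code):

import java.time.Instant;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeFormatterBuilder;
import java.time.temporal.ChronoField;

public class EpochParseTest {
    public static void main(String[] args) {
        // No pattern letter maps to epoch seconds, so any pattern string
        // fails on input like "1687249036" (see the error below).
        // DateTimeFormatter.ofPattern("s") is second-of-minute (0-59), not epoch.

        // Epoch seconds are only reachable through the builder API:
        DateTimeFormatter epochFmt = new DateTimeFormatterBuilder()
                .appendValue(ChronoField.INSTANT_SECONDS)
                .toFormatter();
        long seconds = epochFmt.parse("1687249036")
                .getLong(ChronoField.INSTANT_SECONDS);
        System.out.println(Instant.ofEpochSecond(seconds)); // 2023-06-20T08:17:16Z
    }
}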

 

Source: https://docs.splunk.com/Documentation/DBX/3.13.0/DeployDBX/Troubleshooting

 

I have also tried using props.conf:

[mysourcetype]
SHOULD_LINEMERGE=true
NO_BINARY_CHECK=true
TIME_FORMAT=%s
TIME_PREFIX=my_field=

but it doesn't even reach this part: judging by the stack trace, the input errors out earlier, inside DB Connect's own Java timestamp parsing, before the event ever reaches Splunk's index-time pipeline where TIME_FORMAT would apply.

error pattern:

2023-06-20 10:27:07.521 +0200 Trace-Id=940a0c4d-8c64-44d7-a948-39fd6e9b7417 [Scheduled-Job-Executor-5] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=1, source="SHE170U-JOURNEY_TRACKER", creationDate="Tue Jun 20 10:27:07 CEST 2023"], payload=[HikariProxyResultSet@2091779680 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@4f527ef2]}
java.time.format.DateTimeParseException: Text '1687249036' could not be parsed at index 0
    at java.base/java.time.format.DateTimeFormatter.parseResolved0(DateTimeFormatter.java:2052)
    at java.base/java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1880)
    at com.splunk.dbx.server.dbinput.task.processors.ExtractIndexingTimeProcessor.extractTimestampFromString(ExtractIndexingTimeProcessor.java:112)
    at com.splunk.dbx.server.dbinput.task.processors.ExtractIndexingTimeProcessor.extractTimestamp(ExtractIndexingTimeProcessor.java:92)
    at com.splunk.dbx.server.dbinput.task.processors.ExtractIndexingTimeProcessor.processRecord(ExtractIndexingTimeProcessor.java:46)
    at org.easybatch.core.processor.CompositeRecordProcessor.processRecord(CompositeRecordProcessor.java:61)
    at org.easybatch.core.job.BatchJob.processRecord(BatchJob.java:209)
    at org.easybatch.core.job.BatchJob.readAndProcessBatch(BatchJob.java:178)
    at org.easybatch.core.job.BatchJob.call(BatchJob.java:101)
    at com.splunk.dbx.server.api.service.conf.impl.InputServiceImpl.runTask(InputServiceImpl.java:298)
    at com.splunk.dbx.server.api.resource.InputResource.lambda$runInput$1(InputResource.java:162)
    at com.splunk.dbx.logging.MdcTaskDecorator.run(MdcTaskDecorator.java:23)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    at java.base/java.lang.Thread.run(Thread.java:833)

 


VatsalJagani
SplunkTrust

@wmuselle 

  • For rising column
    • You should be able to use that column as a rising column without any change (see the sketch after this list).
  • For timestamp
    • There is no direct way to convert it to a timestamp, as you mentioned already.
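For example, a minimal sketch in advanced query mode (placeholder names again; DB Connect substitutes the stored checkpoint value of the rising column for the ? on each run):

SELECT
    t.*,
    TIMESTAMP '1970-01-01 00:00:00' + (t.EVENT_EPOCH / 86400) AS eventTime
FROM JOURNEY_EVENTS t
WHERE t.EVENT_EPOCH > ?
ORDER BY t.EVENT_EPOCH ASC

Here the raw epoch number EVENT_EPOCH works as the rising column as-is, while the derived eventTime from the workaround you mentioned is what has to serve as the timestamp column.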

 

I hope this helps! Kindly upvote if it does!


wmuselle
Path Finder
  • For timestamp
    • There is no direct way to convert it to a timestamp, as you mentioned already.

 

That's my point: I don't understand why there is no support for epoch timestamps in DB Connect.
