Activating Trace at the dbinput level worked.
I do get the meta information (source, sourcetype, index, host) as expected, but an "Error in handling indexed fields" occurs.
Could there be an error in processing the data? I received the data stream up to Aug. 31st but have been unable to resume the process since then.
2019-03-14 12:39:34.093 +0100 [QuartzScheduler_Worker-15] DEBUG c.s.d.s.dbinput.recordreader.DbInputRecordReader - action=closing_db_reader task=Alvin_Log
2019-03-14 12:39:34.093 +0100 [QuartzScheduler_Worker-15] INFO org.easybatch.core.job.BatchJob - Job 'Alvin_Log' finished with status: FAILED
2019-03-14 12:39:34.093 +0100 [QuartzScheduler_Worker-15] ERROR org.easybatch.core.job.BatchJob - Unable to write records
java.io.IOException: HTTP Error 400, HEC response body: {"text":"Error in handling indexed fields","code":15,"invalid-event-number":0}, trace: HttpResponseProxy{HTTP/1.1 400 Bad Request [Date: Thu, 14 Mar 2019 11:39:34 GMT, Content-Type: application/json; charset=UTF-8, X-Content-Type-Options: nosniff, Content-Length: 78, Vary: Authorization, Connection: Keep-Alive, X-Frame-Options: SAMEORIGIN, Server: Splunkd] ResponseEntityProxy{[Content-Type: application/json; charset=UTF-8,Content-Length: 78,Chunked: false]}}
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEventBatch(HttpEventCollector.java:132)
at com.splunk.dbx.server.dbinput.recordwriter.HttpEventCollector.uploadEvents(HttpEventCollector.java:96)
at com.splunk.dbx.server.dbinput.recordwriter.HecEventWriter.writeRecords(HecEventWriter.java:36)
at org.easybatch.core.job.BatchJob.writeBatch(BatchJob.java:203)
at org.easybatch.core.job.BatchJob.call(BatchJob.java:79)
at org.easybatch.extensions.quartz.Job.execute(Job.java:59)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2019-03-14 12:39:34.081 +0100 [QuartzScheduler_Worker-15] INFO c.s.d.s.dbinput.recordwriter.HttpEventCollector - action=writing_events_via_http_event_collector record_count=1000
2019-03-14 12:39:34.078 +0100 [QuartzScheduler_Worker-15] INFO c.s.dbx.server.dbinput.recordwriter.HecEventWriter - action=write_records batch_size=1000
2019-03-14 12:39:34.078 +0100 [QuartzScheduler_Worker-15] DEBUG c.s.d.s.dbinput.task.processors.EventMarshaller - action=start_format_hec_events_from_payload record=Record: {header=[RisingInputRecordHeader{risingColumnValue='2018-08-31 19:28:46.0'} number=1000, source="Alvin_Log", creationDate="2018-08-31 19:28:46.0"], payload=[EventPayload{fieldNames=[FID, AUFTRAG_NR, DATUM, FBG_NR, LOET_PROG, RECHNER, FID_MOB, CARRIER], row=[T-K828700909, 42421363, 2018-08-31 19:28:46.0, A5E36675927, 34, MD1KS4WC, E0040100269A0909, 4711100]}]}
2019-03-14 12:39:34.078 +0100 [QuartzScheduler_Worker-15] DEBUG c.s.d.s.dbinput.task.processors.EventMarshaller - action=finish_format_hec_events record=Record: {header=[RisingInputRecordHeader{risingColumnValue='2018-08-31 19:26:10.0'} number=998, source="Alvin_Log", creationDate="2018-08-31 19:26:10.0"], payload=[{"time":"1535736370,000","event":"2018-08-31 19:26:10.000, FID=\"T-K828700890\", AUFTRAG_NR=\"42421363\", DATUM=\"2018-08-31 19:26:10.0\", FBG_NR=\"A5E36675927\", LOET_PROG=\"34\", RECHNER=\"MD1KS4WC\", FID_MOB=\"E00401009EC8BAC8\", CARRIER=\"4711100\"","source":"Alvin_Log","sourcetype":"dwh","index":"dwh","host":"alvin_log"}]}
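One detail that stands out in the finish_format_hec_events line above: the marshalled payload contains "time":"1535736370,000", with a comma as the decimal separator (possibly a locale effect). HEC expects the time field as epoch seconds with a dot, e.g. 1535736370.000, and an unparseable time could plausibly trigger a 400 like this one. I'm not certain this is the cause, but it is easy to check locally; the sketch below uses the field values copied from the log (the event text is abbreviated):

```python
# Minimal local check of the HEC "time" field taken from the DEBUG log.
# Assumption: HEC requires "time" to parse as epoch seconds (float syntax,
# dot decimal separator), so a comma-formatted value would be rejected.
event = {
    "time": "1535736370,000",   # value as seen in the marshalled payload
    "source": "Alvin_Log",
    "sourcetype": "dwh",
    "index": "dwh",
    "host": "alvin_log",
}

def valid_hec_time(value):
    """Return True if the value parses as epoch seconds (float syntax)."""
    try:
        float(value)
        return True
    except (TypeError, ValueError):
        return False

print(valid_hec_time(event["time"]))                      # False (comma)
print(valid_hec_time(event["time"].replace(",", ".")))    # True  (dot)
```

If the comma variant fails this check, it would be worth looking at the locale of the JVM running DB Connect (e.g. forcing -Duser.language=en) to see whether the timestamps are then formatted with a dot and accepted by HEC.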