Getting Data In

Why does storing data from dbxquery fail with "Unable to process Record"?

ron451
Engager

Hi,

In "splunk_app_db_connect" I've defined this input configuration:

[ALERT_SNO_MISMATCH]
connection = PDBAPP_SYSTEM_SCAN
description = Search every 5 minutes for "SNO mismatch"
disabled = 0
index = temp
index_time_mode = dbColumn
input_timestamp_column_number = 1
input_timestamp_format = yyyy-MM-dd HH:mm:ss.SSS
interval = */5 * * * *
mode = batch
query = select to_char(ORIGINATING_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS.FF3') AS TIME, MESSAGE_TEXT\
from v$diag_alert_ext\
where component_id like '%rdbms%'\
and message_text like '%SNO mismatch for LOCAL TRAN%'\
and originating_timestamp>sysdate-1/288;
query_timeout = 300
sourcetype = csv
fetch_size = 10000
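
For what it's worth, the pattern itself looks fine: input_timestamp_format appears to be a Java-style date pattern, and a sample TIME value parses cleanly with plain SimpleDateFormat (a quick standalone check, assuming DB Connect uses SimpleDateFormat semantics for this setting; this is not DB Connect's actual code):

import java.text.SimpleDateFormat;

public class TimestampFormatCheck {
    public static void main(String[] args) throws Exception {
        // The pattern from input_timestamp_format in the stanza above.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
        fmt.setLenient(false); // fail loudly on any mismatch

        // A value shaped like the TO_CHAR(..., 'YYYY-MM-DD HH24:MI:SS.FF3') output.
        System.out.println(fmt.parse("2020-10-21 15:30:08.379"));
    }
}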

Running the SQL statement in SQL Explorer, I receive a result set like this:

TIME                     MESSAGE_TEXT
2020-10-21 15:30:08.379  SNO mismatch for LOCAL TRAN 14.4.82391
2020-10-21 15:31:07.907  SNO mismatch for LOCAL TRAN 11.30.78254
2020-10-21 15:31:08.709  SNO mismatch for LOCAL TRAN 27.33.68134
2020-10-21 15:31:42.296  SNO mismatch for LOCAL TRAN 20.28.52198
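
Reading the same rows over plain JDBC also looks sane; note that JDBC columns are 1-based, which is what input_timestamp_column_number = 1 should line up with (the URL and credentials below are placeholders, not my real PDBAPP_SYSTEM_SCAN settings):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ReadAlertRows {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; substitute your own.
        String url = "jdbc:oracle:thin:@//dbhost:1521/PDBAPP";
        String sql = "select to_char(ORIGINATING_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS.FF3') AS TIME,"
                   + " MESSAGE_TEXT from v$diag_alert_ext"
                   + " where component_id like '%rdbms%'"
                   + " and message_text like '%SNO mismatch for LOCAL TRAN%'"
                   + " and originating_timestamp > sysdate - 1/288";
        try (Connection con = DriverManager.getConnection(url, "system", "password");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                // Column 1 is TIME, column 2 is MESSAGE_TEXT (JDBC is 1-based).
                System.out.println(rs.getString(1) + " | " + rs.getString(2));
            }
        }
    }
}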

But every time the scheduled DB input job runs, it fails with this error:
2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=1, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}


2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=2, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}

The error is raised for every row, and in the end nothing is stored in the index.
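
From what I can tell, that logger (org.easybatch.core.job.BatchJob) belongs to the Easy Batch framework that DB Connect uses internally, and it prints this message once per record when the per-record processing step throws, so the swallowed exception is what would actually explain the failure. Schematically (an illustration of the logging behavior only, not Easy Batch's real source):

import java.util.List;

// Illustration: the shape of a batch loop that emits one
// "Unable to process Record" line per row when the processor throws.
public class BatchLoopSketch {
    interface Processor {
        void process(String record) throws Exception;
    }

    static void run(List<String> records, Processor processor) {
        int number = 0;
        for (String record : records) {
            number++;
            try {
                processor.process(record);
            } catch (Exception e) {
                // Only this summary line reaches the log; the caught
                // exception e is the real root cause.
                System.err.println("ERROR BatchJob - Unable to process Record: "
                        + "{header=[number=" + number + ", ...], payload=[...]}");
            }
        }
    }

    public static void main(String[] args) {
        // A processor that always fails, e.g. on a value it cannot parse.
        run(List.of("row1", "row2"), r -> {
            throw new Exception("cannot interpret TIME value");
        });
    }
}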
Any idea what I'm doing wrong?

Thanks in advance, Aaron


amartin6
Path Finder

Were you able to find a solution for this?
