Getting Data In

Why does storing data from dbxquery fail with "Unable to process Record"?

ron451
Engager

Hi,

In "splunk_app_db_connect" I've defined this input configuration:

[ALERT_SNO_MISMATCH]
connection = PDBAPP_SYSTEM_SCAN
description = Search every 5 minutes for "SNO mismatch"
disabled = 0
index = temp
index_time_mode = dbColumn
input_timestamp_column_number = 1
input_timestamp_format = yyyy-MM-dd HH:mm:ss.SSS
interval = */5 * * * *
mode = batch
query = select to_char(ORIGINATING_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS.FF3') AS TIME, MESSAGE_TEXT\
from v$diag_alert_ext\
where component_id like '%rdbms%'\
and message_text like '%SNO mismatch for LOCAL TRAN%'\
and originating_timestamp>sysdate-1/288;
query_timeout = 300
sourcetype = csv
fetch_size = 10000
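
One thing I am wondering about (not sure this is the cause): with index_time_mode = dbColumn, DB Connect has to parse column 1 of every row using input_timestamp_format, and a failed parse would reject the whole record. An untested variant I could try, handing DB Connect the native TIMESTAMP instead of a TO_CHAR string:

# untested sketch: return ORIGINATING_TIMESTAMP directly instead of a TO_CHAR string
query = select ORIGINATING_TIMESTAMP AS TIME, MESSAGE_TEXT\
from v$diag_alert_ext\
where component_id like '%rdbms%'\
and message_text like '%SNO mismatch for LOCAL TRAN%'\
and originating_timestamp > sysdate - 1/288
# with a real TIMESTAMP column, input_timestamp_format should no longer be needed
input_timestamp_column_number = 1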

Running the SQL statement in SQL Explorer, I get a result set like this:

TIME MESSAGE_TEXT
2020-10-21 15:30:08.379 SNO mismatch for LOCAL TRAN 14.4.82391
2020-10-21 15:31:07.907 SNO mismatch for LOCAL TRAN 11.30.78254
2020-10-21 15:31:08.709 SNO mismatch for LOCAL TRAN 27.33.68134
2020-10-21 15:31:42.296 SNO mismatch for LOCAL TRAN 20.28.52198


But when the DB input job runs on its schedule, it always fails with this error:
2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=1, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}


2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=2, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}

The error is logged for every row, and in the end nothing is stored in the index.
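
In case it helps: the error above is all I have found so far. I assume the full stack trace ends up in the DB Connect server log, so a search along these lines (guessing at the default DB Connect 3.x log file name) might show the underlying exception:

index=_internal source=*splunk_app_db_connect_server.log* "ALERT_SNO_MISMATCH" ERROR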
Any idea what I'm doing wrong?

Thanks in advance, Aaron


amartin6
Path Finder

Were you able to find a solution for this?
