<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Topic in Getting Data In: How to store data from dbxquery fails with &quot;unable to process Record&quot;?</title>
    <link>https://community.splunk.com/t5/Getting-Data-In/How-to-store-data-from-dbxquery-fails-with-quot-unable-to/m-p/525808#M88745</link>
    <description>Topic in Getting Data In: storing data from a Splunk DB Connect input fails with "unable to process Record".</description>
    <pubDate>Thu, 14 Sep 2023 16:30:55 GMT</pubDate>
    <dc:creator>ron451</dc:creator>
    <dc:date>2023-09-14T16:30:55Z</dc:date>
    <item>
      <title>How to store data from dbxquery fails with "unable to process Record"?</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-store-data-from-dbxquery-fails-with-quot-unable-to/m-p/525808#M88745</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;In "&lt;EM&gt;splunk_app_db_connect&lt;/EM&gt;" I've defined this input configuration:&lt;/P&gt;
&lt;P&gt;&lt;FONT face="courier new,courier"&gt;[ALERT_SNO_MISMATCH]&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;connection = PDBAPP_SYSTEM_SCAN&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;description = Search every 5 minutes for "SNO mismatch"&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;disabled = 0&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;index = temp&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;index_time_mode = dbColumn&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;input_timestamp_column_number = 1&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;input_timestamp_format = yyyy-MM-dd HH:mm:ss.SSS&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;interval = */5 * * * *&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;mode = batch&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;query = select to_char(ORIGINATING_TIMESTAMP,'YYYY-MM-DD HH24:MI:SS.FF3') AS TIME, MESSAGE_TEXT\&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;from v$diag_alert_ext\&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;where component_id like '%rdbms%'\&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;and message_text like '%SNO mismatch for LOCAL TRAN%'\&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;and originating_timestamp&amp;gt;sysdate-1/288;&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;query_timeout = 300&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;sourcetype = csv&lt;/FONT&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;fetch_size = 10000&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;Running the SQL statement in SQL Explorer, I receive a result set like this:&lt;/P&gt;
&lt;TABLE border="1" width="100%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="50%"&gt;&lt;STRONG&gt;TIME&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="50%"&gt;&lt;STRONG&gt;MESSAGE_TEXT&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="50%"&gt;2020-10-21 15:30:08.379&lt;/TD&gt;
&lt;TD width="50%"&gt;SNO mismatch for LOCAL TRAN 14.4.82391&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="50%"&gt;2020-10-21 15:31:07.907&lt;/TD&gt;
&lt;TD width="50%"&gt;SNO mismatch for LOCAL TRAN 11.30.78254&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="50%"&gt;2020-10-21 15:31:08.709&lt;/TD&gt;
&lt;TD width="50%"&gt;SNO mismatch for LOCAL TRAN 27.33.68134&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="50%"&gt;2020-10-21 15:31:42.296&lt;/TD&gt;
&lt;TD width="50%"&gt;SNO mismatch for LOCAL TRAN 20.28.52198&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
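One thing worth verifying with a setup like this (a minimal sanity check sketched in Python; DB Connect itself parses timestamps with Java's SimpleDateFormat, so the strptime pattern below is only my rough equivalent of `yyyy-MM-dd HH:mm:ss.SSS`, not DB Connect's actual parser) is that the TIME strings produced by `TO_CHAR` really do match the configured `input_timestamp_format`:

```python
from datetime import datetime

# First TIME value from the result set above. The Java pattern
# yyyy-MM-dd HH:mm:ss.SSS corresponds roughly to this strptime pattern;
# %f accepts the 3-digit fraction and reads .379 as 379000 microseconds.
sample = "2020-10-21 15:30:08.379"
parsed = datetime.strptime(sample, "%Y-%m-%d %H:%M:%S.%f")
print(parsed.isoformat())  # 2020-10-21T15:30:08.379000
```

If a row's TIME value did not match the configured pattern, DB Connect might fail to derive `_time` for that record, which could be one cause of per-row processing errors.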
&lt;P&gt;But every time the scheduled DB input job runs, it fails with this error:&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - &lt;STRONG&gt;Unable to process Record: {header=[number=1&lt;/STRONG&gt;, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;&lt;FONT face="courier new,courier"&gt;2020-10-21 15:30:33.144 +0200 [QuartzScheduler_Worker-21] ERROR org.easybatch.core.job.BatchJob - Unable to process Record: {header=[number=2, source="ALERT_SNO_MISMATCH", creationDate="Wed Oct 21 15:30:33 CEST 2020"], payload=[HikariProxyResultSet@1360101480 wrapping oracle.jdbc.driver.ForwardOnlyResultSet@409d22a]}&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;This error is raised for every row, and in the end nothing is stored in the index.&lt;BR /&gt;Any idea what I'm doing wrong?&lt;/P&gt;
&lt;P&gt;Thanks in advance, Aaron&lt;/P&gt;</description>
      <pubDate>Thu, 14 Sep 2023 16:30:55 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-store-data-from-dbxquery-fails-with-quot-unable-to/m-p/525808#M88745</guid>
      <dc:creator>ron451</dc:creator>
      <dc:date>2023-09-14T16:30:55Z</dc:date>
    </item>
    <item>
      <title>Re: storing data from dbxquery fails with "unable to process Record"</title>
      <link>https://community.splunk.com/t5/Getting-Data-In/How-to-store-data-from-dbxquery-fails-with-quot-unable-to/m-p/657484#M111291</link>
      <description>&lt;P&gt;Were you able to find a solution for this?&lt;/P&gt;</description>
      <pubDate>Wed, 13 Sep 2023 20:22:33 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Getting-Data-In/How-to-store-data-from-dbxquery-fails-with-quot-unable-to/m-p/657484#M111291</guid>
      <dc:creator>amartin6</dc:creator>
      <dc:date>2023-09-13T20:22:33Z</dc:date>
    </item>
  </channel>
</rss>

