All Apps and Add-ons

Splunk DB Connect error DbxOutputCommand.main(DbxOutputCommand.java:100) Caused by: java.lang.NullPointerException

edoardo_vicendo
Contributor

Hi All,

I hit this error and it took a while to understand and fix it.

Here is my environment:

  • Splunk 8.0.5
  • Splunk DB Connect 3.6.0
  • Java /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.275.b01-0.el6_10.x86_64/jre/bin/java
  • Red Hat Enterprise Linux Server release 6.10 (Santiago)
  • Target DB is PostgreSQL

We have several queries, all running properly; just one was giving the error.

The query is the following:

 

index=myindex sourcetype=mysourcetype etc…
| dbxoutput output=my_stanza

 

"my_stanza" refers to a stanza defined in db_outputs.conf
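For readers unfamiliar with dbxoutput, the stanza lives in db_outputs.conf on the search head. A hypothetical sketch (stanza name from this post; the connection name is mine, and the exact set of keys depends on your DB Connect version — check your own db_outputs.conf, which the DB Connect UI generates for you):

```
[my_stanza]
connection = my_postgres_connection
table_name = myschema.mytable
```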

The error in Splunk Search Head was:

 

rx.exceptions.OnErrorNotImplementedException
	at rx.internal.util.InternalObservableUtils$ErrorNotImplementedAction.call(InternalObservableUtils.java:386)
	at rx.internal.util.InternalObservableUtils$ErrorNotImplementedAction.call(InternalObservableUtils.java:383)
	at rx.internal.util.ActionSubscriber.onError(ActionSubscriber.java:44)
	at rx.observers.SafeSubscriber._onError(SafeSubscriber.java:153)
	at rx.observers.SafeSubscriber.onError(SafeSubscriber.java:115)
	at rx.exceptions.Exceptions.throwOrReport(Exceptions.java:212)
	at rx.observers.SafeSubscriber.onNext(SafeSubscriber.java:139)
	at rx.internal.operators.OperatorBufferWithSize$BufferExact.onCompleted(OperatorBufferWithSize.java:128)
	at rx.internal.operators.OnSubscribeMap$MapSubscriber.onCompleted(OnSubscribeMap.java:97)
	at rx.internal.operators.OperatorPublish$PublishSubscriber.checkTerminated(OperatorPublish.java:423)
	at rx.internal.operators.OperatorPublish$PublishSubscriber.dispatch(OperatorPublish.java:505)
	at rx.internal.operators.OperatorPublish$PublishSubscriber.onCompleted(OperatorPublish.java:305)
	at rx.internal.operators.OnSubscribeFromIterable$IterableProducer.slowPath(OnSubscribeFromIterable.java:134)
	at rx.internal.operators.OnSubscribeFromIterable$IterableProducer.request(OnSubscribeFromIterable.java:89)
	at rx.Subscriber.setProducer(Subscriber.java:211)
	at rx.internal.operators.OnSubscribeFromIterable.call(OnSubscribeFromIterable.java:63)
	at rx.internal.operators.OnSubscribeFromIterable.call(OnSubscribeFromIterable.java:34)
	at rx.Observable.unsafeSubscribe(Observable.java:10327)
	at rx.internal.operators.OperatorPublish.connect(OperatorPublish.java:214)
	at rx.observables.ConnectableObservable.connect(ConnectableObservable.java:52)
	at com.splunk.dbx.command.DbxOutputCommand.process(DbxOutputCommand.java:161)
	at com.splunk.search.command.StreamingCommand.process(StreamingCommand.java:58)
	at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:109)
	at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
	at com.splunk.search.command.StreamingCommand.run(StreamingCommand.java:16)
	at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100)
Caused by: java.lang.NullPointerException
	at java.math.BigDecimal.<init>(BigDecimal.java:809)
	at com.splunk.dbx.service.output.OutputServiceImpl.setParameterAsObject(OutputServiceImpl.java:288)
	at com.splunk.dbx.service.output.OutputServiceImpl.setParameter(OutputServiceImpl.java:270)
	at com.splunk.dbx.service.output.OutputServiceImpl.processInsertion(OutputServiceImpl.java:216)
	at com.splunk.dbx.service.output.OutputServiceImpl.output(OutputServiceImpl.java:76)
	at rx.internal.util.ActionSubscriber.onNext(ActionSubscriber.java:39)
	at rx.observers.SafeSubscriber.onNext(SafeSubscriber.java:134)
	... 19 more


Looking at search.log from the job inspector:

 

12-03-2021 17:26:18.187 INFO  DispatchExecutor - END OPEN: Processor=noop
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr: Exception in thread "main" java.lang.IllegalStateException: I/O operation on closed writer
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.AbstractWriteHandler.checkValidity(AbstractWriteHandler.java:100)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.AbstractWriteHandler.flush(AbstractWriteHandler.java:228)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.ChunkedWriteHandler.flush(ChunkedWriteHandler.java:69)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.AbstractWriteHandler.close(AbstractWriteHandler.java:233)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.ChunkedCommandDriver.execute(ChunkedCommandDriver.java:120)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.AbstractSearchCommand.run(AbstractSearchCommand.java:50)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.search.command.StreamingCommand.run(StreamingCommand.java:16)
12-03-2021 17:26:18.188 ERROR ChunkedExternProcessor - stderr:           at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100)


I solved it this way (adding fillnull):

 

index=myindex sourcetype=mysourcetype etc…
| fillnull value=0.00 mbytes_in | fillnull value=0.00 mbytes_out
| dbxoutput output=my_stanza


There were 2 records in the extraction whose "mbytes_in" and "mbytes_out" fields had no value. Note that with this workaround the database now receives 0.00 for those rows instead of NULL, which was acceptable in our case.

I am sure it was working properly before we upgraded to Splunk DB Connect 3.6.0.

The target DB is PostgreSQL and the table is defined as below. As you can see, "mbytes_in" and "mbytes_out" accept NULL values (and I can see several records in the PostgreSQL DB, populated in the past, where "mbytes_in" and "mbytes_out" are NULL).

Here is the table definition in PostgreSQL:

 

CREATE TABLE myschema.mytable
(
    field01 integer NOT NULL,
    field02 character varying(6) NOT NULL,
    field03 character varying(6),
    field04 character varying(15) NOT NULL,
    field05 timestamp(6) with time zone NOT NULL,
    mbytes_in numeric(12, 2),
    mbytes_out numeric(12, 2),
    field06 character varying(15) NOT NULL,
    field07 character varying(50),
    field08 character varying(50),
    field09 character varying(50) NOT NULL,
    field10 character varying(255) NOT NULL,
    field11 character varying(15) NOT NULL,
    field12 character varying(255),
    field13 date NOT NULL,
    field14 character varying(255) NOT NULL,
    CONSTRAINT my_pkey PRIMARY KEY (field01)
)
WITH (
    OIDS = FALSE
)
TABLESPACE mytablespace;

ALTER TABLE myschema.mytable
    OWNER to myuser;

GRANT ALL ON TABLE myschema.mytable TO myuser;


The log line that pointed me to the solution was the following:

at com.splunk.dbx.command.DbxOutputCommand.main(DbxOutputCommand.java:100) Caused by: java.lang.NullPointerException at java.math.BigDecimal.<init>(BigDecimal.java:809)
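That "Caused by" frame is consistent with DB Connect handing a null string for the empty "mbytes_in"/"mbytes_out" fields straight to the BigDecimal constructor, which throws NullPointerException. A minimal standalone sketch (not DB Connect code; class and method names are mine) reproducing the failure, together with the kind of guard that the fillnull workaround effectively provides upstream:

```java
import java.math.BigDecimal;

public class BigDecimalNullDemo {

    // Hypothetical safe variant: map a missing value to a fallback
    // instead of letting the BigDecimal constructor throw.
    static BigDecimal toBigDecimalOrDefault(String raw, BigDecimal fallback) {
        return (raw == null || raw.isEmpty()) ? fallback : new BigDecimal(raw);
    }

    public static void main(String[] args) {
        String empty = null; // what an event with no mbytes_in value ends up as
        try {
            new BigDecimal(empty); // same NPE as the DB Connect stack trace
        } catch (NullPointerException e) {
            System.out.println("NullPointerException from new BigDecimal(null)");
        }
        // The fillnull workaround effectively does this before the data
        // reaches DB Connect:
        System.out.println(toBigDecimalOrDefault(empty, new BigDecimal("0.00")));
        // prints 0.00
    }
}
```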

 

By the way, no useful logs were present in the Splunk _internal index. Usually, when an SPL query fails to insert into our PostgreSQL DB, I find valuable information there, like SQL codes and SQL errors; this time there was nothing.

I hope this post will help someone having the same issue.

Best Regards,
Edoardo


richgalloway
SplunkTrust
SplunkTrust

The solution is in the "question".

---
If this reply helps you, Karma would be appreciated.
