
Why is DB Connect 3 sometimes unable to write records? (read timed out, unable_to_write_batch)

dailv1808
Path Finder

I have DB Connect 3.1.5 running on a Splunk 8.1 instance.

Sometimes I get the errors below, and ingestion stops.

[screenshots: "read timed out" and "unable_to_write_batch" error messages]


codebuilder
Influencer

That generally means your query is running too long or returning too many results, and you've exhausted the allocated memory or hit a config limit.

----
An upvote would be appreciated and Accept Solution if it helps!

dailv1808
Path Finder
Thanks for your response. In my case, I think it is returning too many results. Is there any way to fix it?

codebuilder
Influencer

You can limit results by adding "maxrows" to your dbxquery.

maxrows=1000 for example. You can test with different values until you find one that doesn't cause the issue.
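For example, a minimal sketch of the search (the connection and table names here are placeholders, not from the original post):

```
| dbxquery connection="my_db_connection" query="SELECT * FROM my_table" maxrows=1000
```

If the errors stop at a given value, the size of the result set was the problem and you can tune upward from there.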


dailv1808
Path Finder

Error on the HF (where DBC3 is installed):

[screenshot: error message]


dailv1808
Path Finder

I tried many different "maxrows" values, from 1000 up to unlimited, and fetch size values from 300 to 1000, 3000, 30000, etc. But I still get the above errors.
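For reference, a sketch of where these settings live for a scheduled input on the HF, assuming a DB Connect 3 stanza in db_inputs.conf (stanza and connection names are placeholders; verify the parameter names against your version's db_inputs.conf.spec):

```
# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_inputs.conf
[my_db_input]
connection = my_db_connection
# rows fetched from the database per round trip
fetch_size = 1000
# cap on total rows returned per execution
max_rows = 100000
```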


isoutamo
SplunkTrust
SplunkTrust
Another thing you could try is extending the timeout.
R. Ismo
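If the "read timed out" comes from the JDBC query itself, the per-input timeout can be raised. A sketch, assuming DB Connect 3's db_inputs.conf supports a query_timeout setting in seconds (check your version's spec file; the stanza name is a placeholder):

```
# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_inputs.conf
[my_db_input]
# allow long-running queries up to 10 minutes before timing out
query_timeout = 600
```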

dailv1808
Path Finder

Sometimes I get an HEC service error:

[screenshot: HEC service error]


dailv1808
Path Finder

I changed the timeout to 300 seconds; it still gets the errors.
