
Problem fetching a large number of rows from the database using the command dbxquery

solo
Engager

Hello

I am running into a problem when fetching a large number of rows from a database with the dbxquery command.

I have a table of roughly 500,000 rows. I would like to read the data from this table and save it to a CSV file; the data rarely changes, so a one-time export to CSV would be enough. However, when I try to fetch the data, the search hangs.

I tried limiting the selection to a smaller number of rows, from 10,000 to 50,000 (through rownum...). Sometimes the query returns a result, sometimes it still hangs. It can work or freeze with the same limit, and I cannot see any pattern. For example, I run the query | dbxquery connection="connection_name" query="query condition... and rownum<50000" and the result comes back; then I run exactly the same query again, without any changes, and it hangs.
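For reference, this is the general shape of the chunked export I have been attempting: fetching the table in rownum windows so each search only pulls about 50,000 rows, then writing each chunk out with outputcsv. The table name my_table, the id column used for ordering, and the output file name are placeholders for my real names, so treat this only as a sketch of the idea:

| dbxquery connection="connection_name" query="SELECT * FROM (SELECT t.*, ROWNUM rn FROM (SELECT * FROM my_table ORDER BY id) t WHERE ROWNUM <= 100000) WHERE rn > 50000"
| fields - rn
| outputcsv my_table_part_2.csv

Each run covers one window (here rows 50,001 to 100,000 of the ordered table), but even these smaller windows sometimes hang in the same unpredictable way.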

Has anyone come across something similar? How can this be solved?
