
Problem fetching a large number of rows from the database using the dbxquery command

solo
Engager

Hello

I am facing a problem when fetching a large number of rows from the database using the dbxquery command.

I have a table of roughly 500,000 rows. I would like to read the data from this table and save it to a CSV file. The data in the table rarely changes, so I want to export it to CSV once rather than query it repeatedly. But when I try to fetch the data, the request hangs. I tried limiting the selection to a smaller number of rows, from 10,000 to 50,000 (via rownum...). Sometimes the query returns a result, sometimes it hangs as well. Moreover, it can work or freeze with the same limit, and I see no pattern. For example, I run the query | dbxquery connection="connection_name" query="query condition... and rownum<50000" and the result is returned; then I run the same query again without any changes and it hangs.
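For reference, the full export I am trying to run looks roughly like this (connection_name, my_table, and my_table_export are placeholders; fetchsize, maxrows, and timeout are DB Connect 3.x dbxquery options, so the names and defaults may differ in your version):

| dbxquery connection="connection_name" query="SELECT * FROM my_table" fetchsize=10000 maxrows=0 timeout=600
| outputcsv my_table_export.csv

And the chunked variant I tried, assuming an Oracle-style ROWNUM (the rn alias is only illustrative):

| dbxquery connection="connection_name" query="SELECT * FROM (SELECT t.*, ROWNUM AS rn FROM my_table t WHERE ROWNUM <= 50000) WHERE rn > 40000"
| outputcsv append=true my_table_export.csv

The plan is to run this once (or as a scheduled search) and keep the CSV, since the data rarely changes.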

Has anyone come across a similar issue? How can it be solved?
