Problem fetching a large number of rows from the database using the command dbxquery

solo
Engager

Hello

I am facing a problem while fetching a large number of rows from the database using the dbxquery command.

I have a table of roughly 500,000 rows. I would like to retrieve the data from this table and save it to a CSV file. The data in the table rarely changes, so I want to export it to CSV once rather than query it repeatedly. But when I try to fetch the data, the request hangs. I tried limiting the selection to a smaller number of rows, from 10,000 to 50,000 (via rownum...). Sometimes the query returns a result, and sometimes it hangs. It can work or freeze with the same limit, and I don't see any pattern. For example, I run the query `| dbxquery connection="connection_name" query="query condition... and rownum<50000"` and the result is returned; then I run the same query again without any changes and it hangs.

Has anyone come across a similar issue? How can it be solved?
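One pattern worth trying (a sketch, not a confirmed fix; `my_table`, the SQL, and the lookup file name are placeholders) is to cap the result set explicitly and write it straight to a lookup file in one pass, so the large result never sits in the browser:

```
| dbxquery connection="connection_name" query="SELECT * FROM my_table WHERE rownum <= 50000" maxrows=50000
| outputlookup my_table_export.csv
```

Running this as a scheduled saved search (rather than interactively) avoids UI timeouts, and since the data rarely changes, a low-frequency schedule with `outputlookup` keeps the CSV current. Note that `maxrows` availability and limits depend on your DB Connect version, so check its documentation for your release.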
