Hello
I am facing a problem while fetching a large number of rows from the database using the `dbxquery` command.
I have a table of ~500,000 rows. I would like to pull the data from this table and save it to a CSV file. The data in the table rarely changes, so I want to export it to a CSV file once. But when I try to fetch the data, the query hangs. I tried limiting the selection to a smaller number of rows, from 10,000 to 50,000 (through rownum...). Sometimes the query returns a result, sometimes it still hangs. Moreover, the same limit can work one time and freeze the next, and I see no pattern.

For example, I run this query: | dbxquery connection="connection_name" query="query condition... and rownum<50000" and the result is returned. Then I run the same query again without any changes and it hangs.
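For context, outside of Splunk the kind of batched export I am after can be sketched with plain DB-API paging. This is only an illustration of the idea, not my actual setup: it uses an in-memory SQLite table as a stand-in for the real database, and all table/column names here are hypothetical. It pages by remembering the last key seen (keyset pagination) instead of a rownum cutoff:

```python
import csv
import sqlite3

PAGE_SIZE = 100  # in the real case I was trying 10,000-50,000 per batch

# Stand-in for the real database: an in-memory SQLite table with fake rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big_table (id INTEGER PRIMARY KEY, value TEXT)")
conn.executemany(
    "INSERT INTO big_table (id, value) VALUES (?, ?)",
    [(i, f"row-{i}") for i in range(1, 251)],
)

def export_to_csv(conn, path):
    """Export big_table to CSV in pages, keyed on the last id seen."""
    last_id = 0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "value"])
        while True:
            # Fetch the next page strictly after the last id we wrote.
            rows = conn.execute(
                "SELECT id, value FROM big_table "
                "WHERE id > ? ORDER BY id LIMIT ?",
                (last_id, PAGE_SIZE),
            ).fetchall()
            if not rows:
                break
            writer.writerows(rows)
            last_id = rows[-1][0]

export_to_csv(conn, "big_table.csv")
```

I mention this only to show what I am trying to achieve through dbxquery; ideally the whole export would happen on the Splunk side.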
Has anyone come across something similar? How can this be solved?