Hi,
We need to have a copy of a big SQL table in a CSV file to speed up some lookups...
We retrieve the data with a saved search, scheduled to run every hour and save the result to a CSV file.
The search is like this:
| dbxquery maxrows=0
query="query string" connection="db_connection"
| fields field1, field2, field3, field4, field5, field6, field7, field8, field9
Adding maxrows=0 allows us to retrieve all the data. If we run the search through Splunk Web, we see 507,000 results.
If we use the API to get the results as explained in this link:
Exporting Large Result Sets to CSV
we get the full CSV, with 507,000 rows, and we can use it for lookups.
However, if we schedule the saved search and add a trigger action to export the results to a lookup CSV file, we only get 50,000 lines...
How can we save all 500,000 lines to a CSV file using the scheduler?
Thanks in advance!
Don't use outputcsv; use outputlookup instead.
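As a rough sketch of the difference (my_table.csv is just a placeholder name): outputcsv writes the file to $SPLUNK_HOME/var/run/splunk/csv/ on the search head, while outputlookup writes it to a lookups directory, so the file can then be used directly with the lookup and inputlookup commands.
... | outputcsv my_table.csv
... | outputlookup my_table.csv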
Hello @futurebroadband,
Your question looks similar to this one: How to overcome CSV max results to email?
So edit limits.conf as follows (and restart Splunk afterwards):
[scheduler]
max_action_results = 500000
[searchresults]
maxresultrows = 500000
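After the restart, you can check which values are actually in effect with btool, for example (paths assume a default $SPLUNK_HOME install):
$SPLUNK_HOME/bin/splunk btool limits list searchresults --debug
$SPLUNK_HOME/bin/splunk btool limits list scheduler --debug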
Alternatively, instead of using a trigger action, use the outputlookup command:
| dbxquery maxrows=0
query="query string" connection="db_connection"
| fields field1, field2, field3, field4, field5, field6, field7, field8, field9
| outputlookup results.csv
Schedule this report, without a trigger action. I don't think you will need to modify limits.conf in this case.
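Once the scheduled report has run at least once, the generated file behaves like any other CSV lookup. A quick sketch, reusing the placeholder field names from your query, to confirm all ~507,000 rows made it and to enrich other searches:
| inputlookup results.csv | stats count
... | lookup results.csv field1 OUTPUT field2 field3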