At this time, I do a nightly log import (stored in a SQL table) for one of our applications. Recently, new data sets going back to last year were added to the SQL table from the app to help with log analysis.
Now I have been asked whether I can refresh the Splunk data, which we use for alerts, graphs, lookups, etc. While I have updated all of these for data going forward, I'm not sure how to pull in all the previous data without creating a new index or purging the current one, then re-running the SQL query 1,000,000 records at a time.
Does anyone know if I can import a SQL dump into an index via DB Connect?
AFAIK, DB Connect cannot import a SQL dump file. If you can convert the dump to CSV format, you can import it in the usual Splunk way. To avoid duplicate data, you'll need to delete the old data first or use a new index.
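For what it's worth, once the dump is in CSV form you can have Splunk ingest it from a drop directory. A minimal sketch of the config — the directory path, index name, sourcetype, and timestamp column here are all assumptions you'd adapt to your data:

```ini
# inputs.conf -- monitor a drop directory for the exported CSV files
[monitor:///opt/exports/app_logs/*.csv]
index = app_logs_history
sourcetype = app_logs_csv

# props.conf -- parse as header-row CSV, taking event time from a column
[app_logs_csv]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = logged_at
```

A one-shot upload (`splunk add oneshot <file> -index ...`) also works if you'd rather not leave a monitored input in place.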
--- If this reply helps you, an upvote would be appreciated.
That is what I feared. The plan was to purge the existing data and then re-import, but to cover the requested time frame I'd need to pull in over 20 million records in parts, which means either rewriting the query many times over or finding another way.
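One way to avoid hand-editing the query for each chunk is keyset pagination: select batches ordered by the primary key, remembering the last id seen, and write each batch to its own CSV for Splunk to pick up. A rough sketch using sqlite3 as a stand-in for the real database — the table name, columns, and batch size are all hypothetical:

```python
import csv
import sqlite3

BATCH_SIZE = 3  # use e.g. 1_000_000 against the real table

def export_in_batches(conn, batch_size=BATCH_SIZE):
    """Yield (batch_number, rows) chunks ordered by primary key."""
    last_id = 0
    batch_no = 0
    while True:
        # Keyset pagination: resume after the highest id already exported.
        rows = conn.execute(
            "SELECT id, logged_at, message FROM app_logs "
            "WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        batch_no += 1
        last_id = rows[-1][0]
        yield batch_no, rows

def write_batch_csv(batch_no, rows):
    """Write one batch to its own CSV file with a header row."""
    path = f"app_logs_batch_{batch_no:04d}.csv"
    with open(path, "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["id", "logged_at", "message"])
        w.writerows(rows)
    return path

if __name__ == "__main__":
    # In-memory stand-in for the real SQL source, with a few sample rows.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE app_logs "
        "(id INTEGER PRIMARY KEY, logged_at TEXT, message TEXT)"
    )
    conn.executemany(
        "INSERT INTO app_logs (logged_at, message) VALUES (?, ?)",
        [(f"2023-01-{d:02d}", f"event {d}") for d in range(1, 8)],
    )
    for batch_no, rows in export_in_batches(conn):
        print(write_batch_csv(batch_no, rows), len(rows))
```

Because each batch restarts from the last id rather than using OFFSET, the per-batch cost stays flat even across 20 million rows, and you only ever edit one script, not the query.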