I don't know that you need a python script to do that. Simple bash will do it. But, if you want to go down that route, here is some help:
Example of a basic Python backup script:
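A minimal sketch of such a script, assuming the backup source and destination paths are placeholders you would adjust for your environment (the `backup` function name and the `/opt/...` paths are illustrative, not Splunk defaults):

```python
import shutil
import time
from pathlib import Path

# Hypothetical paths -- change SRC and DEST to match your setup.
SRC = Path("/opt/splunk/etc")
DEST = Path("/opt/backups")

def backup(src: Path = SRC, dest: Path = DEST) -> Path:
    """Create a timestamped .tar.gz archive of src inside dest
    and return the path to the archive that was written."""
    dest.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    # shutil.make_archive appends the ".tar.gz" suffix itself.
    archive = shutil.make_archive(
        str(dest / f"etc-{stamp}"), "gztar", root_dir=str(src)
    )
    return Path(archive)
```

Run it from cron (or any scheduler) and each invocation produces a new timestamped archive.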
And a closer example that only copies specific files and maintains the directory structure:
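A sketch of that variant, assuming you want to filter by a glob pattern such as `*.csv` (the `copy_matching` helper is a hypothetical name for illustration):

```python
import shutil
from pathlib import Path

def copy_matching(src: Path, dest: Path, pattern: str = "*.csv") -> int:
    """Copy files under src that match pattern into dest,
    recreating the relative directory structure.
    Returns the number of files copied."""
    count = 0
    for path in src.rglob(pattern):
        if path.is_file():
            target = dest / path.relative_to(src)
            # Make intermediate directories so the layout is preserved.
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)  # copy2 keeps timestamps/metadata
            count += 1
    return count
```

Because the relative paths are preserved, restoring a file is just copying it back to the same relative location under the source tree.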
And a cron formatter so that you can schedule it:
I highly recommend asking these types of questions over at Stack Exchange or Stack Overflow, as there are more people over there equipped to answer that type of question. Not that the Splunk community can't, but you'll have more success over there with this type of question.
A very lazy solution that keeps a daily copy for a week and a monthly copy for a year of the whole $SPLUNK_HOME/etc needs just two simple cron jobs:
50 23 * * * umask 0007 ; tar -czf /opt/splunk/etc-`date +"\%A"`.tgz /opt/splunk/etc > /dev/null 2>&1
55 23 1 * * umask 0007 ; tar -czf /opt/splunk/etc-`date +"\%B"`.tgz /opt/splunk/etc > /dev/null 2>&1
This uses about one GB of disk space and lets you recover from any mishap in the etc directory. You can of course use find to select just the csv files:
tar -czf /opt/splunk/etc-`date +"\%A"`.tgz `find /opt/splunk/etc -name "*.csv"`
No need for any (Python) script at all.
Hey @ksarode, did you find a solution for your problem? If not, do you want to describe in more detail what you are trying to accomplish, to see if someone has an answer for you?