I have a Python script that takes .log files as input and produces .csv files. I upload these .csv files to Splunk and process them to create charts and statistics tables.
I wanted to know if there is an option to upload all the .log files to Splunk, have the Python script run on those uploaded .log files, and have the generated .csv files indexed into Splunk automatically... In short, a one-shot solution: at the beginning I just upload all the .log files, and then, maybe with the click of a button on a dashboard, Splunk creates all the .csv files and indexes them automatically, and I can write queries to create the charts.
Neither the current process nor the proposed one sounds like it's "one shot".
For more of a "one-shot" operation, schedule your Python script as a Splunk scripted input. It can read the .log files, convert them to CSV, and write the result to stdout, which Splunk will index directly. That eliminates the interim files.
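As a rough sketch of what such a scripted input could look like: the script below reads .log files and writes CSV rows to stdout for Splunk to index. The log format (`TIMESTAMP LEVEL MESSAGE`), the `LOG_DIR` path, and the `parse_line` helper are all hypothetical placeholders; adapt the parsing to your actual .log layout.

```python
#!/usr/bin/env python3
"""Scripted-input sketch: convert .log lines to CSV on stdout for Splunk to index.

Assumes a hypothetical log format of "DATE TIME LEVEL MESSAGE", e.g.:
    2024-05-01 10:15:30 ERROR disk full
Adjust parse_line() to match your real .log layout.
"""
import csv
import glob
import sys

LOG_DIR = "/var/log/myapp"  # hypothetical location of your .log files


def parse_line(line):
    """Split one log line into (timestamp, level, message); return None if malformed."""
    parts = line.strip().split(" ", 3)
    if len(parts) < 4:
        return None
    date, time, level, message = parts
    return (f"{date} {time}", level, message)


def main():
    writer = csv.writer(sys.stdout)
    writer.writerow(["timestamp", "level", "message"])  # CSV header row
    for path in glob.glob(f"{LOG_DIR}/*.log"):
        with open(path) as fh:
            for line in fh:
                row = parse_line(line)
                if row:
                    writer.writerow(row)


if __name__ == "__main__":
    main()
```

You would then register the script in inputs.conf under a `[script://...]` stanza with an `interval` and a `sourcetype`, and Splunk runs it on that schedule and indexes whatever it prints to stdout.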
--- If this reply helps you, Karma would be appreciated.