I have an azure app service with CUSTOM text log files (stored locally in app service filesystem). How can I index them in splunk?
I was thinking about the following approaches, but none of them worked:
You need to send your logs to App Fabric, Table Storage, Blob Storage, or Event Hubs, then pull the data into Splunk using one of the Microsoft cloud services add-ons. Note that the add-on doesn't support Event Hubs directly, but you can route an event hub to blob storage and read from there.
In fact, none of these add-ons is good enough. FTP Receiver sets up a local FTP server instead of reading logs from a remote one, and the other add-on can only read diagnostic logs.
Can you provide more info about the scripts? Do they run on the Splunk server? Can they work in real time? I have daily rolling text files, but I would like them indexed in real time, not only after they are rolled.
Do you have any examples of such a script?
First, you'll have to bear with me: I have zero experience with Azure and am only a newbie on AWS, so I may miss things that are simple in Azure.
Please see this - https://docs.splunk.com/Documentation/Splunk/7.2.6/Data/MonitorWMIdata#Security_and_remote_access_co...
Can you open a cmd prompt on your local machine and curl the remote machine to read the log files? If you can, then we can always set up a script. The key thing here is NOT the indexing, but how you connect from your local machine to your remote instance AND download the log data from the remote machine.
I suggest you search a bit for Python, shell, or curl commands/scripts that connect to a remote Azure instance and fetch logs from it. After that it's a cakewalk, and I can guide you from there. But first, can you (you have to, if your Splunk is on a different instance than the remote Azure instance) gather the logs from the remote instance to your local machine?