Hello,

Yes, I was able to find a good way to do it. I wanted to write a solution post on this topic but never had the chance, so I'll do it here, providing all the steps and configuration. To summarize, the approach I found is:

1. In Azure AKS, under Diagnostic settings (if I remember correctly), you can choose to spool the logs you need into a Storage Account or a streaming service (an Event Hub). If you don't need real time, go with the Storage Account, which is cheaper.

2. You then read from that Storage Account every 5 minutes with the Microsoft TA (Splunk Add-on for Microsoft Cloud Services).

3. You set up a lifecycle policy to delete data older than 7 days from your Storage Account. The retention period can be adjusted to your preference, but here the account acts mostly as a buffer, so costs stay under control. Also, regarding REST API billing, I honestly didn't see much of a difference.

4. The Microsoft TA modular input seems to have a bug: when I scheduled it every 5 minutes, it stopped working after several hours. As a workaround, I downloaded an app that provides an SPL command for reloading a given input endpoint, and embedded it in a scheduled search that runs every 5 minutes, while keeping the modular input scheduled at every hour. This way it is the scheduled report that triggers the data download. The schedule interval needs to be longer than the time it takes to download your data from the Storage Account and parse it.

5. Once you have downloaded the data, you have to parse it, removing the unwanted fields. Unfortunately it is JSON nested inside another JSON, and you need the inner one. I did this for AKS audit logs, but it can probably be adjusted easily for other log types.

As soon as I have some time I will provide the config as well.

Best Regards,
Edoardo
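For step 3, the 7-day buffer can be implemented with an Azure Blob Storage lifecycle management policy. A minimal sketch follows; the rule name and the container prefix (`insights-logs-kube-audit`, the container name Azure typically uses for that diagnostic category) are assumptions to adapt to your setup:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "delete-old-aks-logs",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 7 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "insights-logs-kube-audit" ]
        }
      }
    }
  ]
}
```

Raising or lowering `daysAfterModificationGreaterThan` adjusts how large the buffer is; it only needs to comfortably exceed your ingestion interval.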
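To illustrate step 5 (the JSON-inside-JSON problem), here is a minimal Python sketch. It assumes each outer diagnostic-log event carries the inner audit record as a JSON string under `properties.log`, as is the case for AKS kube-audit logs; the exact field path may differ for other log categories, so treat the names below as assumptions to verify against your own data:

```python
import json

def extract_inner_audit(event_str):
    """Parse an outer Azure diagnostic-log event and return the nested
    audit record it carries.

    Assumption: the inner audit JSON is a string stored at
    properties.log (true for AKS kube-audit); adjust the path
    for other log categories.
    """
    outer = json.loads(event_str)
    return json.loads(outer["properties"]["log"])

# Illustrative outer event (structure is a sketch, not an exact Azure sample).
sample = json.dumps({
    "time": "2023-01-01T00:00:00Z",
    "category": "kube-audit",
    "properties": {
        "log": json.dumps({
            "kind": "Event",
            "verb": "get",
            "user": {"username": "system:serviceaccount:kube-system:default"},
        })
    },
})

record = extract_inner_audit(sample)
print(record["verb"])  # -> get
```

In Splunk itself you would typically do the equivalent extraction at search time (e.g. with `spath`) or at index time with props/transforms, keeping only the inner record.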