Azure Kubernetes Service (AKS) - log ingestion with Splunk

edoardo_vicendo
Contributor

Hi,

I am trying to understand the best and most cost-effective approach to ingesting logs from Azure AKS into Splunk Enterprise with Enterprise Security.

The logs we have to collect are mainly for security purposes.

Here are the options I have found:

  • Use the "Splunk OpenTelemetry Collector for Kubernetes" (a minimal Helm sketch follows this list)

https://docs.splunk.com/Documentation/SVA/current/Architectures/OTelKubernetes

  • Use Cloud facilities to export the logs to Storage Accounts
  • Use Cloud facilities to export the logs to Event Hubs
  • Use Cloud facilities to send syslog to a Log Analytics workspace

https://learn.microsoft.com/en-us/azure/azure-monitor/containers/container-insights-syslog
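
For reference, option 1 would boil down to a Helm install, something like the sketch below (cluster name, HEC endpoint, token, and index are placeholders):

    # Minimal sketch of the Splunk OpenTelemetry Collector Helm deployment.
    # Cluster name, HEC endpoint/token, and index below are placeholders.
    helm repo add splunk-otel-collector-chart https://signalfx.github.io/splunk-otel-collector-chart
    helm install splunk-otel-collector splunk-otel-collector-chart/splunk-otel-collector \
      --set clusterName=my-aks-cluster \
      --set splunkPlatform.endpoint=https://splunk.example.com:8088/services/collector \
      --set splunkPlatform.token=00000000-0000-0000-0000-000000000000 \
      --set splunkPlatform.index=aks_logs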

 

References:

https://learn.microsoft.com/en-us/azure/azure-monitor/containers/monitor-kubernetes

https://learn.microsoft.com/en-us/azure/aks/monitor-aks

https://learn.microsoft.com/en-us/azure/azure-monitor/logs/logs-data-export?tabs=portal

https://learn.microsoft.com/en-us/azure/architecture/aws-professional/eks-to-aks/monitoring

https://learn.microsoft.com/en-us/azure/azure-monitor/logs/log-analytics-workspace-overview

 

Is there a way to use cloud facilities to stream the logs directly to Splunk, so that we can avoid deploying the OTel Collector?

Otherwise, if we must first land the logs in a Log Analytics workspace, Storage Account, or Event Hub and then pull them into Splunk via API calls with the "Splunk Add-on for Microsoft Cloud Services" or the "Microsoft Azure Add-on for Splunk", which is the most cost-effective approach?

Thanks a lot,

Edoardo


RP-TSB
Engager

Have you found a solution for this? I'm on the same quest.


edoardo_vicendo
Contributor

Hello,

Yes, I was able to find a good way to do it. I wanted to write a solution post for this topic but never had the chance; I'll do it eventually, providing all the steps and config. To summarize, the approach I found is:

1- In Azure AKS, under diagnostic settings (if I remember correctly), you can choose to spool the logs you need into a Storage Account or a streaming service (Event Hubs). If you don't need real time, go with a Storage Account, which is cheaper.
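
For example, with the Azure CLI (resource group, cluster, and storage account names are placeholders; pick the log categories you actually need, e.g. kube-audit for security):

    # Sketch: route AKS control-plane logs to a Storage Account via diagnostic settings.
    # Resource group, cluster, and storage account names below are placeholders.
    az monitor diagnostic-settings create \
      --name aks-logs-to-storage \
      --resource $(az aks show -g my-rg -n my-aks-cluster --query id -o tsv) \
      --storage-account my-logs-storage-account \
      --logs '[{"category": "kube-audit", "enabled": true}]'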

2- You then read from that Storage Account every 5 minutes with the Microsoft TA.
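
Assuming the TA in question is the "Splunk Add-on for Microsoft Cloud Services" (the one with a Storage Account blob input), the input boils down to an inputs.conf stanza along these lines. Parameter names may differ slightly between add-on versions, and the account, container, and index names are placeholders; in practice you would configure this through the add-on UI:

    # inputs.conf sketch for the Splunk Add-on for Microsoft Cloud Services.
    # "account" refers to a storage account connection defined in the add-on setup.
    [mscs_storage_blob://aks_audit_blob]
    account = my_storage_account_connection
    container_name = insights-logs-kube-audit
    interval = 300
    sourcetype = mscs:storage:blob:json
    index = aks_logs
    # 300s polling; see step 4 below for a workaround if this input stalls.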

3- You set up a policy to delete data older than 7 days from your Storage Account. The retention can be adjusted to your preference, but the Storage Account acts mostly as a buffer here, so the cost stays under control. Also, regarding REST API billing, I honestly didn't see much of a difference.
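
The cleanup can be an Azure Storage lifecycle management rule. A sketch (rule name and container prefix are placeholders), saved as policy.json and applied with: az storage account management-policy create --account-name my-logs-storage-account --resource-group my-rg --policy @policy.json

    {
      "rules": [
        {
          "enabled": true,
          "name": "delete-aks-logs-after-7d",
          "type": "Lifecycle",
          "definition": {
            "actions": {
              "baseBlob": { "delete": { "daysAfterModificationGreaterThan": 7 } }
            },
            "filters": {
              "blobTypes": [ "blockBlob" ],
              "prefixMatch": [ "insights-logs-kube-audit" ]
            }
          }
        }
      ]
    }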

4- The Microsoft TA modular input seems to have a bug: scheduled every 5 minutes, it stopped working after several hours. As a workaround, I downloaded an app that provides an SPL command to reload a given endpoint, and embedded it in a scheduled search that runs every 5 minutes, while keeping the modular input interval at one hour. This way it is the scheduled report that triggers the data download. The schedule interval needs to be longer than the time it takes to download the data from the Storage Account and parse it.
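
As a sketch, the scheduled-search side of the workaround looks like this in savedsearches.conf. The "reloadinput" command is a placeholder for whatever SPL reload command the app you pick provides (the app isn't named here), and the endpoint path must match your modular input stanza:

    # savedsearches.conf sketch: trigger the input reload every 5 minutes.
    # "reloadinput" stands in for the reload command of the app you install.
    [Reload MSCS storage blob input]
    enableSched = 1
    cron_schedule = */5 * * * *
    search = | reloadinput endpoint="data/inputs/mscs_storage_blob/aks_audit_blob"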

5- Once you have downloaded the data, you have to parse it, removing the unwanted parts. Unfortunately it is a JSON nested inside another JSON, and you need the nested one. I did this for AKS audit logs, but it can probably be adjusted easily for other log types.
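
One way to unwrap the envelope at index time is an INGEST_EVAL transform (Splunk 8.1+). A sketch, assuming the inner audit record sits in properties.log of the outer JSON and the sourcetype from step 2 (check your actual events; the field path may differ):

    # props.conf sketch: attach the ingest-time transform to the blob sourcetype.
    [mscs:storage:blob:json]
    TRANSFORMS-unwrap_aks_audit = unwrap_aks_audit

    # transforms.conf sketch: replace _raw with the nested JSON string.
    [unwrap_aks_audit]
    INGEST_EVAL = _raw=json_extract(_raw, "properties.log")

At search time, the rough equivalent would be | spath input=properties.log on the raw envelope.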

As soon as I have some time I will provide the config as well.

Best Regards,

Edoardo
