I have a Databricks setup in AWS that runs multiple ETL pipelines. I want to send logs, metrics, application flow tracking, etc. to Splunk, but I'm not sure how this can be achieved. I have my organisation's Splunk setup, where I can generate an auth token and see the endpoint details. Is that enough to push data from Databricks to Splunk, or do I need something like an OpenTelemetry collector that reads the data stored in Databricks at /some/location and pushes it to Splunk?
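For context, here is the kind of direct push I'm imagining, assuming the token my org gives me is a Splunk HTTP Event Collector (HEC) token. The endpoint URL, token value, `sourcetype`, and the `send_to_splunk` helper below are all placeholders I made up for illustration, not anything from my actual setup:

```python
import json
import requests

# Placeholder values: replace with your org's actual HEC endpoint and token.
SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"
SPLUNK_HEC_TOKEN = "<your-hec-token>"

def send_to_splunk(event: dict, sourcetype: str = "databricks:etl") -> None:
    """Post a single event to Splunk's HTTP Event Collector."""
    payload = {
        "event": event,            # the log/metric body
        "sourcetype": sourcetype,  # tells Splunk how to parse the event
        "source": "databricks",
    }
    resp = requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        data=json.dumps(payload),
        timeout=10,
    )
    resp.raise_for_status()  # fail loudly if Splunk rejects the event

# Example: push a pipeline status event from a notebook or job.
send_to_splunk({"pipeline": "daily_orders_etl", "status": "succeeded", "rows": 12345})
```

Would calling something like this from my notebooks/jobs be enough, or is a collector-based setup the recommended approach?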