I have set up TA-ms-loganalytics on my Splunk Enterprise instance and configured the inputs. I set start_date to 08/04/2020 00:00:00 in my input configuration. Current data (13/07/2020) is flowing in fine, but the count is very low or zero for dates in the past month. I have validated that the events/data are present in Azure for those dates. Below is my inputs.conf:
[log_analytics://SourceLogs1_Backlog]
application_id = XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
application_key = **************************
event_delay_lag_time = 15
index = myindex
sourcetype = mysourcetype
interval = 300
log_analytics_query = AuditLogs | where ResourceGroup != ""
resource_group = AAAA-BBB-CC
start_date = 08/04/2020 00:00:00
subscription_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
tenant_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
workspace_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
disabled = 0
[log_analytics://SourceLogs2_Backlog]
application_id = XXXXXXXXXXXXXXXXXXXXXXXXXXX
application_key = ***************************************
event_delay_lag_time = 15
index = myindex
sourcetype = mysourcetype
interval = 300
log_analytics_query = AzureDiagnostics | where ResourceGroup != ""
resource_group = AAAA-BBB-CC
start_date = 08/04/2020 00:00:00
subscription_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
tenant_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
workspace_id = XXXXXXX-XXXXXX-XXXXX-XXXX-XXXXX
disabled = 0
Hi @jkat54.
I used the following direct API call and was able to pull the data in JSON format:
https://api.loganalytics.io/v1/workspaces/{{workspace_id}}/query?query=AzureDiagnostics | where ResourceGroup != "" | where TimeGenerated between(datetime("2020-04-08 00:00:00") .. datetime("2020-04-12 23:59:59"))
😄
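For reference, the same call can be scripted. This is only a sketch of building the request URL: the KQL has to be URL-encoded before it goes on the query string (pasting it raw into a browser works only because the browser encodes the spaces, pipes, and quotes for you). The `build_query_url` helper and the workspace ID are hypothetical; authentication (an Azure AD bearer token) is deliberately not shown.

```python
from urllib.parse import urlencode

def build_query_url(workspace_id: str, kql: str) -> str:
    """Build the Log Analytics REST query URL with the KQL URL-encoded."""
    base = "https://api.loganalytics.io/v1/workspaces/{}/query".format(workspace_id)
    return base + "?" + urlencode({"query": kql})

# The same backlog query used in the direct API call above.
kql = ('AzureDiagnostics | where ResourceGroup != "" '
       '| where TimeGenerated between(datetime("2020-04-08 00:00:00") '
       '.. datetime("2020-04-12 23:59:59"))')

url = build_query_url("my-workspace-id", kql)  # placeholder workspace ID
print(url)
```

The resulting URL can then be fetched with any HTTP client, sending an `Authorization: Bearer <token>` header.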
That would add up to a lot of manual effort, as the log format will not be the same. Since we are using this add-on, could you suggest any code changes that could be made to get the missing data? We have another setup of the same add-on on another machine; this current setup was created for backlog recovery.
I'm afraid I'm at the end of the rope for ideas.
I don't have an environment to test this in anymore, and I can't just log into your machine to debug without contracts etc.
I wish I could do more for you, sir; apologies that I can't.
Hi @jkat54 ,
Many thanks for all the information above and your much-appreciated support.
We are trying to update your add-on, but we need your help: would it be possible to fetch back data via the log_analytics_query in inputs.conf with a time range, as below? Using this query I can fetch the April 2020 backlog data by direct API call.
log_analytics_query=AzureDiagnostics | where ResourceGroup != "" | where TimeGenerated between(datetime("2020-04-09 00:00:00")..datetime("2020-04-14 23:59:59"))
Also, could you point out which Python files inside your add-on drive the data pull based on start_date, so that we can debug that code ourselves?
Is there any Splunk PS support available, as it's quite urgent?
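I can't name the exact files without the add-on source in front of me, but most Splunk modular inputs that take a start_date follow the same checkpoint pattern, sketched below. All names here are illustrative, not TA-ms-loganalytics's actual code: the input stores the last successfully indexed timestamp (often in the KV store) and each poll queries from that checkpoint forward, falling back to start_date only when no checkpoint exists yet.

```python
from datetime import datetime, timedelta

def next_query_window(start_date, checkpoint, now, lag_minutes=15):
    """Return (window_start, window_end) for the next poll.

    start_date   -- configured backfill start (datetime)
    checkpoint   -- last indexed timestamp from the KV store, or None on first run
    now          -- current time (datetime)
    lag_minutes  -- event_delay_lag_time: stay this far behind 'now'
    """
    # start_date is only consulted when there is no checkpoint yet;
    # once a checkpoint exists, it always wins.
    window_start = checkpoint if checkpoint is not None else start_date
    window_end = now - timedelta(minutes=lag_minutes)
    return window_start, window_end
```

If this is roughly what the add-on does, it would explain the symptom: once a checkpoint from an earlier run has advanced to "now", changing start_date in inputs.conf has no effect, because the historical window is never revisited until the checkpoint is reset.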
Hi @jkat54 ,
Can you please let me know whether the above API query could be used in this add-on to pull these 7 days of backlog data, given that the direct query returns results?
Hi @jkat54 ,
We configured the query below in the inputs.conf file and were able to recover some data from April 9th to April 14th, but it only pulled the very first time and has not pulled at all since.
Is there any configuration that needs updating, or an update needed on the KV store side?
log_analytics_query = ContainerLog | where _ResourceId != "" | where TimeGenerated between(datetime("2020-04-09 00:00:00") .. datetime("2020-04-14 23:59:59"))
start_date = 01/01/1970 00:00:00
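The one-time pull is consistent with the checkpoint behaviour: if the add-on appends its own checkpoint-based TimeGenerated filter on top of the query, then after the first successful run the checkpoint sits past April 14th, and the intersection of the two filters is empty forever after. The toy illustration below (not the add-on's actual code) shows that intersection; if the checkpoint does live in the KV store, resetting that record — or cloning the input under a new stanza name, which starts with a fresh checkpoint — should let the fixed window be read again, but please verify this against the add-on's source.

```python
from datetime import datetime

# Hard-coded range from the log_analytics_query above.
fixed_start = datetime(2020, 4, 9)
fixed_end = datetime(2020, 4, 14, 23, 59, 59)

def effective_window(checkpoint):
    """Intersect the hard-coded range with 'TimeGenerated > checkpoint'.

    Returns (lo, hi), or None when the intersection is empty (no results).
    """
    lo = max(fixed_start, checkpoint)
    return (lo, fixed_end) if lo < fixed_end else None

# First run: checkpoint starts at the epoch start_date, full range is read.
print(effective_window(datetime(1970, 1, 1)))
# Every later run: the checkpoint has advanced past the fixed range.
print(effective_window(datetime(2020, 7, 13)))  # empty intersection
```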