Building for the Splunk Platform

Why does TA-ms-loganalytics stop sending data?

jaihingorani
Explorer

Hi,

Hope someone can help here.

TA-ms-loganalytics has suddenly stopped working. I can see errors like the ones below being logged about the modular inputs:

ERROR ModularInputs - Unable to initialize modular input "log_analytics" defined inside the app "TA-ms-loganalytics": Introspecting scheme=log_analytics: script running failed (killed by signal 9: Killed).

raise ConnectionError(e, request=request)\nConnectionError: HTTPSConnectionPool(host='127.0.0.1', port=8089): Max retries exceeded with url: /servicesNS/nobody/TA-ms-loganalytics/data/inputs/log_analytics?count=0&output_mode=json (Caused by NewConnectionError('<solnlib.packages.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f118318e610>: Failed to establish a new connection: [Errno 111] Connection refused',))\n"
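For context on the traceback above: Errno 111 (connection refused) against 127.0.0.1:8089 means nothing was answering on the local splunkd management port. A quick sanity check, assuming the default port, is something like:

```shell
# Check whether splunkd's management port (default 8089) is listening
# locally; -k skips verification of the local self-signed certificate.
# A refused connection here matches the Errno 111 in the traceback.
out=$(curl -sk https://127.0.0.1:8089/services/server/info \
  || echo "splunkd management port not reachable")
echo "$out"
```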

any help much appreciated.


jaihingorani
Explorer

Thank you @jkat54 for responding. We checked with our network teams and there were no changes related to firewalls. We have restarted the Splunk services and we don't see those connection errors anymore; it is now logging: "ERROR ModularInputs - Unable to initialize modular input "log_analytics" defined inside the app "TA-ms-loganalytics": Introspecting scheme=log_analytics: script running failed (killed by signal 9: Killed)."


jkat54
SplunkTrust

It says it's not able to establish a connection. Did firewall rules change? Can you curl the Azure API from the host?


jaihingorani
Explorer

Hello again @jkat54 

It turned out to be an issue at the OS end, as the CLI was responding slowly as well. The logs started coming in after the server reboot, but a few of the sources are showing errors like the one below:

2022-04-12 13:22:23,902 ERROR pid=26000 tid=MainThread file=base_modinput.py:log_error:307 | OMSInputName="My_Diagnostics" status="400" step="Post Query" response="{"error":{"message":"Response size too large","code":"ResponseSizeError","correlationId":"<myID>","innererror":{"code":"ResponseSizeError","message":"Maximum response size of 100000000 bytes exceeded. Actual response Size is 107148567 bytes."}}}"

 

Can you suggest whether this is something we can tweak the values for?


jkat54
SplunkTrust

You can reduce the amount of data the query returns by adjusting the query.


jaihingorani
Explorer

Thank you @jkat54, we have been considering this option, but it turns out that our internal teams require all the data that was being pulled. Is there any way we can increase this limit instead?


jkat54
SplunkTrust

I'm not super familiar with Log Analytics queries.

Can you limit the time frame of the query?

In Splunk you have earliest, latest, etc. Can you grab half the data with one query and the other half with another?
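One way to sketch that split, assuming a KQL query over a fixed window (the `AzureDiagnostics` table name is illustrative, and GNU `date` is assumed for the `-d` option):

```shell
# Split the last 10 minutes into two 5-minute KQL windows using
# TimeGenerated filters, so each half of the data is fetched by a
# separate query. AzureDiagnostics is a placeholder table name.
end=$(date -u +%Y-%m-%dT%H:%M:%SZ)
mid=$(date -u -d '5 minutes ago' +%Y-%m-%dT%H:%M:%SZ)
start=$(date -u -d '10 minutes ago' +%Y-%m-%dT%H:%M:%SZ)

q1="AzureDiagnostics | where TimeGenerated >= datetime(${start}) and TimeGenerated < datetime(${mid})"
q2="AzureDiagnostics | where TimeGenerated >= datetime(${mid}) and TimeGenerated < datetime(${end})"

echo "$q1"
echo "$q2"
```

Each half only stays under the response cap if the underlying data rate allows it; the same idea extends to quarters and so on.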


jaihingorani
Explorer

But the inputs are running at a 10-minute interval, so setting the timestamp in the query will not work; if we do so, it will pull the same timeframe of data every 10 minutes. Is there any way to increase the response size in the TA-ms-loganalytics add-on?


jkat54
SplunkTrust

Does the query pull all the data every 10 minutes, instead of getting data for the last 10 minutes?

If you run it every 5 minutes, will that help?

Please note the size limitation is not from my app; it's the limit on the Azure API. You have to reduce the results your query returns or it will always hit this error.

You should appeal to Azure support if you can't figure out how to reduce the size of your query and can't increase the limit on the API.

There's literally nothing I can do on my end to fix this issue.

 


jaihingorani
Explorer

Thanks @jkat54, and understood that it's the API limit.

The interval has been changed to 5 minutes, and it's still the same. Should the value of event_delay_lag_time = 15 also be reduced?


jkat54
SplunkTrust

Hear me out...

If you run a query that generates more events than the limit, it will not matter how often, or even when, you run that query.

So what you have to do is change your query or increase the limit.


jaihingorani
Explorer

Thanks again, we will explore the possibilities of changing the query.


jaihingorani
Explorer
"Curl the Azure API" -> can you give an example of how I can check this?
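For reference, a minimal curl call against the Log Analytics query API might look like the sketch below. WORKSPACE_ID and TOKEN are placeholders you would fill in (the bearer token comes from an Azure AD OAuth flow), and `Heartbeat | take 1` is just a cheap test query:

```shell
# Placeholders: substitute your real workspace ID and a bearer token
# obtained from Azure AD. The endpoint is the public Log Analytics
# query API; '|| echo' keeps the check from aborting on network failure.
WORKSPACE_ID="00000000-0000-0000-0000-000000000000"
TOKEN="<bearer-token>"
URL="https://api.loganalytics.io/v1/workspaces/${WORKSPACE_ID}/query"

curl -s -X POST "$URL" \
  -H "Authorization: Bearer ${TOKEN}" \
  -H "Content-Type: application/json" \
  -d '{"query": "Heartbeat | take 1"}' \
  || echo "could not reach api.loganalytics.io"
```

A valid token returns JSON results; an HTTP 401/403 response still proves the host is reachable, while a connection error points back at network or firewall issues.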
