All Apps and Add-ons

Cisco ACI Add-on for Splunk Enterprise: Error with collect.py health

surekhasplunk
Communicator

05-15-2020 09:16:00.244 +0200 ERROR ExecProcessor - message from "python .......app/collect.py -health fvTenant fvAp fvEPg fvAEPg fvBD vzFilter vzEntry vzBrCP fvCtx l3extOut fabricNode" Response too big. Need to collect it in pages. Starting collection...

Why am I getting this error?

Below is the configuration in my inputs.conf:

[script://...bin/collect.py -health fvTenant fvAp fvEPg fvAEPg fvBD vzFilter vzEntry vzBrCP fvCtx l3extOut fabricNode]
disabled = 0
sourcetype = cisco:apic:health
index = cisco-aci
interval = 21600


PavelP
Motivator

Hello @surekhasplunk

This is a confirmed Cisco issue ( https://quickview.cloudapps.cisco.com/quickview/bug/CSCvc32906 ), and according to the source ( https://github.com/datacenter/acitoolkit/blob/master/acitoolkit/acisession.py ) the script falls back to fetching the data in pages. Can you check whether the collected data is complete or has missing entries? As it stands, this message should be categorized as a warning/notice/informational message rather than an error.

        elif resp.status_code == 400 and 'Unable to process the query, result dataset is too big' in resp.text:
            # Response is too big so we will need to get the response in pages
            # Get the first chunk of entries
            log.error('Response too big. Need to collect it in pages. Starting collection...')
            page_number = 0
            log.debug('Getting first page')
            cookies = self._prep_x509_header('GET', url + '&page=%s&page-size=10000' % page_number)
            resp = self.session.get(get_url + '&page=%s&page-size=10000' % page_number,
                                    timeout=timeout, verify=self.verify_ssl, proxies=self._proxies, cookies=cookies)
            entries = []
            if resp.ok:
                entries += resp.json()['imdata']
                orig_total_count = int(resp.json()['totalCount'])
                total_count = orig_total_count - 10000
                while total_count > 0 and resp.ok:
                    page_number += 1
                    log.debug('Getting page %s', page_number)
                    # Get the next chunk
                    cookies = self._prep_x509_header('GET', url + '&page=%s&page-size=10000' % page_number)
                    resp = self.session.get(get_url + '&page=%s&page-size=10000' % page_number,
                                            timeout=timeout, verify=self.verify_ssl,
                                            proxies=self._proxies, cookies=cookies)
                    if resp.ok:
                        entries += resp.json()['imdata']
                        total_count -= 10000
                resp_content = {'imdata': entries,
                                'totalCount': orig_total_count}
                resp._content = json.dumps(resp_content).encode('ascii')
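To check whether the fallback collected everything, you can compare the number of entries gathered against the `totalCount` reported by the APIC. Below is a minimal sketch of the same paging loop in isolation; `fake_fetch` is a hypothetical stand-in for the authenticated `session.get` call, and the 10000 page size matches the value hard-coded in the excerpt above:

```python
def collect_paged(fetch_page, page_size=10000):
    """Walk pages until all 'imdata' entries are collected.

    fetch_page(page_number, page_size) must return a dict shaped like an
    APIC response: {'totalCount': '<n>', 'imdata': [...]}.
    """
    entries = []
    page = 0
    first = fetch_page(page, page_size)
    total = int(first['totalCount'])          # overall object count from page 0
    entries += first['imdata']
    while len(entries) < total:               # keep paging until complete
        page += 1
        entries += fetch_page(page, page_size)['imdata']
    return entries, total

# Simulated APIC serving 25 objects in pages (stand-in for session.get)
def fake_fetch(page, size):
    data = [{'obj': i} for i in range(25)]
    return {'totalCount': str(len(data)),
            'imdata': data[page * size:(page + 1) * size]}

entries, total = collect_paged(fake_fetch, page_size=10)
assert len(entries) == total   # collection is complete despite paging
```

If `len(entries)` matches `totalCount` in your environment, the data is complete and the log line is cosmetic rather than a real failure.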