
Cisco ACI Add-on for Splunk Enterprise: Error with collect.py health

surekhasplunk
Communicator

05-15-2020 09:16:00.244 +0200 ERROR ExecProcessor - message from "python .......app/collect.py -health fvTenant fvAp fvEPg fvAEPg fvBD vzFilter vzEntry vzBrCP fvCtx l3extOut fabricNode" Response too big. Need to collect it in pages. Starting collection...

Why am I getting this error?

Below is the configuration in my inputs.conf

[script://...bin/collect.py -health fvTenant fvAp fvEPg fvAEPg fvBD vzFilter vzEntry vzBrCP fvCtx l3extOut fabricNode]
disabled = 0
sourcetype = cisco:apic:health
index = cisco-aci
interval = 21600


PavelP
Motivator

Hello @surekhasplunk

this is a confirmed Cisco issue ( https://quickview.cloudapps.cisco.com/quickview/bug/CSCvc32906 ), and according to the source ( https://github.com/datacenter/acitoolkit/blob/master/acitoolkit/acisession.py ) the script falls back to collecting the data in pages. Can you check whether the collected data is complete or has gaps? As written, this message should not be categorized as an error but rather as a warning/notice/informational message.

        elif resp.status_code == 400 and 'Unable to process the query, result dataset is too big' in resp.text:
            # Response is too big so we will need to get the response in pages
            # Get the first chunk of entries
            log.error('Response too big. Need to collect it in pages. Starting collection...')
            page_number = 0
            log.debug('Getting first page')
            cookies = self._prep_x509_header('GET', url + '&page=%s&page-size=10000' % page_number)
            resp = self.session.get(get_url + '&page=%s&page-size=10000' % page_number,
                                    timeout=timeout, verify=self.verify_ssl, proxies=self._proxies, cookies=cookies)
            entries = []
            if resp.ok:
                entries += resp.json()['imdata']
                orig_total_count = int(resp.json()['totalCount'])
                total_count = orig_total_count - 10000
                while total_count > 0 and resp.ok:
                    page_number += 1
                    log.debug('Getting page %s', page_number)
                    # Get the next chunk
                    cookies = self._prep_x509_header('GET', url + '&page=%s&page-size=10000' % page_number)
                    resp = self.session.get(get_url + '&page=%s&page-size=10000' % page_number,
                                            timeout=timeout, verify=self.verify_ssl,
                                            proxies=self._proxies, cookies=cookies)
                    if resp.ok:
                        entries += resp.json()['imdata']
                        total_count -= 10000
                resp_content = {'imdata': entries,
                                'totalCount': orig_total_count}
                resp._content = json.dumps(resp_content).encode('ascii')
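The fallback above boils down to: read `totalCount` from the first page, then keep requesting pages of 10000 entries and appending their `imdata` lists until the count is exhausted. A minimal standalone sketch of that accumulation logic (the `fetch_page` callable is hypothetical, standing in for the HTTP GET against the APIC; it is not part of acitoolkit):

```python
def collect_pages(fetch_page, page_size=10000):
    """Accumulate 'imdata' entries across pages until totalCount is reached.

    fetch_page(page_number) must return a dict shaped like an APIC JSON
    response: {'totalCount': '<int as string>', 'imdata': [...]}.
    """
    # First page gives us the total number of entries to expect.
    first = fetch_page(0)
    entries = list(first['imdata'])
    total_count = int(first['totalCount'])

    # Keep fetching pages while entries remain beyond what we have collected.
    remaining = total_count - page_size
    page_number = 0
    while remaining > 0:
        page_number += 1
        resp = fetch_page(page_number)
        entries += resp['imdata']
        remaining -= page_size

    # Reassemble a single response body, as the library excerpt does.
    return {'imdata': entries, 'totalCount': total_count}
```

For example, with `page_size=2` and five entries spread over three pages, the helper stitches them back into one list of five; the library does the same with 10000-entry pages, which is why the "Response too big" message is informational rather than a sign of data loss.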