Have you been able to root cause this issue? I have come across a similar one. When using the Python SDK with jobs.export and a BufferedReader (reader = results.ResultsReader(io.BufferedReader(search_results))), on some occasions I get the following exception:

Traceback (most recent call last):
  File ".../splunk_event_editor.py", line 747, in search_and_modify
    self._get_field_types_from_splunk(search_query, sampling=sampling, no_change_stop=2000)
  File ".../ams/splunk_event_editor.py", line 434, in _get_field_types_from_splunk
    for item in reader:
  File ".../python3.7/site-packages/splunklib/results.py", line 210, in next
    return next(self._gen)
  File ".../python3.7/site-packages/splunklib/results.py", line 219, in _parse_results
    for event, elem in et.iterparse(stream, events=('start', 'end')):
  File ".../python3.7/xml/etree/ElementTree.py", line 1222, in iterator
    yield from pullparser.read_events()
  File ".../python3.7/xml/etree/ElementTree.py", line 1297, in read_events
    raise event
  File ".../python3.7/xml/etree/ElementTree.py", line 1269, in feed
    self._parser.feed(data)
xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 51128, column 3080

The same code and query usually work when retried a moment later, so I suspect it may be related to new events matching the search query still arriving (via HTTP Event Collector) while the export is running.
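For reference, here is a minimal sketch of the pattern described above, with a simple retry wrapped around the intermittent ParseError. The connection details, query, retry counts, and the export_with_retry helper are placeholders of my own (assumptions), not taken from the original code; note also that retrying after a partial read will re-emit events that were already yielded.

import io
import time
import xml.etree.ElementTree as et

import splunklib.client as client
import splunklib.results as results


def export_with_retry(service, search_query, max_attempts=3, delay=5):
    # Run an export search and yield results, rerunning the export if the
    # XML stream turns out to be malformed (the transient ParseError above).
    for attempt in range(1, max_attempts + 1):
        try:
            search_results = service.jobs.export(search_query)
            reader = results.ResultsReader(io.BufferedReader(search_results))
            for item in reader:
                yield item
            return  # finished cleanly
        except et.ParseError:
            if attempt == max_attempts:
                raise
            time.sleep(delay)  # back off and rerun the whole export


# Hypothetical connection details, for illustration only.
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")
for item in export_with_retry(service, "search index=main | head 100"):
    if isinstance(item, dict):  # skip results.Message diagnostics
        print(item)

If re-emitting duplicates is a problem, bounding the search time range so events still arriving over HEC fall outside the window may be a safer workaround than retrying.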