Getting Data In

"Failed to find newline while reading transport header" when querying API

dedupper
Explorer

Hello,

I have a problem with a custom app in Splunk. I've written a simple app that uses the Python requests library to query the Microsoft Graph API. It works perfectly for most queries, but when I try to use it to fetch all users in our AAD tenant, it throws an error:

ERROR ChunkedExternProcessor [111784 phase_1] - Failed to find newline while reading transport header.

This always happens at the same page (I have to use pagination, since the API returns 100 items per response). I've looked at that page, and the one after it, but nothing special caught my eye.

This seems to be a Splunk-specific issue: I can use the requests library to fetch all the results and the json library to dump them with no problems, but when I combine them with splunklib and yield the results as rows from the search command, I get the error above. The logs (with debug mode on) don't seem to have any other clues.
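For context, the pagination side of the command boils down to a small helper like the one below. This is only a sketch: it assumes the standard Graph response shape ("value" holds the items, "@odata.nextLink" points to the next page), and `fetch` stands in for something like `lambda url: requests.get(url, headers=auth).json()`.

```python
def iter_pages(fetch, url):
    """Yield every item from a paginated Graph-style API.

    fetch(url) must return the parsed JSON body as a dict.
    Follows "@odata.nextLink" until the API stops returning one.
    """
    while url:
        page = fetch(url)
        # Each page carries its items under "value".
        yield from page.get("value", [])
        # Absent on the last page, which ends the loop.
        url = page.get("@odata.nextLink")
```

Inside a splunklib GeneratingCommand you would then `yield` one dict per item from this generator; the pagination itself is ordinary Python and works fine outside Splunk, which is what made the error so confusing.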

Could this be an encoding issue - could the results have some special characters that throw the Python code off somehow?

Any help is greatly appreciated!

1 Solution

dedupper
Explorer

I figured out what the problem was: the default 50,000-row result limit in Splunk. Trying to yield more rows than that from a custom search command results in this baffling error message instead of anything that mentions a limit.
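For anyone who lands here later: I believe the setting in question is `maxresultrows` in limits.conf (default 50000). A sketch of the override, assuming raising the cap is acceptable in your deployment:

```
# $SPLUNK_HOME/etc/apps/<your_app>/local/limits.conf
[searchresults]
# Default is 50000; raise it above the number of rows your command emits.
maxresultrows = 200000
```

A restart (or at least a config reload) is needed for the change to take effect, and raising this limit increases memory use for large searches, so bump it with care.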
