Splunk Search

corrupted CSV results / Python API

tomasv
Explorer

Hi,

I'm using the Python SDK to export some search results to a CSV file, but the results seem to be somehow corrupted unless I specify an exact list of fields. So for example this:

index=myindex field=pattern

produces corrupted results but this

index=myindex field=pattern | table field_a, field_b

works just fine. The corruption looks a bit like randomly scattered newline characters: a line suddenly ends in the middle of a field, part of the original line is missing, and the remainder continues on the next line (or something like that; I'm not sure what the exact corruption is).

The data do not contain newline characters themselves; when exporting to JSON everything is peachy. There are also no mixed Windows/Unix line endings (I'm on Linux).
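For reference, here's a minimal sketch of the kind of export I mean (host, credentials, and file names below are placeholders, not my actual setup):

# Minimal sketch using the Splunk Python SDK (splunklib).
# Host, port, username, password, and the output file name are placeholders.
import splunklib.client as client

service = client.connect(
    host="localhost",      # placeholder
    port=8089,
    username="admin",      # placeholder
    password="changeme",   # placeholder
)

# Run an export search and ask for CSV output directly.
stream = service.jobs.export(
    "search index=myindex field=pattern",
    output_mode="csv",
)

# Write the raw response stream to disk.
with open("results.csv", "wb") as f:
    for chunk in iter(lambda: stream.read(8192), b""):
        f.write(chunk)

Requesting output_mode="json" instead of "csv" in the same call is what produces the clean output mentioned above.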

wcolgate_splunk
Splunk Employee

Question: I've seen the JSON output, and it appears to have "nice-afied" the "\n" newlines. Could you capture a CSV and XML portion and send them to devinfo@splunk.com?

Thanks,

Wim
