<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Can all fields be outputted with outputcsv in double quotes? in Reporting</title>
    <link>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331270#M5962</link>
    <description>&lt;P&gt;I was stuck on this issue many years ago, and after exhausting efforts to get splunk to understand that some legacy systems are rigid and only expect things one way.&lt;/P&gt;

&lt;P&gt;The easiest reliable solution I found was a Python script that wraps every field in double quotes. Because my data set was huge and the aggregate contained all kinds of characters, it was particularly hard to get a regex or pattern-based solution to work, and I wouldn't trust one anyway. The job took roughly 30 minutes per run across millions of records and external lookups.&lt;/P&gt;

&lt;P&gt;This was the only way I found to get a reliable, consistent CSV export. I scheduled it in cron to run daily and put the -24h time window in the query itself.&lt;/P&gt;

&lt;P&gt;I don't get why this isn't a default option, but at least it forces you away from the legacy structure.&lt;/P&gt;

&lt;P&gt;You have two ways to get it done:&lt;BR /&gt;
The first is to use the requests library to fetch the search results over the REST API and then parse through them. This is easier, but will probably take a long time for big jobs. &lt;A href="https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html"&gt;https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;The other is to use Splunk's Python SDK to create search jobs and stream the results, processing them as they arrive.  &lt;A href="http://dev.splunk.com/python"&gt;http://dev.splunk.com/python&lt;/A&gt;&lt;BR /&gt;
&lt;A href="https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html"&gt;https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Another option is to export JSON or XML and have a back-end script convert it to the format you need. This is a bit more hacky, but easier to mock up.&lt;/P&gt;</description>
    <pubDate>Mon, 23 Oct 2017 23:21:58 GMT</pubDate>
    <dc:creator>hasan300zx</dc:creator>
    <dc:date>2017-10-23T23:21:58Z</dc:date>
    <item>
      <title>Can all fields be outputted with outputcsv in double quotes?</title>
      <link>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331269#M5961</link>
      <description>&lt;P&gt;We are currently using the outputcsv command to generate a report for one of our support teams.  Overall it works great but they did have one request - currently only fields that have a special character in them get included in double quotes; all other fields are simply comma separated.  To make life easier on that time we'd like to accomplish one of two goals:  1) have all fields included in double quotes, or 2) be able to output in a structured format other than CSV such as PSV or tilde-separated (their choice, not mine).&lt;/P&gt;

&lt;P&gt;Our overall search is pretty standard - just a plain listing of output fields:  |table field1 field2 field3 ... field X&lt;/P&gt;

&lt;P&gt;Any help is greatly appreciated!&lt;/P&gt;</description>
      <pubDate>Mon, 23 Oct 2017 21:00:55 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331269#M5961</guid>
      <dc:creator>burras</dc:creator>
      <dc:date>2017-10-23T21:00:55Z</dc:date>
    </item>
    <item>
      <title>Re: Can all fields be outputted with outputcsv in double quotes?</title>
      <link>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331270#M5962</link>
      <description>&lt;P&gt;I was stuck on this issue many years ago, and after exhausting efforts to get splunk to understand that some legacy systems are rigid and only expect things one way.&lt;/P&gt;

&lt;P&gt;The easiest reliable solution I found was a Python script that wraps every field in double quotes. Because my data set was huge and the aggregate contained all kinds of characters, it was particularly hard to get a regex or pattern-based solution to work, and I wouldn't trust one anyway. The job took roughly 30 minutes per run across millions of records and external lookups.&lt;/P&gt;
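&lt;P&gt;A minimal sketch of such a script (file handling reduced to in-memory streams for the demo; the csv module's QUOTE_ALL mode does the quoting):&lt;/P&gt;

```python
import csv
import io

def quote_all_fields(src, dst):
    # Copy CSV rows from src to dst, wrapping every field in double quotes.
    writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
    for row in csv.reader(src):
        writer.writerow(row)

# Demo on an in-memory CSV; for real exports, pass open() file handles.
src = io.StringIO("a,b,c\n1,two words,3\n")
dst = io.StringIO()
quote_all_fields(src, dst)
print(dst.getvalue())
```

&lt;P&gt;Because it reads and writes row by row, memory stays flat even on exports with millions of records.&lt;/P&gt;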

&lt;P&gt;This was the only way I found to get a reliable, consistent CSV export. I scheduled it in cron to run daily and put the -24h time window in the query itself.&lt;/P&gt;

&lt;P&gt;I don't get why this isn't a default option, but at least it forces you away from the legacy structure.&lt;/P&gt;

&lt;P&gt;You have two ways to get it done:&lt;BR /&gt;
The first is to use the requests library to fetch the search results over the REST API and then parse through them. This is easier, but will probably take a long time for big jobs. &lt;A href="https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html"&gt;https://www.splunk.com/blog/2011/08/02/splunk-rest-api-is-easy-to-use.html&lt;/A&gt;&lt;/P&gt;
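&lt;P&gt;A rough sketch of the REST route, using only the standard library (the host, token, and search string below are placeholders; the requests library works the same way):&lt;/P&gt;

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_export_request(base_url, search, token):
    # POST to Splunk's export endpoint; output_mode=csv streams the
    # results back as CSV without keeping a finished job on the server.
    body = urlencode({"search": search, "output_mode": "csv"}).encode()
    # Token auth shown here; basic auth is another option.
    headers = {"Authorization": "Bearer " + token}
    return Request(base_url + "/services/search/jobs/export",
                   data=body, headers=headers)

req = build_export_request(
    "https://splunk.example.com:8089",          # placeholder host
    "search index=main | table field1 field2",  # placeholder search
    "YOUR_TOKEN",                               # placeholder token
)
# urllib.request.urlopen(req) would then yield the CSV stream,
# which you can pipe through the quoting script.
print(req.full_url)
```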

&lt;P&gt;The other is to use Splunk's Python SDK to create search jobs and stream the results, processing them as they arrive.  &lt;A href="http://dev.splunk.com/python"&gt;http://dev.splunk.com/python&lt;/A&gt;&lt;BR /&gt;
&lt;A href="https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html"&gt;https://www.splunk.com/blog/2013/09/15/exporting-large-results-sets-to-csv.html&lt;/A&gt;&lt;/P&gt;
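&lt;P&gt;A sketch of the SDK route (assumes the splunk-sdk package; the connection details and query are placeholders, and the quoted_row helper is my addition to tie the stream back to the quoting requirement):&lt;/P&gt;

```python
import csv
import io

def quoted_row(fields):
    # Render one result row with every field double-quoted.
    buf = io.StringIO()
    csv.writer(buf, quoting=csv.QUOTE_ALL).writerow(list(fields))
    return buf.getvalue().strip()

def export_search(query):
    # Requires the splunk-sdk package (pip install splunk-sdk);
    # host and credentials below are placeholders.
    import splunklib.client as client
    import splunklib.results as results
    service = client.connect(
        host="splunk.example.com", port=8089,
        username="admin", password="changeme")
    # export() streams results instead of materializing a finished job,
    # so large result sets never have to fit in memory at once.
    stream = service.jobs.export(query, output_mode="xml")
    for result in results.ResultsReader(stream):
        if isinstance(result, dict):
            print(quoted_row(result.values()))

print(quoted_row(["1", "two words", "3"]))
```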

&lt;P&gt;Another option is to export JSON or XML and have a back-end script convert it to the format you need. This is a bit more hacky, but easier to mock up.&lt;/P&gt;</description>
      <pubDate>Mon, 23 Oct 2017 23:21:58 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331270#M5962</guid>
      <dc:creator>hasan300zx</dc:creator>
      <dc:date>2017-10-23T23:21:58Z</dc:date>
    </item>
    <item>
      <title>Re: Can all fields be outputted with outputcsv in double quotes?</title>
      <link>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331271#M5963</link>
<description>&lt;P&gt;Thanks - knowing there weren't any options to do it directly in Splunk, I ended up cobbling together a gawk script that converts the output to a tilde-separated format.&lt;/P&gt;</description>
      <pubDate>Tue, 24 Oct 2017 19:35:10 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Reporting/Can-all-fields-be-outputted-with-outputcsv-in-double-quotes/m-p/331271#M5963</guid>
      <dc:creator>burras</dc:creator>
      <dc:date>2017-10-24T19:35:10Z</dc:date>
    </item>
  </channel>
</rss>

