Output XML via a custom search command

Splunk Employee

I'm working on a custom search command which will take the results of a search and create an XML output file. As a very simplified example, the search might look like this:

source=a OR source=b | fields host, source, some_field | outputxml

Within my search command, I read the results and aggregate them into nested Python dicts (e.g. source[type]['total'] += 1, source[type][value] += 1, and so on), and then attempt to write the results to a randomly named output file, where the XML would look something like:

  <source type="syslog" total="2">
    <some_field value="1" count="1"/>
    <some_field value="0" count="1"/>
  </source>
  <source type="dhcp" total="1">
    <some_field value="1" count="1"/>
  </source>
However, multiple output files are created, with the results spread among them. I suspect this is due to Splunk's map/reduce processing, with the command being invoked once per chunk of results; if so, it would make sense, and is actually rather cool to see in action.

Is my analysis correct? If so, what is the best practice for handling this merging of results into a single, highly structured output file where order matters?


Re: Output XML via a custom search command

Splunk Employee

I would suggest that it might be easier to get what you want by calling the Splunk API:

wget --no-check-certificate --user=admin --password=changeme -O - --post-data='search sourcetype%3Dmysourcetype | head 2&exec_mode=oneshot' https://localhost:8089/services/search/jobs

There is also generally no need for you to worry about map-reduce; Splunk takes care of that. (It's possible to write map-reducible search commands by declaring them streaming, but converting CSV to XML and then merging the pieces in the reduce step is not an operation that gains anything from what Splunk already does with the results.) So you can just worry about converting the CSV input to XML on a single node.
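If you do keep it as a custom command, the single-node behavior comes from not declaring the command streaming in commands.conf. A sketch of such a stanza (the command and script names here are hypothetical, and the exact attributes should be checked against the commands.conf spec for your Splunk version):

```
[outputxml]
filename = outputxml.py
streaming = false
retainsevents = true
```

With `streaming = false`, Splunk runs the script once over the fully merged result set instead of splitting the work across indexers, so a single invocation sees every row and can write one ordered XML file.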
