Reporting

Variable File Name in outputcsv

simonattardGO
Path Finder

Hi,

I need to run a scheduled search to export some logs at regular intervals. The search I am using is this:

outputtext usexml=false | rename _xml as raw | fields raw | fields - _* | outputcsv results.txt

The problem is that each time the search runs, results.txt gets overwritten. I would like to automatically append the date and time to the file name, e.g. results_3-2-12_12-00.txt.

Is this possible?

Thanks in advance.

1 Solution

Ayn
Legend

You can do this through some subsearch ugliness (or beauty, I guess it's in the eye of the beholder 🙂 )

Subsearches work much like backticks in most UNIX shells, i.e. they run first and then return their results to the outer query. You can put a subsearch anywhere in your search pipeline, including after outputcsv. By default, however, a subsearch returns a string formatted for use by the search command. You can change this behaviour by calling format (http://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Format) to make sure the formatting suits your purposes.

The idea here would be to create a dynamic value for the filename in the subsearch, then return that filename to outputcsv.

... | outputcsv [search * | head 1 | eval query="results_".strftime(now(),"%d_%m_%y_%H_%M_%S") | fields query | format "" "" "" "" "" ""]

I don't know your level of Splunk-foo so let me know if you want more explanation of the internal workings of the search. I used now() as the method for getting the date/time that should be used when naming the results file - you might want to use another time, but if the current time is OK, just use now().
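For context, the field in the subsearch is named query, so the subsearch returns its bare value rather than a query="..." pair, and the six empty strings passed to format suppress the parentheses and AND/OR separators that format would otherwise wrap around it. Combined with the export pipeline from the question, the full scheduled search might look roughly like this (a sketch - substitute your own base search):

<your base search> | outputtext usexml=false | rename _xml as raw | fields raw | fields - _* | outputcsv [search * | head 1 | eval query="results_".strftime(now(),"%d_%m_%y_%H_%M_%S").".txt" | fields query | format "" "" "" "" "" ""]

If the search runs at, say, 12:00:00 on 3 February 2012, this writes results_03_02_12_12_00_00.txt.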

Guarddogmonitor
Engager

Part of my snippet folder now! Really helpful, thanks.

Lowell
Super Champion

The macro approach. Here's yet another option: you can use an eval-based macro to return the current timestamp and drop that into your search string. This is especially helpful if you want to use this pattern in multiple places.

macros.conf

[timestamp]
definition = strftime(time(), "%Y%m%d_%H%M")
iseval = 1

Then your search would look like this:

<my search here> | outputcsv results_`timestamp`.csv
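For illustration, if the search actually runs at 2:15 PM on 3 February 2012 (an assumed time, not from the thread), the macro expands before the search executes and the command effectively becomes:

<my search here> | outputcsv results_20120203_1415.csv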

There's a subtle difference between these answers because of the use of time() vs now(), but for many cases it will not matter. Quoting steveyz,

time() is the wall clock time. now() is the "nominal" start time of the search. For example, the scheduler may run a search that is supposed to start at 2PM but really starts at 2:15PM; now() would still be 2PM.

See steveyz's answer on Can you use now() in eval based macros? for more details.
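A quick way to compare the two functions on your own instance (a small sketch, assuming a Splunk version that includes makeresults) is:

| makeresults | eval search_start=strftime(now(), "%Y%m%d_%H%M%S"), wall_clock=strftime(time(), "%Y%m%d_%H%M%S")

In an ad-hoc search the two values will usually match; the difference shows up in scheduled searches that start later than their nominal time.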

_d_
Splunk Employee
Splunk Employee

That's good, but here's a less ugly one 🙂

<my search here> | outputcsv [ | stats count | eval filename=strftime(now(), "results_%d_%m_%y_%H_%M_%S") | return $filename]

Enjoy it!

crt89
Communicator

Thank you ! I'll be using this one. 🙂

salbayrak
Engager

I also used this one, but replaced "| stats count" with "| makeresults"
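For reference, that variant of the subsearch would look like this (a sketch based on the answer above):

<my search here> | outputcsv [ | makeresults | eval filename=strftime(now(), "results_%d_%m_%y_%H_%M_%S") | return $filename]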

varad_joshi
Communicator

I used this one 😄

ShaneNewman
Motivator

I am trying to use this. It creates a file with the correct file name, but the file has no contents... Any ideas?

my command:
outputcsv [ search * | head 1 | eval query="All_lab_new_".strftime(now(),"%b_%d_%Y") | fields query | format "" "" "" "" "" ""]

When I run the search without the outputcsv command I get results...

sdwilkerson
Contributor

Ayn rocks, and this answer saved my butt. I'd give multiple up-votes if I could. Thanks.

Ayn
Legend

np. Could you please mark the answer as accepted? Thanks!

simonattardGO
Path Finder

Thank you very much! 🙂

Ayn
Legend

For one, you're taking a detour by using outputtext. Check this thread for some inspiration: http://splunk-base.splunk.com/answers/5757/export-raw-logs-from-splunk
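One way to skip outputtext entirely (a sketch, not taken from the linked thread - verify against your data) is to rename _raw directly and export that, combined with any of the dynamic-filename subsearches above:

source="10.70.22.80:10514" | rename _raw as raw | fields raw | fields - _* | outputcsv [ | stats count | eval filename=strftime(now(), "results_%d_%m_%y_%H_%M_%S") | return $filename]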

simonattardGO
Path Finder

Thanks for your help. How can I output the raw text file, without enclosing it in an xml field?

Ayn
Legend

Well, you're writing the raw data to the _xml field, so Splunk encloses that whole field in double quotes when it writes the CSV. That is standard behaviour.

simonattardGO
Path Finder

This is the command I am using:

source="10.70.22.80:10514"|outputtext usexml=false | rename xml as raw | fields raw | fields - _* | outputcsv [search * | head 1 | eval query="results".strftime(now(),"%d_%m_%y_%H_%M_%S").".txt" | fields query | format "" "" "" "" "" ""]

You see anything wrong?

Ayn
Legend

outputcsv uses double quotes to enclose some fields. It shouldn't be enclosing complete lines.

simonattardGO
Path Finder

Wonderful!! 🙂 Thanks a lot for that, it works very well.

The only issue I have is that when the file is written, each log line is enclosed in double quotes.

Do you know the reason for that?

Ayn
Legend

I don't think that's an actual search error (I'm getting it as well); it's just a message from the search assistant, which tries to help with the text you enter into the search field in some situations.

simonattardGO
Path Finder

Thanks a lot for your response, Ayn.
I tried your suggestion, but I am getting the following error:
This search cannot be parsed when parse_only is set to true

What is the reason for this error?
