Splunk Search

How do I generate many output CSV files from a single search?

fk319
Builder

I am reviewing the scheduled jobs on our Splunk system and I noticed that several people are running the same query many times and extracting something slightly different each time.


With each query taking 5-10 minutes in the off hours, I could save a lot of time by running the search only once. I can do this in a view, but I don't know how to do it in a search.


Any suggestions?

1 Solution

Sqig
Path Finder

Your subject mentions writing csv files, so I assume you really do want your data ultimately to come out of the Splunk system and go into several identical copies on your real filesystem.

I would run the one search and pipe it through a custom command that simply writes the data out to several output files.

Here are some samples to get you started in case you haven't worked with custom commands before. (This is off the top of my head, so beware the odd syntax problem. Also note that the Splunk documentation on this stuff is a bit convoluted and seems to have some problems.)

Once the steps below are done, run your search and add | script perl mydistrib to the end.
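For example (the index, sourcetype, and fields here are made up), a scheduled search might end up looking like:

index=web sourcetype=access_combined | stats count by status, host | script perl mydistrib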

To get set up:

Add an entry to your commands.conf file

[mydistrib] 
filename=distrib.pl
type=perl
retainsevents=yes
streaming=no
enableheader=false
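A note on where that stanza goes: the usual advice is to put it in a local commands.conf rather than edit anything under default, for example (paths are only examples; adjust for your install and app):

$SPLUNK_HOME/etc/system/local/commands.conf
$SPLUNK_HOME/etc/apps/search/local/commands.conf

Splunk layers local over default, so a local file overrides the shipped defaults and survives upgrades.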

Then in your /ops/splunk/etc/searchscripts directory, create a script named the same as "filename" above.

#!/usr/bin/perl

@outfiles = ("/path1/file1","/path2/file2","/path3/file3");
$main_out = "/path/to/primary/outfile";

open(OUTFILE,">$main_out") or die "Cannot open $main_out for writing\n";

# Copy everything Splunk sends via STDIN to a master output file
while (<>) {
   print OUTFILE "$_";
}
close OUTFILE;

# Now just duplicate the file.
foreach $target (@outfiles) {
   system("cp $main_out $target");
}
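If you prefer something a little more defensive, here is the same idea as a sketch (untested, paths are placeholders) that turns on strict/warnings and uses the core File::Copy module instead of shelling out to cp, which avoids surprises with paths that contain spaces:

#!/usr/bin/perl
use strict;
use warnings;
use File::Copy qw(copy);

# Destination paths are placeholders; replace them with your own.
my @outfiles = ("/path1/file1", "/path2/file2", "/path3/file3");
my $main_out = "/path/to/primary/outfile";

# Write everything Splunk sends on STDIN to the primary output file.
open(my $out, '>', $main_out) or die "Cannot open $main_out for writing: $!\n";
while (my $line = <STDIN>) {
    print $out $line;
}
close $out or warn "Error closing $main_out: $!\n";

# Duplicate the primary file to each additional destination.
foreach my $target (@outfiles) {
    copy($main_out, $target) or warn "Copy to $target failed: $!\n";
}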

gladiatorankit
Explorer

I fired the command in the search box but I am getting an error:
Error in 'script' command: Cannot find program 'mydistrib' or script 'mydistrib'.

I have copied distrib.pl into \splunk\etc\apps\search\scripts.

I have two conf files: one under \Splunk\etc\system\default and the other under \Splunk\etc\apps\search\default.


peppersprayy
New Member

index=(whatever you index) | convert ctime(_time) as timestamp | table timestamp name signature src spt dst dpt (or whatever fields you want to bring back) | sendmail to=youremailaddress@email.com server=(your mail server) sendresults=true inline=false graceful=true

The sendmail piece at the end is the main command here. Also, include | convert ctime(_time) as timestamp when you care about the timestamp: by default the time will not come out right when you output to CSV, hence the need for that command.
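If you would rather have the CSV land on disk than in your inbox, the same search can end in outputcsv instead of sendmail; a minimal sketch, with the index and fields made up:

index=(whatever you index) | convert ctime(_time) as timestamp | table timestamp name signature src dst | outputcsv my_report

outputcsv writes the file under $SPLUNK_HOME/var/run/splunk/csv on the search head.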
