
How to export/import events from indexes?

mataharry
Communicator

I want to move some events from one indexer to another, for a particular period of time.
I saw that there are importtool and exporttool commands in $SPLUNK_HOME/bin.
How do I use them?

1 Solution

yannK
Splunk Employee

How to selectively export/import data from one indexer to another.

Here is an example for the defaultdb index (the main index),
with $SPLUNK_HOME = /opt/splunk
and a time period from April 10th 2011 00:00 to April 11th 2011 00:00 GMT (1302393600 to 1302480000 in epoch time).
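
You can compute these epoch bounds directly on the command line (a quick check; GNU date assumed):

# convert the window bounds to epoch seconds (GNU date, UTC)
date -u -d '2011-04-10 00:00:00' +%s    # prints 1302393600
date -u -d '2011-04-11 00:00:00' +%s    # prints 1302480000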

1 - roll the hot buckets to warm on the initial indexer


cd /opt/splunk/bin
./splunk _internal call /data/indexes/defaultdb/roll-hot-buckets -auth admin:changeme

Specify the correct db (index) name and your admin credentials.

2 - identify the buckets containing data for your time period.

The dates in the filename are in UTC epoch time, in reverse order:
the filename format is db_<most_recent_event>_<oldest_event>_<bucket_unique_number>.
You can use http://www.epochconverter.com/ to check them.


example :
/opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29/
contains data for the period
from 1301920239 = Mon, 04 Apr 2011 12:30:39 GMT (oldest event)
to 1305913172 = Fri, 20 May 2011 17:39:32 GMT (most recent event)
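
To scan a whole index directory at once, a small shell loop can decode the epoch fields in every bucket name (a minimal sketch; GNU date and the naming convention above assumed):

# print each bucket of defaultdb with its event time range
for b in /opt/splunk/var/lib/splunk/defaultdb/db/db_*_*_*; do
  name=$(basename "$b")
  newest=$(echo "$name" | cut -d_ -f2)   # first epoch = most recent event
  oldest=$(echo "$name" | cut -d_ -f3)   # second epoch = oldest event
  echo "$name : $(date -u -d @"$oldest") -> $(date -u -d @"$newest")"
done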

3 - export the events for the index and the period you need


usage : exporttool <bucket_directory> <export_file> [-et <earliest_epoch>] [-lt <latest_epoch>] [-csv] [export_search]
example :
cd /opt/splunk/bin
./splunk cmd exporttool /opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29/ /myexportpath/export1.csv -et 1302393600 -lt 1302480000 -csv

If needed, you can also add a search string as the last parameter.
Check that the export file was created.
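
On the search parameter: exporttool matches indexed terms, so a bare keyword can be passed as-is, while indexed fields are usually written with the field::value syntax rather than field=value (this syntax note is an assumption based on Splunk's indexed-term notation; the sourcetype below is a placeholder):

cd /opt/splunk/bin
# keep only events whose indexed sourcetype matches (field::value syntax assumed)
./splunk cmd exporttool /opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29/ /myexportpath/export1.csv -et 1302393600 -lt 1302480000 -csv 'sourcetype::access_combined'
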
Repeat for each bucket containing data in the desired period, changing the export file name each time.
If you want to run the export over all the buckets, use a shell loop, as in the sketch below.
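
A minimal sketch of such a loop, assuming the same paths and time window as above (buckets outside the window will just report no events):

cd /opt/splunk/bin
# export every bucket of defaultdb for the same time window
for b in /opt/splunk/var/lib/splunk/defaultdb/db/db_*_*_*; do
  ./splunk cmd exporttool "$b" /myexportpath/export_$(basename "$b").csv -et 1302393600 -lt 1302480000 -csv
done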

4 - import each file into the new indexer, in the proper destination index


usage : importtool <bucket_directory> <import_file>

example :
cd /opt/splunk/bin
./splunk cmd importtool /opt/splunk/var/lib/splunk/defaultdb/db /myexportpath/export1.csv
"Successfully imported 71615 events into the bucket.
Please ensure this bucket resides in a valid index and restart Splunk to recognize the new events."

Restart to have Splunk detect the new data and recalculate the metadata.

example :
./splunk restart
....
Perform recovery now? [y/n] y
Recovering (across all data)...
bucket=opt/splunk/var/lib/splunk/defaultdb/db/db_1306285067_1305920377_54 count mismatch tsidx=2525 source-metadata=2524, repairing...
Done


exabeamer
New Member

Can someone describe the syntax for this:
"If needed, you can also add a search string as the last parameter."?

It looks like if I add 'some_string' at the end, it will filter based on that.

However, if I do 'sourcetype=some_source' it returns nothing.

Does this mean that I cannot use sourcetype to search, or is my syntax incorrect?


bchen
Splunk Employee

Great Post!

A couple of corrections for the import step (at least with 4.2.5):

  • add the bucket dir to the import line (as in the sketch below), thus:

    /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_0

  • after restart, I didn't get prompted; perhaps there's a new fsck that happens automatically (you'll see the recovery occur in splunkd.log)
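
A sketch of the corrected import line under that assumption (the hot_v1_0 bucket directory and export path follow the examples above):

cd /opt/splunk/bin
# point importtool at the specific bucket directory, not at the index's db directory
./splunk cmd importtool /opt/splunk/var/lib/splunk/defaultdb/db/hot_v1_0 /myexportpath/export1.csv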

timmy13
Communicator

Very helpful post. However, when I run the '_internal call ...' command, it does return data, but I find no epoch times listed in the "s:key name=" lines. Can you provide the specific line I'm looking for?

Thanks


splunker12er
Motivator

When I try to export, it gives me the following:

[root@test-machine]# /opt/splunk/bin/splunk cmd exporttool ../../db_1409651281_1409651235_37/ /export.csv -et 1409651235 -lt 1409651281 -csv
Using logging configuration at /opt/splunk/etc/log-cmdline.cfg.
no events

Why does this show "no events"? Events are actually present in this bucket.
I'm using Splunk v6.2.
