Getting Data In

With a heavy forwarder, how do I send the same data set to separate index clusters with different index names?

daniel333
Builder

All,

I have a data set that I need in indexclusterA as index=distil. However, I need that same data in indexclusterB as index=web. All of the data flows through a heavy forwarder.

Any idea how I would do this?

sduff_splunk
Splunk Employee

As an alternative to changing the index on the receiving HF/indexer, you could try sourcetype cloning. The caveat is that the sourcetype will end up different on each cluster (although you could add config there to change it back).

On the HF:

props.conf

# clone every event of the original sourcetype into "sourcetype2"
[original_sourcetype]
TRANSFORMS-clone = clone_sourcetype

# change the index only on the cloned events
[sourcetype2]
TRANSFORMS-change_index = change_index

transforms.conf

# create a copy of each event with sourcetype "sourcetype2"
[clone_sourcetype]
CLONE_SOURCETYPE = sourcetype2
REGEX = .

# rewrite the destination index of matching events to "web"
[change_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = web
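
This only changes the index on the clone; to send the originals only to indexclusterA and the clones only to indexclusterB, you would typically also set per-event TCP routing on the HF. A minimal sketch, assuming outputs.conf target groups named indexclusterA and indexclusterB (the group names, transform names, and server addresses are placeholders):

props.conf (additions to the stanzas above)
[original_sourcetype]
TRANSFORMS-route = route_to_clusterA

[sourcetype2]
TRANSFORMS-route = route_to_clusterB

transforms.conf
# send the original events only to the indexclusterA output group
[route_to_clusterA]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = indexclusterA

# send the cloned, re-indexed events only to the indexclusterB output group
[route_to_clusterB]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = indexclusterB

outputs.conf
# hypothetical servers; point these at your real clusters
[tcpout:indexclusterA]
server = clusterA-idx1:9997

[tcpout:indexclusterB]
server = clusterB-idx1:9997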

sduff_splunk
Splunk Employee

On the Heavy Forwarder, you would use multiple [tcpout:<target_group>] stanzas in outputs.conf, one for indexclusterA and the other for indexclusterB. The Heavy Forwarder should also have an inputs.conf with the input's index set to distil.
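
For example, a minimal sketch of that forwarder-side config; the group names, server addresses, and monitored path are placeholders:

 outputs.conf
 [tcpout]
 # send every event to both target groups
 defaultGroup = indexclusterA, indexclusterB

 [tcpout:indexclusterA]
 server = clusterA-idx1:9997

 [tcpout:indexclusterB]
 server = clusterB-idx1:9997

 inputs.conf
 [monitor:///var/log/example.log]
 index = distil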

On the second index cluster, you have an inputs.conf with a Splunk TCP input, [splunktcp:9997]. Under this stanza, add the line queue = parsingQueue. This ensures that the props and transforms on the index cluster are applied.
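
For example, assuming the standard receiving port 9997:

 inputs.conf
 [splunktcp:9997]
 # force cooked data from forwarders back through parsing so props/transforms run
 queue = parsingQueue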

Then, in props.conf, have the following stanza.

 [whatever source/sourcetype/host you want to change]
 TRANSFORMS-changeindex = changeindex

And transforms.conf

 [changeindex]
 REGEX = .
 DEST_KEY = _MetaData:Index
 FORMAT = web

You could also change the splunktcp stanza to accept data only from the IP address of the Heavy Forwarder, if you have multiple forwarders sending to these indexers.
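
For example, with a hypothetical forwarder address of 10.0.0.5:

 inputs.conf
 # only accept forwarded data from this heavy forwarder
 [splunktcp://10.0.0.5:9997]
 queue = parsingQueue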

daniel333
Builder

Is there no way to handle this at the heavy forwarder level? I believe enabling "queue = parsingQueue" reprocesses cooked data, which would mean I'd have to move dozens of apps over and reparse the data, putting a huge load on my indexers.
