Getting Data In

With a heavy forwarder, how do I route one data set to two separate index clusters, using a different index name on each?

daniel333
Builder

All,

I have a data set that I need in indexclusterA as index=distil. However, I need that same data in indexclusterB as index=web. All of the data flows through a heavy forwarder.

Any idea how I would do this?

sduff_splunk
Splunk Employee

As an alternative to changing the index on the recipient HF/indexer, you can use sourcetype cloning. The caveat is that the sourcetype will end up different on each cluster (although you could add extra config there to change it back).

On the HF,
props.conf
[original_sourcetype]
# Clone every incoming event that has the original sourcetype
TRANSFORMS-clone = clone_sourcetype

[sourcetype2]
# The clones arrive as sourcetype2, so this stanza only touches the copies
TRANSFORMS-change_index = change_index

transforms.conf
[clone_sourcetype]
# REGEX = . matches every event; each match is duplicated as sourcetype2
CLONE_SOURCETYPE = sourcetype2
REGEX = .

[change_index]
# Route the cloned events to the web index
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = web
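
If you do want the cloned events to show up with the original sourcetype again, a minimal sketch of that "change it back" config could look like the following, applied wherever these events are still being parsed; the transform name and the original sourcetype value are only placeholders:

props.conf
[sourcetype2]
TRANSFORMS-restore_st = restore_sourcetype

transforms.conf
[restore_sourcetype]
# Rewrite the sourcetype of the clones back to the original value (placeholder)
REGEX = .
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::original_sourcetype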


sduff_splunk
Splunk Employee

On the Heavy Forwarder, you would define multiple [tcpout:<target_group>] stanzas in outputs.conf, one for indexclusterA and the other for indexclusterB. The Heavy Forwarder should also have an inputs.conf with the input's index set to distil.
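
A minimal sketch of that heavy forwarder config; the group names, indexer hostnames/ports, and the monitor path are only placeholders:

outputs.conf
[tcpout]
defaultGroup = cluster_a, cluster_b

[tcpout:cluster_a]
# indexclusterA receivers (placeholder hosts)
server = idxa1.example.com:9997, idxa2.example.com:9997

[tcpout:cluster_b]
# indexclusterB receivers (placeholder hosts)
server = idxb1.example.com:9997, idxb2.example.com:9997

inputs.conf
[monitor:///var/log/web/access.log]
# placeholder input; the key point is the index assigned here
index = distil
sourcetype = original_sourcetype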

On the second index cluster, the indexers have an inputs.conf with a Splunk TCP input, [splunktcp://9997]. Under this stanza, add the line queue = parsingQueue. This ensures that props and transforms on the index cluster are applied.
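
On each indexclusterB indexer, that stanza would look something like this (port 9997 taken from above; everything else stock):

inputs.conf
[splunktcp://9997]
# Send the incoming cooked data back through the parsing pipeline so
# this indexer's props/transforms are applied
queue = parsingQueue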

Then, in props.conf, have the following stanza.

 [whatever source/sourcetype/host you want to change]
 TRANSFORMS-changeindex = changeindex

And transforms.conf

 [changeindex]
 # Match every event and rewrite its index to web
 REGEX = .
 DEST_KEY = _MetaData:Index
 FORMAT = web

You could also change the splunktcp stanza to accept only the IP address of the Heavy Forwarder, if you have multiple forwarders sending to these indexers.
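
For example, something like this, where the forwarder IP is just a placeholder:

inputs.conf
# Only accept data from the heavy forwarder at this (placeholder) address
[splunktcp://10.0.0.5:9997]
queue = parsingQueue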

daniel333
Builder

Is there no way to handle this at the heavy forwarder level? I believe enabling "queue = parsingQueue" reprocesses cooked data, so I'd have to move dozens of apps over and reprocess the data on the indexers, which would put a huge load on them.
