Getting Data In

HF1 -> HF2 -> Indexer: How do I route data that has already been parsed on Heavy Forwarder #1 to a specific index on the indexer through HF #2?

Contributor

Hello, all

I have an infrastructure like this: 1st HF => 2nd HF => Indexer

On the first Heavy Forwarder, I clone a set of data to the second HF as described here:
http://answers.splunk.com/answers/224060/cloning-set-of-data-to-specified-splunk-indexer.html#answer...

On the 2nd HF, I need to route data to an index depending on a field value, as in this case:
http://answers.splunk.com/answers/50761/how-do-i-route-data-to-specific-index-based-on-a-field.html
but as I understand it, this is not possible because the data has already been parsed by the 1st HF.

Is it possible to make an HF not parse data?
Or is there another way to clone the data (for example, creating two inputs for the same file)?
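For reference, the field-based index routing from the second link would be configured roughly like this on the forwarder doing the parsing (the sourcetype, regex, and index name below are hypothetical examples, not from the thread):

```ini
# props.conf -- attach a routing transform to a sourcetype
# ("my_sourcetype" is a hypothetical example)
[my_sourcetype]
TRANSFORMS-routeindex = route_to_secure_index

# transforms.conf -- rewrite the destination index when the raw event
# matches a pattern (regex and index name are hypothetical examples)
[route_to_secure_index]
REGEX = level=ERROR
DEST_KEY = _MetaData:Index
FORMAT = secure_index
```

Index-time transforms like this only run in the parsing pipeline, which is exactly why they do nothing on the 2nd HF when the data arrives already parsed (cooked) from the 1st HF.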


SplunkTrust

Splunk support doesn't necessarily approve of this approach, but you can configure HF2 to cook (parse) the data a second time. So, twice cooked data 🙂

Make SURE that HF2 has all of the right props.conf rules to parse the data all over again, because it will be doing just that. Also, this may cause weirdness with indexed extraction data, so TEST TEST TEST TEST.

In inputs.conf on HF2, where you define the splunktcp input, add this line to that stanza:

route=has_key:_replicationBucketUUID:replicationQueue;has_key:_dstrx:typingQueue;has_key:_linebreaker:parsingQueue;absent_key:_linebreaker:parsingQueue
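Put together, the splunktcp stanza on HF2 might look something like this (the port number is just an example, not from the thread):

```ini
# inputs.conf on HF2 -- port 9997 is a hypothetical example
[splunktcp://9997]
route = has_key:_replicationBucketUUID:replicationQueue;has_key:_dstrx:typingQueue;has_key:_linebreaker:parsingQueue;absent_key:_linebreaker:parsingQueue
```

The trick is the third rule: by default, events that already carry the _linebreaker key (i.e., cooked data) are routed to the indexQueue and skip parsing; pointing them at the parsingQueue instead forces a second parse.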

This effectively tells HF2 that you want the data reparsed there, regardless of how it was parsed before.

Again, TEST TEST TEST!

Thanks! It works.


Explorer

It works. Very useful.


Path Finder

This worked well for me. The "route" setting above forced the second HF to reparse the data.


Contributor

We are testing it. If it succeeds, I will accept the answer.
