I have an infrastructure like this: 1st HF => 2nd HF => Indexer
On the first Heavy Forwarder (HF1), I clone a subset of the data to the second HF like this:
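(The original snippet is missing from the post. A typical way to clone data on HF1 is to list two `tcpout` groups in `outputs.conf`; listing both groups as default targets makes Splunk send a copy of the data to each. Group names and hosts below are examples, not from the original post.)

```ini
# outputs.conf on HF1 (group and host names are illustrative)
[tcpout]
# Listing two groups clones each event to both destinations
defaultGroup = primary_indexers, second_hf

[tcpout:primary_indexers]
server = indexer1.example.com:9997

[tcpout:second_hf]
server = hf2.example.com:9997
```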
On the 2nd HF, I need to route data to an index depending on a field value like in this case:
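(Again, the original snippet is missing. Index routing by field value is normally done with an index-time transform that rewrites `_MetaData:Index`; the sourcetype, regex, and index names below are placeholders:)

```ini
# props.conf on HF2 (sourcetype name is an example)
[my_sourcetype]
TRANSFORMS-route_by_field = route_to_index_a

# transforms.conf on HF2
[route_to_index_a]
# If the raw event matches this pattern, send it to index_a
REGEX = user_id=42
DEST_KEY = _MetaData:Index
FORMAT = index_a
```

Note that index-time transforms like this only run in the parsing pipeline, which is exactly why they are skipped for data already cooked by HF1.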
But as I understand it, this is not possible because the data have already been parsed by the 1st HF.
Is it possible to make an HF not parse data? Or is there another way to clone the data (for example, creating two inputs for the same file)?
Splunk support doesn't necessarily approve of this approach, but you can configure HF2 to cook (parse) the data a second time. So: twice-cooked data 🙂
Make SURE that HF2 has all of the right props.conf rules to parse the data all over again, because it will be doing just that. Also, this may cause weirdness with indexed extraction data, so TEST TEST TEST TEST.
In inputs.conf on HF2, in the stanza where you define the splunktcp input, add this line:
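(The exact line was dropped from the post. The setting generally cited for forcing re-parsing is the `route` key on the `splunktcp` stanza, pointed at the parsing queue for both cooked and uncooked data; the port number below is an example:)

```ini
# inputs.conf on HF2 (port is illustrative)
[splunktcp://9997]
# Send everything through the parsing queue, whether or not it
# already carries the keys a prior parse would have set
route = has_key:_utf8:parsingQueue;has_key:_linebreaker:parsingQueue;absent_key:_utf8:parsingQueue;absent_key:_linebreaker:parsingQueue
```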
This effectively tells HF2 that you want the data re-parsed there, regardless of how it was parsed upstream.
Again, TEST TEST TEST!