Getting Data In

Reindexing already indexed data and props/transforms.conf

mark
Path Finder

Hi All,

Question about reindexing indexed data:

I have a legacy 4.2.x Splunk server running.
It's set to index all data and forward it on to a new instance.
The intention is to deprecate the legacy instance, but not until the new instance is up and running.

The new instance has various props.conf and transforms.conf files configured. I also have various universal forwarders, and the data from those forwarders is transformed as I want on the new instance; in this case it's mostly just routing events to an appropriate index.
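For reference, the routing on the new instance is done with stanzas roughly like these (the sourcetype and index names here are placeholders, not my actual config):

    # props.conf on the new instance (placeholder sourcetype)
    [my:sourcetype]
    TRANSFORMS-route_index = route_to_alt_index

    # transforms.conf on the new instance (placeholder index name)
    [route_to_alt_index]
    REGEX = .
    DEST_KEY = _MetaData:Index
    FORMAT = alt_index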

However, this is not the case for the legacy instance forwarding to the new instance. None of the props.conf and transforms.conf settings seem to have any effect on the new instance.

When I convert the legacy instance to a light forwarder ('splunk enable app SplunkLightForwarder'), everything starts to work as I want: props.conf and transforms.conf process the events on the new instance the way I expect.

So why is this? I assume the (now reconfigured) light forwarder is sending raw/uncooked data, and my new instance then applies the props/transforms just fine.

The event data has been transformed (and indexed) on my legacy server and then forwarded to my new instance; why don't my props/transforms apply on the new instance?
Can props/transforms only be applied once in the forwarding pipeline?
How can I reapply props/transforms modifications on my new Splunk instance?
Is the problem that cooked events can't be re-cooked?
Or should applying props/transforms at each hop in a chain of forwarders be possible?

I also tried setting 'sendCookedData = false' on the legacy instance, with no luck 😞
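For what it's worth, this is roughly what I tried in outputs.conf on the legacy instance (the host and port are placeholders). My understanding is that sendCookedData = false sends plain raw TCP data rather than Splunk-to-Splunk traffic, so the splunktcp input on the new instance wouldn't handle it the way I hoped, which may be why it made no difference:

    # outputs.conf on the legacy instance (placeholder target)
    [tcpout:new_instance]
    server = newsplunk.example.com:9997
    sendCookedData = false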

Can someone provide some insight please...
Thanks.


kristian_kolb
Ultra Champion

Though I have not experimented with your type of setup, I do believe you're right: data passes through the parsing phase just once.

So when your legacy machine is configured as a heavy forwarder with index-and-forward, the local props and transforms are applied. When you configure it as a light forwarder (LWF), the props and transforms on the new indexer are used instead.

Much as with pork chops, I don't think you can cook -> uncook -> re-cook your data, i.e. force the new indexer to redo the parsing. Also, with index-and-forward on the legacy machine, I don't think you can send uncooked data to a Splunk instance further down the processing line; the legacy machine will parse and index first, then do the forwarding.
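For clarity, index-and-forward on a heavy forwarder is normally just this in outputs.conf (the target host and port are placeholders), and everything forwarded from there is already cooked/parsed:

    # outputs.conf on the legacy heavy forwarder (placeholder target)
    [tcpout]
    defaultGroup = new_instance
    indexAndForward = true

    [tcpout:new_instance]
    server = newsplunk.example.com:9997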

Would it not be possible to copy the relevant parts of the props and transforms from the new indexer to the legacy machine and have it do the parsing there?
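If you do move the routing to the legacy machine, keep in mind that the alternate index also has to exist on the new indexer, since the events should arrive already tagged with it. Something along these lines in indexes.conf on the new instance (the index name is just an example):

    # indexes.conf on the new indexer (example index name)
    [alt_index]
    homePath   = $SPLUNK_DB/alt_index/db
    coldPath   = $SPLUNK_DB/alt_index/colddb
    thawedPath = $SPLUNK_DB/alt_index/thaweddb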

Hope this helps,

Kristian


mark
Path Finder

This does validate the results I'm seeing.

I'm trying to keep the legacy instance running until we switch over to the new instance. There are a few things I need to adjust, e.g. data is stored in 'main' on the legacy instance, but I want it to go to an alternate index on the new instance. This is why I want to index twice, or rather why I'm trying to apply the props/transforms config twice...

I figured the 'sendCookedData = false' parameter would help here, but it doesn't seem to work. If you have any suggestions on how I could achieve what I'm trying to do, please let me know...

Thanks,
Mark
