Getting Data In

With indexAndForward=true on a heavy forwarder, how do we route data to an index with a different name on the target indexer?


Hi Team,

We are planning to migrate our existing indexed data to a new Splunk Enterprise server that is already up and running, serving other departments. Our plan is to turn the existing server into a heavy forwarder that sends data to the new indexer. Before that, to test and minimize the outage, we want to enable indexAndForward=true so that both the existing instance and the new instance receive the same data; users can keep working on the existing system while the new system is tested.
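For reference, here is roughly how we plan to enable it in outputs.conf on the heavy forwarder (a sketch only; the group name and hostname below are placeholders):

```ini
# outputs.conf on the heavy forwarder -- sketch; group name and
# hostname are placeholders, not real values
[tcpout]
defaultGroup = new_indexers
# Keep indexing a local copy while also forwarding to the new indexer
indexAndForward = true

[tcpout:new_indexers]
server = new-indexer.example.com:9997
```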

My question: policy requires a different index name on the new system, so how will data routing work? For example, on the existing system data arrives in index "abc", and on the target there will be another index, "def", which should serve the same purpose. Of course we will update the knowledge objects to reflect the new index name, but with indexAndForward=true, how do we route data that currently lands in index=abc to index=def?

Please help!!



Esteemed Legend

You could do this with a CLONE_SOURCETYPE transform, although it doubles the parsing load and probably also the license cost (I am not sure about the license). This essentially sticks the cloned event back into the top of the indexing pipeline, though you do have to change the sourcetype (so it would be useful to be able to do this at index-and-forward time). You could use RENAME to change the sourcetype back to the original value. From the transforms.conf spec:

* This name is wrong; a transform with this setting actually clones and
  modifies events, and assigns the new events the specified sourcetype.

* If CLONE_SOURCETYPE is used as part of a transform, the transform will
  create a modified duplicate event, for all events that the transform is
  applied to via normal props.conf rules.
* Use this feature if you need to store both the original and a modified
  form of the data in your system, or if you want to send the original and a
  modified form to different outbound systems.
  * A typical example would be to retain sensitive information according to
    one policy and a version with the sensitive information removed
    according to another policy.  For example, some events may have data
    that you must retain for 30 days (such as personally identifying
    information) and only 30 days with restricted access, but you need that
    event retained without the sensitive data for a longer time with wider
    access.
* Specifically, for each event handled by this transform, a near-exact copy
  is made of the original event, and the transformation is applied to the
  copy.  The original event will continue along normal data processing
  unmodified.
* The <string> used for CLONE_SOURCETYPE selects the sourcetype that will be
  used for the duplicated events.
* The new sourcetype MUST differ from the original sourcetype.  If the
  original sourcetype is the same as the target of the CLONE_SOURCETYPE,
  Splunk will make a best effort to log warnings to splunkd.log, but this
  setting will be silently ignored at runtime for such cases, causing the
  transform to be applied to the original event without cloning.
* The duplicated events will receive index-time transformations & sed
  commands of all transforms which match its new host/source/sourcetype.
  * This means that props matching on host or source will incorrectly be
    applied a second time. (SPL-99120)
* Can only be used as part of an otherwise-valid index-time transform.  For
  example REGEX is required, there must be a valid target (DEST_KEY or
  WRITE_META), etc as above.
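Putting those requirements together, a minimal valid stanza might look like this (a sketch based on the spec excerpt above; the stanza name and "cloned_sc" sourcetype are made up):

```ini
# transforms.conf -- minimal index-time transform using CLONE_SOURCETYPE
# (stanza name and "cloned_sc" are hypothetical)
[clone_orig_sc]
REGEX = (.*)
FORMAT = $1
DEST_KEY = _raw
CLONE_SOURCETYPE = cloned_sc
```

The REGEX/FORMAT/DEST_KEY combination just rewrites _raw to itself, which satisfies the "otherwise-valid index-time transform" requirement while the clone is created with the new sourcetype.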



Thanks for the input, woodcock. I have come up with the config below; does this look good?



TRANSFORMS-orig_sc = clone_orig_sc


DEST_KEY = _raw

DEST_KEY = _MetaData:Index
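Spelled out with stanza headers, my understanding is something like the following (all sourcetype and stanza names here are placeholders; only the index "def" comes from the original question):

```ini
# props.conf -- "my_sourcetype" is a placeholder for the real sourcetype
[my_sourcetype]
TRANSFORMS-orig_sc = clone_orig_sc

# Props for the cloned sourcetype: send the clones to the renamed index
[cloned_sc]
TRANSFORMS-route_clone = route_clone_to_def

# transforms.conf
[clone_orig_sc]
REGEX = (.*)
FORMAT = $1
DEST_KEY = _raw
CLONE_SOURCETYPE = cloned_sc

[route_clone_to_def]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = def
```

The first transform clones each event into the new sourcetype with _raw unchanged, and the second rewrites the index key to "def" for the cloned sourcetype only.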

Thanks for all your help.


Esteemed Legend

Yes, that should work (but I have not done this myself, so I am going by the docs, same as you are).



Thanks, I will try to test this and update the result.
