Getting Data In

How to reroute events to a different index at the indexer

garyjohnson48
Explorer

Hello

I'm trying to reroute certain events as they hit my indexer from a particular source. In inputs.conf on the UF, the index is set to index=tokens for my source path, but I want to catch certain events from this source and route them to a different index at the indexer. So far three events have gotten past my transform, and I'm trying to figure out why and what I'm doing wrong.

Below are my original props and transforms.

props.conf

[source::...redacted]
TRANSFORMS-mbox_token_reroute = reroute

transforms.conf

[reroute]
REGEX=reg
FORMAT=mbox_tokens
SOURCE_KEY=MetaData:Source
DEST_KEY=_MetaData:Index

This is what I just changed it to; I'm waiting to see whether events get rerouted once the trigger action happens.
props.conf

[source::...redacted]
TRANSFORMS-mbox_token_reroute = reroute

transforms.conf

[reroute]
REGEX=reg
FORMAT=mbox_tokens
SOURCE_KEY=MetaData:Source
DEST_KEY=_MetaData:Index
WRITE_META=true

What should I do to make sure that the events are getting rerouted?
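For reference, the overall shape this kind of rerouting config should take is roughly as follows. The regex and the source path here are placeholders taken from the (redacted) question, not working values:

```
# props.conf (on the indexer, e.g. $SPLUNK_HOME/etc/system/local/)
[source::...redacted]
TRANSFORMS-mbox_token_reroute = reroute

# transforms.conf (same location)
[reroute]
# REGEX is matched against the value of SOURCE_KEY -- here the source
# metadata field, NOT the raw event text. Omit SOURCE_KEY entirely if the
# regex is meant to match event content (_raw is the default).
REGEX = reg
SOURCE_KEY = MetaData:Source
# With DEST_KEY = _MetaData:Index, FORMAT is taken as the literal name of
# the target index, which must already exist on the indexer.
DEST_KEY = _MetaData:Index
FORMAT = mbox_tokens
```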


yannK
Splunk Employee

Assuming your regex is correct:

Index-time rerouting (with props/transforms) has to happen on the first Splunk instance that parses the data.

  • That usually means the indexers.
  • But if you have an intermediary heavy forwarder, the config has to go on the first one in the forwarding chain.
  • And if your data is parsed by the monitoring forwarder (look for the INDEXED_EXTRACTIONS setting in props), then parsing happens on that first forwarder (which could be the UF).
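To check the last point, one quick sketch (assuming a default install path) is to ask btool on the forwarder whether any props stanza sets INDEXED_EXTRACTIONS:

```
# run on the forwarder; prints any effective INDEXED_EXTRACTIONS settings
# and (with --debug) the file each one comes from
$SPLUNK_HOME/bin/splunk btool props list --debug | grep INDEXED_EXTRACTIONS
```

If nothing comes back, the forwarder is not doing structured parsing and the rerouting config belongs on the indexer.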

FrankVl
Ultra Champion

Assuming the issue is not with the regex, the config looks fine. WRITE_META should not be necessary for this (I have a similar config working here without it).

For the few cases you found that did not reroute properly: are you 100% sure those were indexed after you put the rerouting in place? Not data that was indexed prior, but that still popped up in your search while you were validating your approach (due to a timestamp issue or the like)?


garyjohnson48
Explorer

Yes. Going by the dates in the events, they were indexed the day after I applied the regex. With the second attempt, however, I don't see any events matching "reg" in my original index, but I don't see any events in my rerouted index either, so I'm going to hold tight with what I've got and see what happens.


FrankVl
Ultra Champion

Well, you might want to double-check the _indextime (e.g. with | eval itime=_indextime | convert ctime(itime)) for the events that did not end up where you expected them, just to make sure it is not an event that got indexed "into the future" due to a timezone or clock misconfiguration. You wouldn't be the first to fool yourself by looking at old events while troubleshooting a change.
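Spelled out as a full search, that check might look like this. The index names are taken from the thread (tokens and mbox_tokens), and "reg" stands in for the redacted pattern:

```
index=tokens OR index=mbox_tokens "reg"
| eval itime=_indextime
| convert ctime(itime)
| table _time itime index source
```

Comparing _time against itime per event shows whether anything was actually indexed before the config change took effect.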


somesoni2
SplunkTrust

Is the regular expression correctly identifying the data you want to redirect to the mbox_tokens index?

Does your data go directly from the UF to the indexers, with no intermediate Splunk Enterprise instance? If the first full Splunk Enterprise instance in your data flow is the indexer, did you restart it after adding/updating the configuration?
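One way to confirm the indexer actually picked up the new settings after the restart is btool. This is a sketch assuming a default install path and that the config was deployed to the indexer:

```
# on the indexer: show the effective stanzas and which file each
# setting is resolved from
$SPLUNK_HOME/bin/splunk btool transforms list reroute --debug
$SPLUNK_HOME/bin/splunk btool props list "source::...redacted" --debug
```

If the reroute stanza or the TRANSFORMS- line is missing from the output, the config never made it into effect and the restart is moot.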


garyjohnson48
Explorer

Yes, the regex is correct. It identifies the source email address for the mail server. The regex in my example is a shortened (redacted) version, but it is the same one nonetheless.

There is no intermediate forwarder; the first stop for the data is my indexer. I did restart the splunkd service.

Are my configs correct? I'm trying to get the data into my mbox_tokens index.
