Getting Data In

Can I use CLONE_SOURCETYPE to send events to multiple indexes?

ankithreddy777
Contributor

I need to send the complete data set to index-1 and a subset of it to index-2. How can I use CLONE_SOURCETYPE to implement this?

1 Solution

DalJeanis
Legend

Assuming that (1) your events come in with the sourcetype mySourceType, (2) they have a field called MyFieldThatSometimesHasFoo, and (3) whenever an event's value in that field is exactly Foo, no more and no less, (4) you want it cloned to an index called MyNewIndex with the new sourcetype set to MyNewCloneSourceType - which CANNOT be your old sourcetype, or bad things will happen (the transform would fire on its own clones)...

It's going to look something like this...

In props.conf ...

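# "clone" is just a transform class name; the value on the right names
# the stanza in transforms.conf to apply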
[mySourceType]
TRANSFORMS-clone = myCloneTransformName

In transforms.conf ...

[myCloneTransformName]

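# CLONE_SOURCETYPE duplicates each event whose SOURCE_KEY field matches
# REGEX; the copy gets the new sourcetype, and the FORMAT / DEST_KEY
# pair below rewrites the copy's destination index while the original
# event continues on to its normal index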
CLONE_SOURCETYPE = MyNewCloneSourceType
SOURCE_KEY       = MyFieldThatSometimesHasFoo
REGEX            = ^Foo$
FORMAT           = MyNewIndex
DEST_KEY         = _MetaData:Index
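
One assumption baked into the stanza above is that MyNewIndex already exists on your indexers; events routed to a nonexistent index are dropped (or land in the lastChanceIndex, if you have one configured). A minimal, hypothetical indexes.conf stanza, with placeholder paths:

[MyNewIndex]
homePath   = $SPLUNK_DB/MyNewIndex/db
coldPath   = $SPLUNK_DB/MyNewIndex/colddb
thawedPath = $SPLUNK_DB/MyNewIndex/thaweddb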


realhippo33
Explorer

Watch the capitalization on the destination key:
DEST_KEY = _MetaData:index
should be
DEST_KEY = _MetaData:Index

2 hours well spent 🙂


ankithreddy777
Contributor

Thank you, it worked. Do I need to send the unmatched events for the new sourcetype to the null queue? The unmatched events are going to the old index.


DalJeanis
Legend

@ankithreddy777 - Every original event should still go to the old index, and the Foo events should end up duplicated into both indexes. Is this not happening?

If it is cloning all events and leaving the non-matching clones in the old index, then try it this way first:

In props.conf ...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-kill = killNonFoo

In transforms.conf ...

 [myCloneTransformName]
 CLONE_SOURCETYPE = MyNewCloneSourceType
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 FORMAT           = MyNewIndex
 DEST_KEY         = _MetaData:Index

 [killNonFoo]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
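 # negative lookahead: matches any value EXCEPT an exact "Foo",
 # so only the non-Foo clones get routed to the nullQueue below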
 REGEX            = (?!^Foo$)^.*$
 DEST_KEY         = queue
 FORMAT           = nullQueue

If that doesn't work, then you can try it this way...

In props.conf ...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-save = saveFoo, saveFoo2

In transforms.conf ...

 [myCloneTransformName]
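 # intent: clone every event and park the clone in the nullQueue;
 # the saveFoo transforms below rescue the clones whose field
 # value is exactly "Foo"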
 CLONE_SOURCETYPE = MyNewCloneSourceType
 DEST_KEY         = queue
 FORMAT           = nullQueue

 [saveFoo]
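 # pull the matching clone back out of the nullQueue so it gets indexed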
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = queue
 FORMAT           = indexQueue

 [saveFoo2]
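 # and point the rescued clone at the new index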
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = _MetaData:Index
 FORMAT           = MyNewIndex
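
To sanity-check either variant end to end on a test instance, a hypothetical monitor input (the file path and target index are placeholders, not part of the original thread) will push sample events through the pipeline:

 # inputs.conf -- hypothetical test input
 [monitor:///tmp/clone_test.log]
 sourcetype = mySourceType
 index      = main

Write a few test events where MyFieldThatSometimesHasFoo is exactly Foo and a few where it is not, then confirm that the Foo clones land only in MyNewIndex.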

DalJeanis
Legend

@ankithreddy777 - Did the extra events issue get resolved? If so, which solution worked for you?


DalJeanis
Legend

Please define how you will know the subset of data to be cloned. Does it have a particular value in a particular field?
