Getting Data In

Can I use CLONE_SOURCETYPE to send events to multiple indexes?

Contributor

I need to send the complete data to index-1 and a subset of that data to index-2. How can I use CLONE_SOURCETYPE to implement this?

1 Solution

SplunkTrust

Assuming that (1) your events come in with the sourcetype mySourceType, (2) they have a field called MyFieldThatSometimesHasFoo, (3) you want an event cloned whenever that field's value is exactly Foo, no more and no less, and (4) the clone should go to an index called MyNewIndex with its new sourcetype set to MyNewCloneSourceType - which CANNOT be your old sourcetype, or bad things will happen...

It's going to look something like this...

In props.conf...

[mySourceType]
TRANSFORMS-clone = myCloneTransformName

In transforms.conf ...

[myCloneTransformName]

CLONE_SOURCETYPE = MyNewCloneSourceType
SOURCE_KEY       = MyFieldThatSometimesHasFoo
REGEX            = ^Foo$
FORMAT           = MyNewIndex
DEST_KEY         = _MetaData:index 
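(Side note: the anchors in ^Foo$ mean the clone only fires when the field's entire value is exactly Foo. A quick Python sketch of that matching behavior - illustrative only, not Splunk internals:)

```python
import re

# The transform's REGEX, applied to the value of MyFieldThatSometimesHasFoo
pattern = re.compile(r"^Foo$")

print(bool(pattern.search("Foo")))     # True  -> event is cloned and rerouted
print(bool(pattern.search("FooBar")))  # False -> event is left alone
print(bool(pattern.search("foo")))     # False -> the match is case-sensitive
```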


New Member

DEST_KEY = _MetaData:index
Should be
DEST_KEY = _MetaData:Index

2 hours well spent 🙂


Contributor

Thank you, it worked. Do I need to send the unmatched events for the new sourcetype to the null queue? The unmatched events are going to the old index.


SplunkTrust

@ankithreddy777 - Everything should go to the old index, and the duplicates should go to both indexes. Is this not happening?

If it is cloning all the events and sending the unmatched clones to the old index, then try it this way first:

In props.conf...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-filter = killNonFoo

In transforms.conf ...

 [myCloneTransformName]
 CLONE_SOURCETYPE = MyNewCloneSourceType
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 FORMAT           = MyNewIndex
 DEST_KEY         = _MetaData:index 

 [killNonFoo]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = (?!^Foo$)^.*$
 DEST_KEY         = queue
 FORMAT           = nullQueue
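The killNonFoo REGEX uses a negative lookahead to match every value except an exact Foo, so only the Foo clones survive. Checking that pattern in Python (illustrative only, not Splunk internals):

```python
import re

# killNonFoo's REGEX: match the whole value unless it is exactly "Foo"
kill = re.compile(r"(?!^Foo$)^.*$")

print(bool(kill.search("Foo")))     # False -> clone is kept (goes to MyNewIndex)
print(bool(kill.search("FooBar")))  # True  -> clone is routed to nullQueue
print(bool(kill.search("Bar")))     # True  -> clone is routed to nullQueue
```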

If that doesn't work, then you can try it this way...

In props.conf...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-saveFoo = saveFoo
 TRANSFORMS-saveFoo2 = saveFoo2

In transforms.conf ...

 [myCloneTransformName]
 CLONE_SOURCETYPE = MyNewCloneSourceType
 DEST_KEY         = queue
 FORMAT           = nullQueue

 [saveFoo]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = queue
 FORMAT           = indexQueue

 [saveFoo2]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = _MetaData:index 
 FORMAT           = MyNewIndex
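Conceptually, this second variant clones every event, drops all clones by default, then rescues only the exact-Foo clones back into the index queue and reroutes them. A toy Python model of that routing chain (purely illustrative; not how Splunk's pipeline actually works, and the default index name is a made-up placeholder):

```python
import re

FOO = re.compile(r"^Foo$")

def route_clone(field_value, original_index="main"):
    """Follow the myCloneTransformName -> saveFoo -> saveFoo2 chain
    for a cloned event and return its (queue, index) destination."""
    queue = "nullQueue"          # myCloneTransformName sends all clones here
    index = original_index       # clones keep the original index by default
    if FOO.search(field_value):
        queue = "indexQueue"     # saveFoo rescues exact-Foo clones
        index = "MyNewIndex"     # saveFoo2 reroutes them to the new index
    return queue, index

print(route_clone("Foo"))     # ('indexQueue', 'MyNewIndex')
print(route_clone("FooBar"))  # ('nullQueue', 'main')
```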

SplunkTrust

@ankithreddy777 - Did the extra-transaction issue get resolved? If so, which solution worked for you?


SplunkTrust

Please define how you will know the subset of data to be cloned. Does it have a particular value in a particular field?
