Getting Data In

Can I use CLONE_SOURCETYPE to send events to multiple indexes?

ankithreddy777
Contributor

I need to send complete data to index-1 and subset of data to index-2. May I know how to use CLONE_SOURCETYPE to implement this criteria?

1 Solution

DalJeanis
Legend

Assuming that (1) your events come in with their sourcetype set to mySourceType, (2) they have a field called MyFieldThatSometimesHasFoo, (3) whenever an event's value in that field is exactly Foo (no more and no less), then (4) you want it cloned to an index called MyNewIndex with the new sourcetype set to MyNewCloneSourceType - which CANNOT be your old sourcetype, or bad things will happen...

It's going to look something like this...

in props.conf...

[mySourceType]
TRANSFORMS-clone = myCloneTransformName

In transforms.conf ...

[myCloneTransformName]

CLONE_SOURCETYPE = MyNewCloneSourceType
SOURCE_KEY       = MyFieldThatSometimesHasFoo
REGEX            = ^Foo$
FORMAT           = MyNewIndex
DEST_KEY         = _MetaData:index 
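
These settings take effect at index time inside Splunk, but the intended routing can be sketched in plain Python. Field, regex, index, and sourcetype names below are copied from the conf example; the one-function pipeline model is a hypothetical simplification, not Splunk's actual implementation:

```python
import re

# REGEX from the transform: only an exact "Foo" value triggers the clone.
REGEX = re.compile(r"^Foo$")

def apply_clone(event):
    """Return the events the pipeline would emit for one input event."""
    out = [event]  # the original always continues on to its old index
    if REGEX.search(event.get("MyFieldThatSometimesHasFoo", "")):
        # CLONE_SOURCETYPE makes a copy with the new sourcetype;
        # DEST_KEY/FORMAT point that copy at the new index.
        out.append(dict(event, sourcetype="MyNewCloneSourceType",
                        index="MyNewIndex"))
    return out

events = [
    {"sourcetype": "mySourceType", "index": "index-1",
     "MyFieldThatSometimesHasFoo": "Foo"},
    {"sourcetype": "mySourceType", "index": "index-1",
     "MyFieldThatSometimesHasFoo": "FooBar"},
]
emitted = [e for ev in events for e in apply_clone(ev)]
# The "Foo" event is emitted twice (old index plus MyNewIndex);
# the "FooBar" event is emitted once, to the old index only.
```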

realhippo33
Explorer

DEST_KEY = _MetaData:index
Should be
DEST_KEY = _MetaData:Index

2 hours well spent 🙂


ankithreddy777
Contributor

Thank you, it worked. Do I need to send the unmatched events for the new sourcetype to the null queue? The unmatched events are going to the old index.


DalJeanis
Legend

@ankithreddy777 - Everything should go to the old index, and the duplicates should go to both indexes. Is this not happening?

If it is cloning all events and leaving some in the old index, then try it this way first:

in props.conf...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-kill = killNonFoo

In transforms.conf ...

 [myCloneTransformName]
 CLONE_SOURCETYPE = MyNewCloneSourceType
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 FORMAT           = MyNewIndex
 DEST_KEY         = _MetaData:index 

 [killNonFoo]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = (?!^Foo$)^.*$
 DEST_KEY         = queue
 FORMAT           = nullQueue
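
If the lookahead regex is unfamiliar: the negative lookahead (?!^Foo$) fails exactly when the whole value is Foo, so (?!^Foo$)^.*$ matches every value except the exact string Foo. A quick check, using Python's re engine (which should treat this pattern the same way as Splunk's PCRE):

```python
import re

# killNonFoo's REGEX: match anything EXCEPT the exact string "Foo".
kill_non_foo = re.compile(r"(?!^Foo$)^.*$")

assert kill_non_foo.match("Foo") is None         # exact "Foo" survives
assert kill_non_foo.match("FooBar") is not None  # everything else is dropped
assert kill_non_foo.match("Bar") is not None
assert kill_non_foo.match("") is not None
```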

If that doesn't work, then you can try it this way...

in props.conf...

 [mySourceType]
 TRANSFORMS-clone = myCloneTransformName

 [MyNewCloneSourceType]
 TRANSFORMS-save = saveFoo, saveFoo2

In transforms.conf ...

 [myCloneTransformName]
 CLONE_SOURCETYPE = MyNewCloneSourceType
 DEST_KEY         = queue
 FORMAT           = nullQueue

 [saveFoo]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = queue
 FORMAT           = indexQueue

 [saveFoo2]
 SOURCE_KEY       = MyFieldThatSometimesHasFoo
 REGEX            = ^Foo$
 DEST_KEY         = _MetaData:index 
 FORMAT           = MyNewIndex
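
This fallback inverts the logic: every event is cloned, the clone defaults to the null queue, and only Foo clones are rescued back to the index queue and pointed at MyNewIndex. A hypothetical per-clone sketch of that decision (simplified; not how Splunk literally evaluates the stanzas):

```python
import re

FOO = re.compile(r"^Foo$")

def route_clone(value):
    """Decide where a cloned event ends up under the save-only-Foo scheme.

    Returns (queue, index) for the clone; a simplified model of the
    saveFoo / saveFoo2 transforms, not Splunk's actual pipeline.
    """
    queue, index = "nullQueue", None      # default set by myCloneTransformName
    if FOO.search(value):
        queue = "indexQueue"              # saveFoo rescues the clone
        index = "MyNewIndex"              # saveFoo2 re-routes it
    return queue, index
```

So route_clone("Foo") yields ("indexQueue", "MyNewIndex"), while any other value yields ("nullQueue", None) and the clone is discarded.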

DalJeanis
Legend

@ankithreddy777 - Did the extra transaction issue get resolved? If so, which solution worked for you?


DalJeanis
Legend

Please define how you will know the subset of data to be cloned. Does it have a particular value in a particular field?
