Archive

One source to two indexes

Explorer

We created two indexes in our indexer cluster. Now we need to send the same data to both of them (unmodified data to the first one and transformed data to the other), from one source on one universal forwarder. How can we implement this? On which host: universal forwarder, heavy forwarder, or indexer?

1 Solution

SplunkTrust

Hi @m_efremov,

As far as I know you can't clone data to 2 indexes on the same indexer cluster with data flowing directly from UF -> Indexer, but there is an ugly way to achieve this, given below. Note that it will double your license usage for that source.

Here I am assuming that as of now you are sending data directly from the Universal Forwarder to the Indexer Cluster, and that the Heavy Forwarder also sends data to the same Indexer Cluster.
With the approach below, the data flow will look like this:

UF -> Indexer Cluster (Index = ABC)
UF -> Heavy Forwarder -> Indexer Cluster (Index = XYZ)

  • On the Universal Forwarder, use the configuration below to send the same data (data cloning) to both the indexers and the Heavy Forwarder:

inputs.conf

[monitor:///tmp/]
_TCP_ROUTING = indexers, heavyforwarder
whitelist = mycustom\.log
index = ABC
sourcetype = mysourcetype

outputs.conf

[tcpout]
defaultGroup = indexers

[tcpout:indexers]
server = indexer1:port, indexer2:port

[tcpout:heavyforwarder]
server = hfw:port
  • Configuration required on the Heavy Forwarder (assuming the Heavy Forwarder is already sending data to the Indexer Cluster):

props.conf

[mysourcetype]
TRANSFORMS-rouindex = routing_to_index

transforms.conf

[routing_to_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = XYZ
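
Once both flows are in place, a quick sanity search (illustrative only, not from the original thread; it assumes the index names ABC and XYZ used above) should show events arriving in both indexes:

(index=ABC OR index=XYZ) sourcetype=mysourcetype
| stats count by index

If cloning and routing work, the stats table should show a nonzero count for both ABC and XYZ.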


Explorer

Thank you, @harsmarvania57, it seems to be a workable solution. My transforms.conf also contains CLONE_SOURCETYPE, but all other options are the same.
[routing_to_new_index]
REGEX = .
CLONE_SOURCETYPE = mynewsourcetype
FORMAT = my_newindex
DEST_KEY = _MetaData:Index


SplunkTrust

I have converted my comment to an answer; if it really helps you then you can accept it. Can I ask why you want CLONE_SOURCETYPE?


Explorer

I use CLONE_SOURCETYPE to assign a different sourcetype name (not only a different index) to my new data flow. This is because I want to apply different transformations to the old and new data (possibly on the indexer side, in their props.conf and transforms.conf). I also want to collect separate statistics about the old and new sourcetypes (one of them has transformed events).
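
A minimal sketch of that per-sourcetype separation (the transform names here are hypothetical, and the cloned sourcetype is assumed to be mynewsourcetype as in the config above). One caveat: index-time transforms run at the first full Splunk instance in the data path, so for the cloned flow they would apply on the heavy forwarder rather than on the indexers:

props.conf

[mysourcetype]
TRANSFORMS-old = transform_for_old_flow

[mynewsourcetype]
TRANSFORMS-new = transform_for_new_flow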


SplunkTrust

For renaming the sourcetype and routing the data to another index, can you please try the configuration below on the Heavy Forwarder?

props.conf

[mysourcetype]
TRANSFORMS-rouindex = rename_sourcetype, routing_to_new_index

transforms.conf

[rename_sourcetype]
REGEX = .
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::new_sourcetype

[routing_to_new_index]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
REGEX = new_sourcetype
FORMAT = XYZ
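
Transforms listed in one TRANSFORMS- class run left to right, so rename_sourcetype fires before routing_to_new_index and the routing REGEX matches against the already-renamed sourcetype. A quick check (illustrative search, using the names from the configuration above) to confirm both the original and the renamed flow:

(index=ABC sourcetype=mysourcetype) OR (index=XYZ sourcetype=new_sourcetype)
| stats count by index, sourcetype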

Champion

What I understand from your question is that you want to send a single log file to two indexes.

From @woodcock's answer on this post:
https://answers.splunk.com/answers/567223/how-to-send-same-data-source-to-two-or-multiple-in-1.html

[monitor://D:\test\test1.log]
sourcetype = test
index = index1

[monitor://D:\linktotest\test1.log]
sourcetype = test
index = index2

Then create a symbolic link from linktotest to test.
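
On Linux/Unix the symlink step can be sketched as below, with hypothetical /tmp paths (the thread's example uses Windows paths, where you would create the directory link with mklink /D instead):

```shell
# Hypothetical paths: /tmp/test holds the real log, /tmp/linktotest is the symlink.
mkdir -p /tmp/test
echo "sample event" > /tmp/test/test1.log

# Directory symlink: the same file is now reachable via both paths,
# so two monitor stanzas can pick it up and index it twice.
ln -sfn /tmp/test /tmp/linktotest

cat /tmp/linktotest/test1.log
```

Because both monitor stanzas see the same inode through different paths, each copy is indexed with its own index setting.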


Explorer

We can't do that on most of our application servers. Some of them are not under our control, some run MS Windows, etc. Thank you for the answer, but it is not a general solution.
