Deployment Architecture

One source to two indexes

m_efremov
Explorer

We created two indexes on our indexer cluster. Now we need to send the same data to both of them (the raw data to the first one and a transformed copy to the other), but from one source, from one universal forwarder. How can we implement this? On which host: the universal forwarder, a heavy forwarder, or the indexer?

1 Solution

harsmarvania57
Ultra Champion

Hi @m_efremov,

As far as I know you can't clone data to 2 indexes on the same indexer cluster with data flowing from UF -> Indexer directly, but there is an ugly way to achieve this, given below. Note that it will double your license usage for that source.

Here I am assuming that as of now you are sending data directly from the Universal Forwarder to the Indexer Cluster, and that a Heavy Forwarder is also sending data to the same Indexer Cluster.
With the approach below, the data flow will look like this:

UF -> Indexer Cluster (Index = ABC)
UF -> Heavy Forwarder -> Indexer Cluster (Index = XYZ)

  • On the Universal Forwarder, use the configuration below to clone the data and send it to both the indexers and the Heavy Forwarder:

inputs.conf

[monitor:///tmp/]
_TCP_ROUTING = indexers, heavyforwarder
whitelist = mycustom\.log
index = ABC
sourcetype = mysourcetype

outputs.conf

[tcpout]
defaultGroup = indexers

[tcpout:indexers]
server = indexer1:port, indexer2:port

[tcpout:heavyforwarder]
server = hfw:port
  • Configuration required on the Heavy Forwarder (assuming the Heavy Forwarder is already sending data to the Indexer Cluster):

props.conf

[mysourcetype]
TRANSFORMS-rouindex = routing_to_index

transforms.conf

[routing_to_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = XYZ
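For completeness, the Heavy Forwarder also has to receive the cloned stream from the UF and forward it on to the indexer cluster. A minimal sketch, assuming the default receiving port 9997 and hypothetical indexer host names (adjust both to your environment):

inputs.conf (on the Heavy Forwarder)

[splunktcp://9997]

outputs.conf (on the Heavy Forwarder)

[tcpout]
defaultGroup = indexers

[tcpout:indexers]
server = indexer1:9997, indexer2:9997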

m_efremov
Explorer

Thank you, @harsmarvania57, it seems to be a workable solution. My transforms.conf also contains CLONE_SOURCETYPE, but all the other options are the same:

[routing_to_new_index]
REGEX = .
CLONE_SOURCETYPE = my_new_sourcetype
FORMAT = my_new_index
DEST_KEY = _MetaData:Index


harsmarvania57
Ultra Champion

I have converted my comment to an answer; if it really helps you then you can accept it. May I ask why you want CLONE_SOURCETYPE?


m_efremov
Explorer

I use CLONE_SOURCETYPE to assign a different sourcetype name (not only a different index) to my new data flow. This is because I want to apply different transformations to the old and new data (possibly on the indexer side, in their props.conf and transforms.conf). I also want to collect separate statistics about the old and new sourcetypes (one of them has transformed events).


harsmarvania57
Ultra Champion

For renaming the sourcetype and routing the data to another index, can you please try the configuration below on the Heavy Forwarder?

props.conf

[mysourcetype]
TRANSFORMS-rouindex = rename_sourcetype, routing_to_new_index

transforms.conf

[rename_sourcetype]
REGEX = .
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::new_sourcetype

[routing_to_new_index]
SOURCE_KEY = MetaData:Sourcetype
DEST_KEY = _MetaData:Index
REGEX = new_sourcetype
FORMAT = XYZ
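The key point in the two stanzas above is ordering: rename_sourcetype runs first, and routing_to_new_index keys off the already-renamed sourcetype. This is not Splunk's actual engine, but an illustrative Python sketch of that chaining (function name, event dict, and sample values are stand-ins; only the REGEX/SOURCE_KEY/DEST_KEY/FORMAT field names mirror the stanzas):

```python
import re

# Illustrative model of a Splunk index-time transform: match REGEX against
# SOURCE_KEY (default _raw) and, on a match, write FORMAT into DEST_KEY.
def apply_transform(event, regex, dest_key, fmt, source_key="_raw"):
    if re.search(regex, event.get(source_key, "")):
        event[dest_key] = fmt
    return event

event = {"_raw": "sample log line",
         "MetaData:Sourcetype": "sourcetype::mysourcetype"}

# [rename_sourcetype]: REGEX = . matches any event, rewrites the sourcetype key
apply_transform(event, r".", "MetaData:Sourcetype",
                "sourcetype::new_sourcetype")

# [routing_to_new_index]: keys off the *renamed* sourcetype, so order matters
apply_transform(event, r"new_sourcetype", "_MetaData:Index", "XYZ",
                source_key="MetaData:Sourcetype")

print(event["_MetaData:Index"])  # XYZ
```

If the two transforms were listed in the opposite order, the routing regex would run against the old sourcetype name and never match.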

inventsekar
SplunkTrust
SplunkTrust

What I understand from your question is that you want to send a single log file to two indexes.

from @woodcock 's answer on this post -
https://answers.splunk.com/answers/567223/how-to-send-same-data-source-to-two-or-multiple-in-1.html

[monitor://D:\test\test1.log]
sourcetype = test
index = index1

[monitor://D:\linktotest\test1.log]
sourcetype = test
index = index2

Then create a symbolic link from linktotest to test.
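On a POSIX host the symlink step can be sketched in Python (the temp paths here are hypothetical, not the D:\ paths from the quoted answer; on Windows the equivalent is mklink /D, which typically needs elevated privileges):

```python
import os
import tempfile

# Hypothetical stand-ins for D:\test and D:\linktotest from the quoted answer
base = tempfile.mkdtemp()
real = os.path.join(base, "test")
link = os.path.join(base, "linktotest")

os.mkdir(real)
with open(os.path.join(real, "test1.log"), "w") as f:
    f.write("sample event\n")

# One file on disk, two monitorable paths for the two [monitor://] stanzas
os.symlink(real, link)

print(open(os.path.join(link, "test1.log")).read().strip())  # sample event
```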

thanks and best regards,
Sekar

PS - If this or any post helped you in any way, please consider upvoting. Thanks for reading!

m_efremov
Explorer

We can't do that on most of our application servers. Some of them are not under our control, some of them run MS Windows, etc. Thank you for the answer, but it is not a general solution.
