Getting Data In

How to split data into multiple indexes on the same indexer (index1, index2) from one input source on one heavy forwarder

vksplunk1
Explorer

Hi,

Could you please let me know how to split data into multiple indexes on the same indexer (index1, index2) from one input source on one heavy forwarder?

 

I tried the following configuration, but all the data goes to the one index defined in inputs.conf. If I remove the index from inputs.conf, all the events go to the main index.

 

Thank you in advance.

 

Here is my configuration and data:

 

INPUTS.CONF
======
[monitor:///opt/splunk/var/log/tesData]
disabled = false
host = heaveforwarder1

 

PROPS.CONF
===========
[source::///opt/splunk/var/log/tesData]
TRANSFORMS-routing=vendorData,secureData

 

TRANSFORMS.conf
==========
[vendorData]
REGEX=5617605039838520
DEST_KEY=_MetaData:Index
FORMAT=index1

[secureData]
REGEX=6794850084423218
DEST_KEY=_MetaData:Index
FORMAT=index2


Test data:

[08/June/2022:18:23:07] VendorID=5038 Code=C AcctID=5617605039838520
[08/June/2022:18:23:22] VendorID=9109 Code=A AcctID=6794850084423218
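The routing regexes can be sanity-checked outside Splunk before deploying. Here is a minimal Python sketch using the patterns and sample events from the configs above; `route` and `RULES` are hypothetical names for illustration, not a Splunk API, and this only mimics the regex matching, not Splunk's exact transform ordering:

```python
import re

# Routing rules as (compiled regex, destination index), mirroring the
# REGEX/FORMAT pairs from the transforms.conf above.
RULES = [
    (re.compile(r"5617605039838520"), "index1"),
    (re.compile(r"6794850084423218"), "index2"),
]

def route(event: str, default: str = "main") -> str:
    """Return the index this event would be routed to (hypothetical
    helper - it only checks which rule's regex matches the raw event)."""
    for pattern, index in RULES:
        if pattern.search(event):
            return index
    return default

EVENTS = [
    "[08/June/2022:18:23:07] VendorID=5038 Code=C AcctID=5617605039838520",
    "[08/June/2022:18:23:22] VendorID=9109 Code=A AcctID=6794850084423218",
]

for e in EVENTS:
    print(route(e))  # prints "index1" then "index2"
```

If a rule never fires here, it will not fire in Splunk either, which narrows the problem down to the configuration rather than the regex.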

 

 

 


PickleRick
SplunkTrust

Generally, there's no way to "split" or "duplicate" an event while indexing. After the initial input, the event is processed as a whole. You can modify it - trim it, add to it, rewrite parts of it, modify its metadata - but it's still a single event. The only thing that can be done multiple times with a single event is routing.

So theoretically you could send an event both to the original indexer(s), into one index, and to the tcp (so-called "syslog") output, where it would be re-ingested and processed from the start. But it's a very ugly solution.
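As a rough illustration of that (admittedly ugly) approach: an event can be routed to a syslog output group on the heavy forwarder in addition to the normal tcpout group, with a tcp input on the receiving end re-ingesting the copy. This is only a sketch under those assumptions - the group names, host, and port below are made up:

```
# outputs.conf on the heavy forwarder (sketch - names and ports are made up)
[tcpout:primary_indexers]
server = indexer1:9997

[syslog:reingest_loop]
# points at a tcp input that re-ingests the copy from scratch
server = 127.0.0.1:5140

# transforms.conf - route every matching event to the syslog group as well
[also_to_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = reingest_loop
```

The copy arrives as a brand-new event, so any index routing applied on re-ingestion starts with a clean slate.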


vksplunk1
Explorer

Thank you for the response. To make it clear, I am NOT trying to split the event. I am trying to send the events from the same source to the same indexer, but to a different index (index1 or index2, based on the regex).


jamie00171
Communicator

It might be worth taking a look at this presentation from .conf 2020: PLA1154C - Advanced pipeline configurations with INGEST_EVAL and CLONE_SOURCETYPE. The last example looks very similar to what you are trying to do. You can watch it here: https://conf.splunk.com/watch/conf-online.html

All of the configuration for the examples is on GitHub: https://github.com/silkyrich/ingest_eval_examples

 

vksplunk1
Explorer

Thank you for the response. 

 

Regardless of the REGEX in transforms.conf on the HF, the data is always sent to the index specified in inputs.conf.

 

It seems like Splunk is not allowing me to override the index specified in inputs.conf.

 

TRANSFORMS.conf
======
#[cdmops]
#REGEX=.*service=cdmops2.*
#DEST_KEY=_MetaData:Index
#FORMAT=cdmops

#[cdmops2]
#REGEX=.*service=cdmops2.*
#DEST_KEY=_MetaData:Index
#FORMAT=cdmops2

Data:
=====
2022/06/01 10:45:50 service=cdmops server=node3 score=50 seq=55041
2022/06/01 10:45:50 service=cdmops2 server=node1 score=17 seq=55042

 

 

PickleRick
SplunkTrust

No. Splunk does allow you to manipulate metadata during event ingestion.

The question is - where are you trying to do that?

1) Do you indeed have Heavy Forwarder or a Universal Forwarder on the machine you're reading the logfiles on?

2) Do you send the data directly to indexer(s)? Or do you have any intermediate forwarder(s) in the path? If so, what kind of forwarders are those?

3) Where are you putting those props/transforms? (on which component?)


PickleRick
SplunkTrust

OK. You're right. I forgot about this wonderfully misnamed feature 🙂

You can indeed split the processing pipeline and reinject the same event back for another run.
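The feature being referred to is CLONE_SOURCETYPE, covered in the .conf talk linked earlier in the thread. A minimal sketch of the pattern, with made-up sourcetype and transform names for illustration:

```
# props.conf on the HF (sketch - sourcetype names are made up)
[my_sourcetype]
TRANSFORMS-clone = clone_events

# transforms.conf
[clone_events]
REGEX = .
CLONE_SOURCETYPE = my_sourcetype_copy

# props.conf - the clone re-enters the pipeline under the new sourcetype,
# so it can get its own transforms (e.g. routing to a different index)
[my_sourcetype_copy]
TRANSFORMS-route_copy = secureData
```

The original event continues through the pipeline untouched, while the clone is processed again from the start under the new sourcetype.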


vksplunk1
Explorer

Here is my corrected inputs.conf. The index was missing in the earlier inputs.conf:

 

INPUTS.CONF
======
[monitor:///opt/splunk/var/log/testData]
disabled = false
host = Haveyforwarder1
index = index1


jamie00171
Communicator

Hi @vksplunk1 

From the docs:

The REGEX must have at least one capturing group, even if the FORMAT does not reference any capturing groups.

so it might be worth trying to add a capturing group to the regex?

Thanks, 

Jamie
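For reference, the original transforms with capturing groups added would look like this (an untested sketch based on the configs posted above). Note also that a props.conf source stanza matches the value of the source field directly, so the extra slash after `source::` in the original props.conf may prevent the transforms from being applied at all:

```
# props.conf
[source::/opt/splunk/var/log/tesData]
TRANSFORMS-routing = vendorData, secureData

# transforms.conf - parentheses add the capturing group the docs require
[vendorData]
REGEX = (5617605039838520)
DEST_KEY = _MetaData:Index
FORMAT = index1

[secureData]
REGEX = (6794850084423218)
DEST_KEY = _MetaData:Index
FORMAT = index2
```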
