
Splunk HF not forwarding all data through a custom app

antoniomarongiu
Loves-to-Learn Lots

Hello everyone,

I have an app on one of our Heavy Forwarders that is supposed to route traffic:

  • All events go to our indexer cluster (my_peers_nodes)

  • If the index is customers_index, events should also be forwarded to two additional Heavy Forwarders (customers_to_tel).

Here is the configuration:
outputs.conf
[tcpout:customers_to_tel]
disabled = false
server = 10.x.x.177:9997,10.x.x.178:9997
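
(For completeness: my_peers_nodes, the tcpout group for our indexer cluster, is defined in the same outputs.conf; placeholder stanza below, with made-up addresses:)

[tcpout:my_peers_nodes]
disabled = false
server = 10.x.x.10:9997,10.x.x.11:9997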

props.conf
[default]
TRANSFORMS-routing = allRouting

transforms.conf

[allRouting]
SOURCE_KEY = _MetaData:Index
REGEX = customers_index
DEST_KEY = _TCP_ROUTING
FORMAT = my_peers_nodes,customers_to_tel

The problem is:

  • Some sourcetypes with index=customers_index are correctly forwarded to the additional Heavy Forwarders.
  • But other sourcetypes with the same index=customers_index remain only in our Splunk environment and are not forwarded to the additional Heavy Forwarders.

So the routing works only partially, depending on the sourcetype.

My questions are:

  1. Why would events with index=customers_index not always match the transforms.conf rule?

  2. Is it possible that _MetaData:Index is not always available on the Heavy Forwarder if events are already cooked?

  3. What is the best practice to ensure all events with index=customers_index are also forwarded to the extra Heavy Forwarders?

Thanks in advance for your help!

Labels (1)
0 Karma

livehybrid
SplunkTrust

Hi @antoniomarongiu 

  1. Why would events with index=customers_index not always match the transforms.conf rule?
    Answer - I suspect that some of the data arriving at your HF has already been parsed upstream, and thus is not parsed again when it reaches this HF. You might be able to achieve the routing with a RULESET instead.

  2. Is it possible that _MetaData:Index is not always available on the Heavy Forwarder if events are already cooked?
    Answer - Exactly this: if the events are already cooked/parsed, they won't go through the parsing process again here, so they won't be routed as you are expecting.

  3. What is the best practice to ensure all events with index=customers_index are also forwarded to the extra Heavy Forwarders?
    Answer - You either need to receive the data only from local inputs on the HF or from UFs sending into the HF, or look into using a RULESET (see the example below).

Ruleset example:

# props.conf
[default] 
RULESET-routeData = routeCustomerData

# transforms.conf
[routeCustomerData]
INGEST_EVAL = _TCP_ROUTING=if(index=="customers_index", "my_peers_nodes,customers_to_tel", _TCP_ROUTING)
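
To confirm the HF is actually picking these settings up, you can check the effective configuration with btool on the HF (the --debug flag shows which app each setting comes from); note that index-time changes to props/transforms need a restart of the HF to take effect:

$SPLUNK_HOME/bin/splunk btool props list default --debug
$SPLUNK_HOME/bin/splunk btool transforms list routeCustomerData --debug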


antoniomarongiu
Loves-to-Learn Lots

Hi @livehybrid,

You're right: the data was previously parsed by the UF through a custom application using the following:

props.conf
[cyber_audit] <<-- sourcetype
INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1
FIELD_DELIMITER = \t

inputs.conf
[monitor:///home/cyber/log_applicativi/log_applicativi.*]
sourcetype = cyber_audit
disabled = false
index = customers_index

 

After removing the props.conf contents, the logs are now forwarded to both my_peers_nodes and customers_to_tel, but without field extraction, as expected.
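
To keep both the tab-delimited field extraction and the routing, one option (an untested sketch, following @livehybrid's RULESET suggestion, since RULESET settings also apply to already-parsed events) is to restore the extraction on the UF and move the routing to a RULESET on the HF:

# UF props.conf (restores the structured parsing)
[cyber_audit]
INDEXED_EXTRACTIONS = CSV
HEADER_FIELD_LINE_NUMBER = 1
FIELD_DELIMITER = \t

# HF props.conf
[default]
RULESET-routeData = routeCustomerData

# HF transforms.conf
[routeCustomerData]
INGEST_EVAL = _TCP_ROUTING=if(index=="customers_index", "my_peers_nodes,customers_to_tel", _TCP_ROUTING)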