Getting Data In

Why is there data Ingestion Issue through Heavy Forwarder?

sahilvats
Engager

Hi ,

I am looking for troubleshooting steps for Data Ingestion Issue through Heavy Forwarder


gcusello
SplunkTrust

Hi @sahilvats,

Could you share some additional information?

E.g.: are you ingesting syslog data on the HF?

Is the data ingested by UFs and passing through the HF?

Are you speaking of local data, DB Connect data, or cloud data?

Ciao.

Giuseppe


sahilvats
Engager

Hi @gcusello 

 

Thank you so much for the response.

 

We are ingesting data from remote machines, basically collecting logs from remote machines and forwarding them for further processing.

Here I am looking for a basic troubleshooting guide for data ingestion issues through the HF.

 


gcusello
SplunkTrust

Hi @sahilvats,

good for you, see you next time!

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉

gcusello
SplunkTrust

Hi @sahilvats,

if you're using an intermediate HF between the UFs and the Indexers, you don't need any inputs on the HF, but you do have to add all the props.conf and transforms.conf stanzas needed to parse your data, because parsing in Splunk is executed on the first full Splunk instance the data passes through.

In other words: if you have to collect data from Windows UFs, you have to install the Splunk_TA_Windows on the UFs, on the Search Heads, and on the HF.
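As a minimal sketch of what that parsing configuration on the intermediate HF could look like (the sourcetype name "my:app:logs" and the DEBUG-dropping rule are hypothetical examples, not from this thread):

```
# props.conf on the intermediate HF -- hypothetical sourcetype "my:app:logs"
[my:app:logs]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TRANSFORMS-drop = drop_debug

# transforms.conf -- example index-time rule: route DEBUG events to the nullQueue
[drop_debug]
REGEX = \sDEBUG\s
DEST_KEY = queue
FORMAT = nullQueue
```

Because the HF is the first full Splunk instance in the path, index-time settings like these take effect there; placing them only on the Indexers would have no effect on already-parsed data.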

Ciao.

Giuseppe
