Splunk Enterprise

Logs not ingested into Indexer

RAVISHANKAR
Explorer

Logs are not getting ingested into my Indexers.

I suspect it could be because my logs are getting rotated, i.e. zipped due to their huge size.

How do I get this fixed? Does the input configuration require any tweak?

Thanks


RAVISHANKAR
Explorer

It's an on-prem Splunk Enterprise. The problem is that only one particular log file is not getting ingested into the indexer from the universal forwarder; from the same server I am able to ingest the other logs into my indexer successfully.

That particular log file gets rotated frequently due to its huge size. Is that blocking the log from being ingested into the indexer?

I can't understand why that particular log file is not being ingested.

[monitor:///xyz/xyz.log]
blacklist = (api|maxmind\.gz$)
sourcetype = xyz


PickleRick
SplunkTrust
SplunkTrust

The size on its own should not be a problem. The most typical problem with (not) ingesting files is when they share a common header, so the CRC Splunk calculates from it matches a file that has already been ingested.
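If that is what is happening, a common workaround (a sketch only; the stanza below reuses the hypothetical path and sourcetype from your post, so adjust them to your environment) is to salt the CRC with the file's path, or to widen the checksummed region:

```ini
[monitor:///xyz/xyz.log]
sourcetype = xyz
# Mix the full source path into the CRC so files with identical
# headers are not treated as already-seen
crcSalt = <SOURCE>
# Or extend the initial CRC window beyond the default 256 bytes
# so the checksummed region reaches past the common header
initCrcLength = 1024
```

Be careful with crcSalt = <SOURCE> on rotated files: when a file reappears under a new name (e.g. xyz.log.1), it counts as new and may get indexed again.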


PickleRick
SplunkTrust
SplunkTrust

ENOTENOUGHWORDS

But seriously - you're just telling us "something is wrong with my setup; how do I fix it?". How are we supposed to know if we don't know your environment, your configuration, your data...

Describe your environment in more detail, then we might be able to help you.

Are you using an on-premises Splunk Enterprise installation or a Splunk Cloud instance? Are you using a UF for ingesting files, or are you trying to ingest the file locally from one of your Splunk servers? Do you have proper data routing within your Splunk setup? Do you have any errors in splunkd.log? What does your input configuration look like?
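As a concrete starting point for those checks on the forwarder side (a sketch assuming a Linux UF installed under /opt/splunkforwarder; adjust the path to your installation), you could run:

```shell
# Show the effective monitor stanzas after all config layers are merged
/opt/splunkforwarder/bin/splunk btool inputs list monitor --debug

# See what splunkd says about the file it is (not) tailing
grep -iE 'tailreader|watchedfile' /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -20
```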


inventsekar
SplunkTrust
SplunkTrust

Hi @RAVISHANKAR 

Could you pls provide more details:
1) Splunk on-prem or Splunk Cloud?
2) From where are you reading the file - a UF or HF? Windows or Linux?
3) Was the file getting indexed properly before, or is this the first time you have faced this issue?
4) Have there been any changes, upgrades, or anything else recently?
5) Pls provide the inputs.conf for that particular file.

 

Thanks and best regards, Sekar

richgalloway
SplunkTrust
SplunkTrust

We need more information.

Are you not getting any logs, or just some of them?  If only some, are you getting the internal logs from the forwarder that is supposed to be sending them?

Please share the inputs.conf stanza for the missing logs.
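One way to check for those internal forwarder logs (a sketch; my_forwarder_host is a placeholder for your UF's host name) is a search on the indexer like:

```
index=_internal host=my_forwarder_host source=*splunkd.log* (log_level=ERROR OR log_level=WARN)
```

If even _internal events are missing for that host, the forwarder-to-indexer connection itself is the problem rather than the individual input.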
