Other Usage

Some logs are going to the _internal index instead of wineventlog

OsmanElyas
Explorer

I have configured 5 domain controllers to send logs to Splunk by installing the universal forwarder (UF) on each.

DC2 and DC5 are reporting to wineventlog as configured, but I am missing the other 3 DCs.

All 5 are logging to _internal. What should I do to correct the logging?
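A quick way to see which DCs are delivering Windows event log data versus only forwarder-internal data is to compare hosts across the two indexes. This is a minimal diagnostic sketch; the index name wineventlog and the host pattern DC* are assumptions based on this thread.

    index=wineventlog host=DC* earliest=-24h
    | stats count latest(_time) as last_seen by host

    index=_internal host=DC* sourcetype=splunkd earliest=-24h
    | stats count latest(_time) as last_seen by host

Any DC that appears in the second search but not the first has a working forwarder whose Windows event log input is not delivering data.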

1 Solution

OsmanElyas
Explorer

I have stopped and restarted the Splunk forwarder services on the DCs and that fixed the issue.
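For reference, restarting the universal forwarder on a Windows host can be done from an elevated prompt; the service name and install path below assume a default installation.

    rem Restart via the Windows service (default service name)
    net stop SplunkForwarder
    net start SplunkForwarder

    rem Or restart via the Splunk CLI (default install path)
    "C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" restart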


OsmanElyas
Explorer

I have checked the internal events: logging for all 5 DCs is in _internal.

I have Windows event log data for DC2 and DC5 in wineventlog.

I will check the UFs again to see where the problem is.

Much appreciated, thank you for your help @PickleRick

 


PickleRick
SplunkTrust

What do you mean by "logging to _internal"? Normally in the _internal index you'll find... well, internal events coming from the forwarder itself - metrics, forwarder errors and such. And you should have them from all 5 forwarders.

But if you see your Windows event log contents in _internal - that's a big misconfiguration. Either someone put index=_internal into the inputs.conf stanzas defining your Windows event log inputs (but why would someone do that???) or you have some strange redirecting mechanics defined in your environment by which the events end up in that index. But that's highly unlikely.
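For reference, a Windows event log input on the UF normally names its destination index explicitly. This is a minimal sketch of what a correct inputs.conf stanza might look like; the index name wineventlog is an assumption taken from this thread, not a required value.

    [WinEventLog://Security]
    disabled = 0
    index = wineventlog

    [WinEventLog://System]
    disabled = 0
    index = wineventlog

If index here were set to _internal, or pointed at an index that does not exist on the indexers, you would see the kind of symptoms described above.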

There is also a third option - your _internal index is set as lastChanceIndex (which is a wrong setting - it should point to a normal, non-_internal index) and your inputs are misconfigured and trying to write to a non-existent index.
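lastChanceIndex is set in indexes.conf on the indexers and catches events whose target index does not exist. A minimal sketch, assuming a dedicated catch-all index (the name last_chance and its paths are illustrative, not from this thread):

    # indexes.conf on the indexer(s)
    lastChanceIndex = last_chance

    [last_chance]
    homePath   = $SPLUNK_DB/last_chance/db
    coldPath   = $SPLUNK_DB/last_chance/colddb
    thawedPath = $SPLUNK_DB/last_chance/thaweddb

Pointing lastChanceIndex at a dedicated index like this makes misrouted events easy to spot without polluting _internal.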
