Getting Data In

Monitor files performance

douglasmsouza
Explorer

Hello,

I need to monitor some Oracle Database agent logs with Splunk Universal Forwarder. The base directory for finding the logs is $ORACLE_HOME.

We're using this configuration to monitor these logs in a Splunk Enterprise environment:
[monitor://$ORACLE_HOME/log/*/agent/ohasd/oraagent_(grid|oracle)/oraagent_(grid|oracle).log]
...

I know we could configure the ORACLE_HOME environment variable in splunk-launch.conf on each UF instance.
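For reference, that splunk-launch.conf setting would look something like this (the Oracle home path below is a hypothetical example, not a value from our hosts):

```
# $SPLUNK_HOME/etc/splunk-launch.conf on each forwarder
ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1
```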
However, we have already installed all the Universal Forwarders, and we don't know the $ORACLE_HOME value on each UF host.
We have about 300 hosts, so to save time we used this configuration instead:
[monitor:///.../log/*/agent/ohasd/oraagent_(grid|oracle)/oraagent_(grid|oracle).log]

When I execute splunk list monitor, it lists all directories under the / partition, even though there is only one log file per host.

My questions are:

1 - Will Splunk really look into all directories under /?
2 - If yes, will I have performance problems because of the huge number of directories?

Thanks.


somesoni2
Revered Legend

Yes and yes. It's not recommended to use a wildcard at the root level, because it forces the UF to recursively walk through all of those files and directories. You will see a performance impact (high CPU) because of that. Will $ORACLE_HOME be different on all of those UFs? You can either have the server owners create a symlink for you to monitor (the same symlink path on every host, pointing to the appropriate Oracle installation directory), or create a monitoring stanza that accounts for the variations in $ORACLE_HOME values.
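The symlink approach could be sketched as a small script run once per host. This is only an illustration, not Splunk tooling: the symlink path is a made-up example, and it assumes /etc/oratab is present (it usually is on Oracle hosts) to discover the Oracle home.

```shell
#!/bin/sh
# Hedged sketch: derive the Oracle home from /etc/oratab and expose it
# at a fixed symlink path, so one inputs.conf stanza works on every host.
# ORATAB and LINK defaults are assumptions; override them as needed.
ORATAB="${ORATAB:-/etc/oratab}"
LINK="${LINK:-/opt/splunk_oracle_home}"

if [ -r "$ORATAB" ]; then
    # oratab lines look like SID:ORACLE_HOME:Y|N; take the first
    # non-comment entry's second colon-separated field.
    ORACLE_HOME=$(awk -F: '!/^#/ && NF >= 2 { print $2; exit }' "$ORATAB")
    if [ -n "$ORACLE_HOME" ]; then
        ln -sfn "$ORACLE_HOME" "$LINK"
        echo "linked $LINK -> $ORACLE_HOME"
    fi
fi
```

With the symlink in place, a single stanza such as [monitor:///opt/splunk_oracle_home/log/*/agent/ohasd/oraagent_(grid|oracle)/oraagent_(grid|oracle).log] would work on every forwarder; the monitor input follows symlinks by default (followSymlink = true in inputs.conf).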
