Splunk Enterprise Security

In Splunk Enterprise Security, how do you monitor a shared directory on multiple servers?

btawiah
Explorer

I have three servers:

server 1
server 2
server 3

The monitoring location is a share: \\server[1-3]\logs\server.log

server[1-3] can each reach all of the logs, since it is a share.

For example, server 1 is able to read the log files from server 2 and server 3.

My current inputs.conf will definitely index duplicate data, since all three servers will be reading the same files from the network storage at the same time, which may easily break something:

[monitor://\\server1\logs\server.log]
index=main
sourcetype=serverlog

[monitor://\\server2\logs\server.log]
index=main
sourcetype=serverlog

[monitor://\\server3\logs\server.log]
index=main
sourcetype=serverlog

Question:

How do I monitor these logs so that server 1 only monitors server 1's logs on the shared drive, server 2 only monitors server 2's logs, and the same for server 3?

Thank you for your help.


lakshman239
SplunkTrust

Agree with @nickhills. However, if all your logs are only in a shared location accessible to all servers (e.g. WebLogic clusters writing logs for each service), configure the Splunk forwarder running on server 1 to read only certain log files (e.g. ms_server1.log, ms_server2.log, etc.), the Splunk UF running on server 2 to read ms_server3.log, ms_server4.log, etc., and so on, so that each UF reads only one set of files.
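For the paths in this thread, that split might look something like the sketch below (reusing the index and sourcetype from the question; each forwarder gets only the stanza for its own share, so nothing is read twice):

# inputs.conf on server 1's universal forwarder - reads only server 1's file
[monitor://\\server1\logs\server.log]
index=main
sourcetype=serverlog

# inputs.conf on server 2's universal forwarder - reads only server 2's file
[monitor://\\server2\logs\server.log]
index=main
sourcetype=serverlog

# ...and likewise on server 3 for \\server3\logs\server.log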


nickhills
Ultra Champion

You should install a universal forwarder on each server, and have it monitor only its local files.

This is far more robust than monitoring over UNC paths, and is the recommended approach for collecting and indexing data.
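For example, on each server the forwarder's inputs.conf would point at the local copy of the file instead of the UNC path (a minimal sketch; the local path D:\logs\server.log is an assumption, so adjust it to wherever the file actually sits on disk):

# inputs.conf on each server's universal forwarder - monitor the local file
[monitor://D:\logs\server.log]
index=main
sourcetype=serverlog

Because each forwarder reports events under its own hostname by default, the host field comes out correct on every server without any extra configuration.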

If you can't do this for a specific reason (such as the file server actually being a NAS/appliance, etc.), you should configure a single forwarder to perform the collection.
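If you do end up with a single collecting forwarder, one way to keep events attributed to the right machine is an explicit host override per stanza (a sketch using the UNC paths and values from the question; host is the standard per-input host setting in inputs.conf):

# inputs.conf on the one collecting forwarder
[monitor://\\server1\logs\server.log]
index=main
sourcetype=serverlog
host=server1

[monitor://\\server2\logs\server.log]
index=main
sourcetype=serverlog
host=server2

[monitor://\\server3\logs\server.log]
index=main
sourcetype=serverlog
host=server3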

If my comment helps, please give it a thumbs up!

btawiah
Explorer

Thanks for your suggestion. We will consider that. In the past we have done it using one server only, but the concern is that we also want to capture the host names of the servers in Splunk, and in that case it shows only one host name.

Maybe local file monitoring will be best for us at this time.


nickhills
Ultra Champion

Yes, this is certainly the recommended approach for several reasons. One of them (as you have noted) is that you get the correct hostname, but there are performance, durability, and management benefits too.

If my comment helps, please give it a thumbs up!