Splunk Search

How to aggregate log files on a centralized log server?

indeed_2000
Motivator

Hi

I have 3 servers that each generate a daily log file of about 12 GB (12 × 3 = 36 GB per day).

How can I gather these files on a centralized log server?

 

FYI1: I can't use the Splunk forwarder in this scenario.

FYI2: rsyslog, filebeat, syslog-ng, etc. are available solutions, but I can't decide which one is most suitable for this case.

FYI3: the raw data is important and must not be lost.

FYI4: like the forwarder, whenever a server or the network goes down, it should continue sending data once the issue is resolved. (AFAIK rsyslog keeps a tracker when the service stops and tries to send the remaining file after the service starts again.)
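The resume-after-outage behavior described in FYI4 comes down to persisting a read offset per file. As a rough illustration of that tracker mechanism (not a replacement for any of the tools above; the state-file name is made up), a minimal sketch might look like this:

```python
import json
import os

STATE_FILE = "tracker.json"  # hypothetical state file, analogous to rsyslog's imfile state

def load_offsets():
    """Load the last-read byte offset for each log file, if any."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {}

def save_offsets(offsets):
    """Persist the offsets so a restart resumes where we stopped."""
    with open(STATE_FILE, "w") as f:
        json.dump(offsets, f)

def read_new_lines(path, offsets):
    """Read only the bytes appended since the last run, then advance the offset."""
    start = offsets.get(path, 0)
    with open(path, "rb") as f:
        f.seek(start)
        data = f.read()
    offsets[path] = start + len(data)
    save_offsets(offsets)
    return data.decode(errors="replace").splitlines()
```

On restart, `load_offsets()` restores the position, so nothing already shipped is re-sent and nothing appended during the outage is skipped (as long as files are not rotated away in the meantime).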

Any idea?

Thanks,


gcusello
SplunkTrust

Hi @indeed_2000,

why can't you use a Forwarder? It would be the best solution for sending as much data as you need.

Anyway, syslog isn't a good solution, because you would have to implement a large architecture (two Heavy Forwarders and a Load Balancer) to be sure not to lose data, and even then it isn't very efficient.

Could you create a network share? In that case you could share the data folders from the three servers and use another server to read the data and send it to the Indexers.

As I said, try to use Universal Forwarders on the target servers; in that case you also have to do some configuration on the UFs to avoid delays in data transmission.

Ciao.

Giuseppe


indeed_2000
Motivator

@gcusello hard to explain, but it's a production environment where I don't have the privileges to install the Splunk forwarder.

Now I'm looking for something like the Splunk forwarder (easy setup, minimal dependencies, and in case of a disaster it tries to send every file in the path).

 

1 - rsyslog: removed from my list, as you mentioned.

2 - mounting a network share: not acceptable to the production admins.

3 - fluentd: has different dependencies (and AFAIK it sends only the tail of a file).

https://docs.fluentbit.io/manual/administration/configuring-fluent-bit

4 - How about logstash, filebeat, or any other log collector solution?
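Of the options listed, filebeat comes closest to the forwarder-like behavior described earlier: it keeps a registry of per-file read offsets, so after a crash or network outage it resumes from where it stopped instead of re-sending or skipping data. A minimal, untested sketch of a configuration (the output host is a placeholder) might look like:

```yaml
filebeat.inputs:
  - type: log            # 'filestream' in newer Filebeat versions
    paths:
      - /opt/log/*       # pick up any file created under this path

output.logstash:
  hosts: ["central-log.example.com:5044"]   # placeholder central server
```

The registry directory is what makes delivery resumable; it plays the same role as the tracker file mentioned for rsyslog.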

 


gcusello
SplunkTrust

Hi @indeed_2000,

could you schedule a copy of your log files to another system where you can install a Forwarder, or to the Splunk server itself?

With such large files it isn't very efficient, but it's probably the only solution.

Anyway, if log monitoring is relevant to your customer, my hint is to push for the adoption of Forwarders.

Ciao.

Giuseppe


indeed_2000
Motivator

@gcusello actually the first two solutions that came to my mind were 1 - mounting a network disk, 2 - crontab.

But unfortunately neither was approved by the product owner (they have some issues with these solutions).

As you mentioned, copying files is not a good idea for large log files, especially since they need real-time collection.

I'm curious whether any solution like the Splunk forwarder exists for this scenario.

 

It should work like this:

server1 >
server2 > centralize log
server3 >

FYI1: rsyslog, filebeat, syslog-ng, fluentd, etc. are available solutions, but I can't decide which one is most suitable for this case.

FYI2: the raw data is important and must not be lost. (I don't want to clean the data; the exact log file is important to me.)

FYI3: like the Splunk forwarder, whenever a server or the network goes down, it should continue sending data once the issue is resolved. (AFAIK rsyslog uses a tracker file when the service stops and tries to send the remaining file after the service starts again, but while the service is down a new file may be created with a different name and structure that it can't track. The Splunk forwarder can handle this situation: even when the forwarder service on a server goes down, whenever it starts it can discover any file on that path.)

FYI4: here is the path of my logs: /opt/log/*
Different files with different names may be created here; I need everything on this path sent dynamically to the centralized log server.
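For reference, recent rsyslog versions (8.25+) can watch a wildcard path with the imfile module and keep a per-file state file recording the read offset, which covers the "new files with new names" case in FYI4. A rough, untested sketch (the target host is a placeholder):

```
module(load="imfile")

# Watch every file under /opt/log/; per-file state files record the read offset
input(type="imfile"
      File="/opt/log/*"
      Tag="applog"
      addMetadata="on")

# Forward everything to the central server over TCP
action(type="omfwd" Target="central-log.example.com" Port="514" Protocol="tcp")
```

Whether this handles files created while rsyslog itself is stopped depends on the version; it scans the wildcard on startup, but that is worth verifying against your rotation scheme.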


gcusello
SplunkTrust

Hi @indeed_2000,

if you cannot use a Forwarder, and mount and scp aren't efficient, you could try syslog, but you would have to build an HA infrastructure to avoid losing data: two receivers (each with rsyslog and a Universal Forwarder) and a Load Balancer as the front end (if you don't have a Load Balancer, you can use DNS).

Given all the limits of this solution, my hint is to explain to the customer that an important infrastructure like theirs needs a proper log management solution.

Ciao.

Giuseppe


indeed_2000
Motivator

@gcusello regarding "(with rsyslog and Universal Forwarder)": as I mentioned, I can't use the forwarder.


gcusello
SplunkTrust

Hi @indeed_2000,

using syslog, you have to use a receiver other than the Splunk Indexer, and it would be better to have two receivers behind a Load Balancer or a DNS configuration, to avoid losing logs during maintenance or a fault.

But no Forwarders on the target systems: you would install two dedicated servers that receive the syslog stream via rsyslog and run a UF to read the resulting files.
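On the sending side, rsyslog's disk-assisted action queues are what protect against a receiver being down for maintenance: messages spool to disk and are retried until delivery succeeds. A rough, untested sketch (sizes and the target host are placeholder values):

```
action(type="omfwd"
       Target="central-log.example.com" Port="514" Protocol="tcp"
       queue.type="LinkedList"           # in-memory queue...
       queue.filename="fwd_spool"        # ...spilling to disk when needed
       queue.maxDiskSpace="1g"
       queue.saveOnShutdown="on"         # persist queued messages across restarts
       action.resumeRetryCount="-1")     # retry forever instead of dropping
```

Plain TCP still acknowledges only at the transport level; for stricter delivery guarantees, rsyslog's RELP modules (imrelp/omrelp) are the usual option.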

Ciao.

Giuseppe
