Getting Data In

How to monitor remote logs with a centralized heavy forwarder

davesplunk01
Path Finder

New to Splunk. We have a clustered environment with hundreds of servers involved. How can we monitor the logs from those servers without installing the universal forwarder? We don't want to install a plugin or anything else on all of those servers.
We have control of all the servers, so what would be the best way?

Any ideas or topics to look into would help.
Thanks.

0 Karma
1 Solution

vasanthmss
Motivator

To sum up the discussion so far:

  1. What kind of data are we dealing with?
  2. What is the reason for not installing the universal forwarder?
  3. You mentioned you have full access ("We have control of all the servers"), so why can't you install the universal forwarder?

I would suggest, in order of preference:

  1. Install the universal forwarder on your servers. If a forwarder is still off the table, consider the options below.
  2. Update your application to post its data to a Splunk forwarder.
  3. Create a script that posts the data.

If manual installation of the forwarder is the obstacle, use a continuous deployment tool; if you don't want a licensed product, write a script to do the work.

Steps to start exploring:

http://docs.splunk.com/Documentation/Forwarder/6.5.2/Forwarder/InstallaWindowsuniversalforwarderfrom...

Sample scripts:

https://answers.splunk.com/answers/34896/simple-installation-script-for-universal-forwarder.html
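As a rough sketch of the scripted route, the loop below pushes a universal forwarder package to each host over ssh and registers it with a deployment server. The package filename, host list, credentials, and deployment server address are all placeholders to adapt; by default the script only echoes the commands (set RUN= to actually execute them).

```shell
#!/bin/sh
# Sketch: bulk-install the Linux universal forwarder over ssh.
# Hosts, package name, and deployment server below are placeholders.
RUN="${RUN:-echo}"                       # dry-run by default; RUN= executes for real
HOSTS="${HOSTS:-web01 web02}"            # hypothetical host list
UF_PKG="splunkforwarder-6.5.2-Linux-x86_64.tgz"
DEPLOY_SERVER="deploy.example.com:8089"  # hypothetical deployment server

for host in $HOSTS; do
  # Copy the package, unpack it, start the forwarder, and point it at the
  # deployment server so future config gets pushed centrally.
  $RUN scp "$UF_PKG" "root@${host}:/tmp/"
  $RUN ssh "root@${host}" "tar -xzf /tmp/${UF_PKG} -C /opt \
    && /opt/splunkforwarder/bin/splunk start --accept-license --answer-yes --no-prompt \
    && /opt/splunkforwarder/bin/splunk set deploy-poll ${DEPLOY_SERVER} -auth admin:changeme"
done
```

Once the forwarders phone home, the deployment server can push inputs.conf and outputs.conf to all of them at once, so the per-host work stops at this one bootstrap step.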

Posting a clearer question would help others help you further.

V

View solution in original post

nickhills
Ultra Champion

In the spirit of answering the question asked...

You could install a dedicated forwarder and then write scripts on that forwarder to copy (ftp/smb/cifs/ssh) the log files off the hundreds of source servers onto the forwarder (perhaps into a directory per host). Your inputs file would then need an entry for each log type and host name so you can override the host field using props.conf/transforms.conf.

You will then need a separate process to remove the indexed log files from your forwarder.
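A minimal sketch of that copy step, assuming ssh access and rsync on the sources. Host names and paths are placeholders, and the script only echoes its commands by default (set RUN= to run them):

```shell
#!/bin/sh
# Sketch: pull logs from each source host into a per-host directory on
# the dedicated forwarder. Hosts and paths are placeholders.
RUN="${RUN:-echo}"             # dry-run by default
HOSTS="${HOSTS:-web01 web02}"  # hypothetical source hosts
SRC="/var/log/myapp/"          # hypothetical log directory on the sources
DEST="/data/remote_logs"       # staging directory the forwarder monitors

for host in $HOSTS; do
  $RUN mkdir -p "${DEST}/${host}"
  $RUN rsync -az "root@${host}:${SRC}" "${DEST}/${host}/"
done
```

With one directory per host, a single [monitor:///data/remote_logs] stanza with host_segment = 3 in inputs.conf can set the host field from the third path segment, which may be simpler than a props/transforms override.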

None of what I have suggested is sensible, and the previous answers/comments are all superior solutions, but if you really are playing with one hand tied behind your back, sometimes "sensible is off the table".

If my comment helps, please give it a thumbs up!


davesplunk01
Path Finder

Thanks. We are planning to add more servers and don't want to do the manual installs. I should have asked that directly. Thanks much!!

0 Karma

martin_mueller
SplunkTrust
SplunkTrust

To read a physical log file on your servers you need a program on your servers - that'd be the Universal Forwarder... but you said it's somehow impossible to use for you. Instead of writing log files to disk, your applications could send the logs directly to Splunk via the HEC. That way you wouldn't need to install anything on your servers as per your requirement.

0 Karma

martin_mueller
SplunkTrust
SplunkTrust

A good alternative can be to let your applications send the logs to Splunk via the HTTP Event Collector: http://docs.splunk.com/Documentation/Splunk/6.5.2/Data/UsetheHTTPEventCollector
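For illustration, a HEC event post looks like the request below. The URL and token are placeholders (the token is created under Settings > Data inputs > HTTP Event Collector), and the command is echoed rather than executed by default:

```shell
#!/bin/sh
# Sketch: send one event to the HTTP Event Collector.
# URL and token are placeholders; RUN=echo keeps this a dry run.
RUN="${RUN:-echo}"
HEC_URL="https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN="00000000-0000-0000-0000-000000000000"   # hypothetical token
PAYLOAD='{"event": "application started", "sourcetype": "myapp", "host": "web01"}'

# HEC authenticates with an "Authorization: Splunk <token>" header and
# accepts JSON events on /services/collector/event.
$RUN curl -k "$HEC_URL" -H "Authorization: Splunk $HEC_TOKEN" -d "$PAYLOAD"
```

The same request can be made from any language with an HTTP client, so the application itself can ship its logs without anything Splunk-specific installed on the host.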

0 Karma

davesplunk01
Path Finder

It's a physical log file. How will the event collector help?

0 Karma

martin_mueller
SplunkTrust
SplunkTrust

The best way would be the Universal Forwarder. What's your reason for not wanting to use UFs?

0 Karma

davesplunk01
Path Finder

Thanks for the reply. Installing the universal forwarder on every server is not possible in my case. We are looking for a remote or alternative approach.

0 Karma

mattymo
Splunk Employee
Splunk Employee

UF, HEC, syslog, batch file upload, REST call, rsync, bash script... the options are limitless.

What kind of servers/apps are you running? What kind of data are you collecting? That might help us work around your no-UF requirement by suggesting what is possible with the source machines/apps.

- MattyMo
0 Karma