Getting Data In

Put file in Splunk

callmeshawn
New Member

Hello Splunk Community,

I've got logs on RHEL and CentOS servers, and I'd like to upload everything under /var/log/ to Splunk.
Is there a way I can do something like cURL -T http:/// ... to transfer the files?

Please accept my apologies if this question has already been asked and answered. I've been reading through existing questions, and it seems folks have asked whether this can be done via REST, but I don't see that the question was ever answered.

Thank you,

Radesh


richgalloway
SplunkTrust

The easiest way is to install a Splunk Universal Forwarder (UF) on each RHEL/CentOS box. The UF can monitor files in /var/log and send them to Splunk as they are updated. See https://docs.splunk.com/Documentation/Forwarder/8.0.2/Forwarder/Abouttheuniversalforwarder
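For reference, here's a minimal sketch of what the UF config might look like. The index name and indexer host are placeholders, not anything from your environment:

    # inputs.conf on the UF -- monitor everything under /var/log
    # "main" is a placeholder index name
    [monitor:///var/log]
    index = main
    disabled = 0

    # outputs.conf on the UF -- where to send the data
    # idx.example.com is a hypothetical indexer host
    [tcpout]
    defaultGroup = primary_indexers

    [tcpout:primary_indexers]
    server = idx.example.com:9997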

If you have more than a few servers then you'll also want to configure a deployment server (DS) to manage your UFs. See https://docs.splunk.com/Documentation/Splunk/8.0.2/Updating/Deploymentserverarchitecture for how to do that.
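On each UF, pointing at the DS is just a few lines of deploymentclient.conf. Something like this, where ds.example.com is a hypothetical DS host:

    # deploymentclient.conf on each UF -- phone home to the deployment server
    [deployment-client]

    [target-broker:deploymentServer]
    targetUri = ds.example.com:8089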

---
If this reply helps you, Karma would be appreciated.

callmeshawn
New Member

Thank you for the suggestion.

I've heard of the forwarder.
We've got many machines (1,800 hosts wouldn't be an exaggeration), so we're trying to avoid installing the UF on every system and streaming the logs to Splunk continuously. Concerns have been raised about network bandwidth utilization, about the amount of data that would be sent to Splunk, and about whether that might push us past the amount of data we can ingest.

As such, we wanted to provide an "on-demand" way of doing it. The thinking was that we might use Ansible to connect to each host, tar up the data, and upload it to Splunk.
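Roughly what I had in mind, as a sketch (the inventory group, paths, and index are all hypothetical, and the last step assumes the tarballs get unpacked on a box running Splunk):

    # playbook sketch: tar up /var/log on each host and pull it back
    - hosts: rhel_servers
      tasks:
        - name: Archive /var/log
          archive:
            path: /var/log
            dest: /tmp/{{ inventory_hostname }}-logs.tar.gz

        - name: Fetch the tarball to the control node
          fetch:
            src: /tmp/{{ inventory_hostname }}-logs.tar.gz
            dest: staging/
            flat: yes

    # then, after unpacking on a Splunk server:
    # splunk add oneshot staging/host1/messages -index main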


richgalloway
SplunkTrust

If you have Ansible then you have an easy way to install a UF on each machine. The bandwidth used will likely be less than what uploading tarballs would use, and the data volume will be the same. Either way, you still have the same ingestion limit. A UF uses very few resources on the machine it runs on.
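As a rough sketch of what that could look like (the package filename, paths, and inventory group are placeholders, not anything specific to your site):

    # playbook sketch: push the UF RPM to each host and install it
    - hosts: rhel_servers
      become: yes
      tasks:
        - name: Copy the UF package (filename is a placeholder)
          copy:
            src: splunkforwarder-8.0.2-linux-2.6-x86_64.rpm
            dest: /tmp/splunkforwarder.rpm

        - name: Install the UF
          yum:
            name: /tmp/splunkforwarder.rpm
            state: present
            disable_gpg_check: yes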

Loading log files on demand introduces a significant delay between when an event occurs and when Splunk can detect it.

---
If this reply helps you, Karma would be appreciated.

callmeshawn
New Member

Agreed, distribution of the software is trivial, but constantly sending logs will generate much more traffic than the occasional request for some logs. If Splunk doesn't support a user "dropping" a file, then maybe it's not the right tool in this situation. Thank you for your feedback.


richgalloway
SplunkTrust

I didn't say Splunk couldn't do it. Splunk can do it. Install a single forwarder on a system that monitors an empty directory. Drop files in there when you wish and they will be indexed.
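A minimal sketch of that input, with placeholder paths and index. If you'd rather the files be deleted once they're indexed, a batch input with move_policy = sinkhole does that instead of monitor:

    # inputs.conf on the single forwarder -- watch a drop directory
    # /opt/splunk_dropbox and "main" are placeholders
    [monitor:///opt/splunk_dropbox]
    index = main
    disabled = 0

    # alternative: index each dropped file once, then delete it
    [batch:///opt/splunk_dropbox]
    move_policy = sinkhole
    index = main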

That isn't a very proactive way of monitoring your machines, however.

On Windows hosts, UFs can be configured to pull only selected event codes to help reduce the amount of traffic.
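For example, something like this in inputs.conf on a Windows UF (the event IDs shown are just illustrative):

    # index only selected Security event codes
    [WinEventLog://Security]
    disabled = 0
    whitelist = 4624,4625,4672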

---
If this reply helps you, Karma would be appreciated.