Getting Data In

Ingesting logs from SFTP Server via HF

tokio13
Path Finder

Hello,

Can someone guide me on how can I ingest logs from a SFTP server? I have available Heavy Forwarders that sit outside the SFTP location. 

 

Thanks!


PickleRick
SplunkTrust

Since you can't directly read files from another server with a monitor input, you need a way either to sync those files to the local server or to create a scripted input that logs in to the remote server and fetches the results.

I use the first approach: a cron-launched rsync (over ssh rather than sftp, but the idea is the same) synchronizes log files from the remote servers, and local monitor inputs then ingest them as if they were created on the localhost. If you have the possibility of using rsync, do so, because it synchronizes incrementally.
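As a rough sketch of that cron-launched sync (the host, key path, and directories below are placeholders, not a ready-made configuration):

```
# crontab entry: pull new/changed log files every 5 minutes over ssh
*/5 * * * * rsync -az -e "ssh -i /opt/splunk/.ssh/id_rsync" splunk@loghost.example.com:/var/log/app/ /opt/splunk/var/remote-logs/
```

The `-a` flag preserves timestamps and permissions, and `-z` compresses in transit; rsync only transfers the parts of files that changed since the last run.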

Yet another alternative is to export the files from the original host via NFS or CIFS, mount the exported directory, and read from there (keep in mind that connecting to CIFS servers from Linux clients can have its hiccups).
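For illustration, the mount could look something like this in /etc/fstab (hostnames and paths are placeholders; the CIFS credentials file is assumed to exist):

```
# NFS export, mounted read-only and on demand:
loghost.example.com:/export/logs  /mnt/remote-logs   nfs   ro,noauto,x-systemd.automount   0 0
# Or a CIFS share:
//loghost/logs                    /mnt/remote-logs   cifs  ro,credentials=/etc/cifs-creds  0 0
```

A plain monitor input pointed at /mnt/remote-logs can then read the files as if they were local.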


tokio13
Path Finder

With the first solution, the logs (received from the SFTP server) would end up stored on the same VM as the HF (so that a local monitor input can read them), which means the VM's storage would need to grow. Since I don't have the storage available, I guess I'm stuck.

Would you be so kind as to elaborate on creating a scripted input that logs in to the remote server and fetches the results?


PickleRick
SplunkTrust

It depends on how you manage the storage. The key here is file rotation. There are several mechanisms you could deploy depending on your use case, but generally you don't want to store much of your data locally: just read it and get rid of it, either with a batch input (which deletes files after indexing) or by employing logrotate to delete old files. But true, you do need some buffer space for at least the files currently being ingested.
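The batch-input variant could look like this in inputs.conf (the path and sourcetype are placeholders for your environment):

```
# inputs.conf on the HF: index each file, then delete it (sinkhole),
# so the local buffer directory stays small
[batch:///opt/splunk/var/remote-logs]
move_policy = sinkhole
sourcetype = app_logs
disabled = false
```

With `move_policy = sinkhole`, Splunk removes each file after it has been read, so the synced files never accumulate on the VM.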

About scripted inputs: https://docs.splunk.com/Documentation/Splunk/latest/AdvancedDev/ScriptSetup. There is no ready-made solution for your case, though; you'd have to write something on your own.
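To give you an idea, a minimal sketch of such a script might look like this. It assumes key-based authentication (a scripted input can't answer password prompts), and the host, user, and paths are placeholders taken from environment variables, not real values:

```shell
#!/bin/sh
# Sketch of a Splunk scripted input: pull files from an SFTP server into a
# small local buffer, print them to stdout (which Splunk ingests), then
# delete them so local storage stays bounded.
REMOTE_HOST="${SFTP_HOST:-}"                        # e.g. sftp.example.com
REMOTE_DIR="${SFTP_DIR:-/logs}"
LOCAL_DIR="${SFTP_BUFFER:-/opt/splunk/var/sftp_buffer}"

# Fetch new files using sftp in batch mode (-b - reads commands from stdin).
fetch_remote() {
    mkdir -p "$LOCAL_DIR"
    sftp -b - "splunk@$REMOTE_HOST" <<EOF
cd $REMOTE_DIR
lcd $LOCAL_DIR
mget *.log
EOF
}

# Print each buffered file to stdout and remove it afterwards.
emit_and_clean() {
    for f in "$1"/*.log; do
        [ -e "$f" ] || continue
        cat "$f"
        rm -f "$f"
    done
}

# Only contact the remote side when a host is actually configured.
if [ -n "$REMOTE_HOST" ]; then
    fetch_remote
    emit_and_clean "$LOCAL_DIR"
fi
```

Note this naive version re-fetches files that still exist remotely; in practice you'd track what you've already pulled (or rely on remote rotation) to avoid duplicates.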

But I remembered one more thing. If this is indeed SFTP and your forwarder is on Linux, you could try mounting the "share" using sshfs, probably with automount to make it more "transparent" and resilient to restarts of the source.

In fact, I have such a setup myself for pulling data from a remote DMZ-located site from which I allow no incoming traffic.

It will not be as efficient as NFS, but it could suffice.
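The sshfs mount with systemd automount could be sketched as an fstab entry like this (host, key, and paths are placeholders):

```
# /etc/fstab: sshfs via FUSE, mounted on first access and reconnecting
# automatically if the source restarts
splunk@sftp.example.com:/logs  /mnt/remote-logs  fuse.sshfs  noauto,x-systemd.automount,reconnect,ServerAliveInterval=15,IdentityFile=/opt/splunk/.ssh/id_sftp,ro  0 0
```

A regular monitor input on /mnt/remote-logs then reads the remote files without any local copy, at the cost of SSH round-trips on every read.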
