Security

Can a Splunk indexer or forwarder in a trusted network reach out to servers in a DMZ to grab data?

beaudet
Explorer

I have several public facing web servers on which I want to run Splunk in a light-forwarder configuration. However, rather than opening any firewall ports FROM my DMZ back into my trusted network, I would much prefer a configuration permitting a Splunk indexer on my trusted network to instead just reach into the DMZ to grab data from the light-forwarder Splunk running on the public facing server. Is this possible with Splunk and if so, must I be running a particular version?

Rather than a light-forwarder, I guess what I'm really looking for is a store-until-retrieved type of configuration, but I'm not sure whether that is even possible with Splunk. Thanks!

ftk
Motivator

Out of the box, an indexer cannot reach out to a forwarder to pull data -- the system is designed for forwarders to push data to the indexer.

I am running forwarders in my DMZs as well. Here is my setup; it might work for you too:

I determine the DMZ host that has the least amount of load and set it up as a regular forwarder. This forwarder is configured to forward via SSL to my indexer as per http://www.splunk.com/base/Documentation/latest/Admin/EncryptandauthenticatedatawithSSL . Furthermore, I configure the forwarder to check the receiver's certificate, and require the receiver (indexer) to check the client's (forwarder's) certificate, so I can be more confident that the data coming out of the DMZ is legitimate. An appropriate firewall ALLOW rule is put in place to let the Splunk data port through from that particular DMZ host to the indexer.
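
To make that concrete, the SSL forwarding piece might look roughly like the snippets below. The hostnames, port, certificate paths, and password are placeholders (not taken from the post above), and the exact attribute names can vary between Splunk versions, so treat this as a sketch rather than a drop-in config.

    # outputs.conf on the regular DMZ forwarder -- forward over SSL and verify the indexer's certificate
    [tcpout]
    defaultGroup = trusted_indexer_ssl

    [tcpout:trusted_indexer_ssl]
    server = indexer.internal.example.com:9997
    sslCertPath = $SPLUNK_HOME/etc/auth/forwarder.pem
    sslPassword = changeme
    sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
    sslVerifyServerCert = true
    sslCommonNameToCheck = indexer.internal.example.com

    # inputs.conf on the indexer -- accept SSL forwarder traffic and require a client certificate
    [splunktcp-ssl:9997]

    [SSL]
    serverCert = $SPLUNK_HOME/etc/auth/indexer.pem
    password = changeme
    rootCA = $SPLUNK_HOME/etc/auth/cacert.pem
    requireClientCert = true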

Once this DMZ forwarder is working correctly with the indexer, I configure the DMZ forwarder as a receiver as well. Now I deploy lightweight forwarders on all other DMZ hosts, and have them forward to the first (regular) DMZ forwarder. The first (regular) forwarder pools all the data from the lightweight forwarders in the DMZ, cooks (parses) it, and then forwards it over the SSL tunnel to the indexer.
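
As a rough sketch of that wiring (the dmz-fwd01 hostname and the port below are illustrative, not from the post above): the regular DMZ forwarder opens a plain splunktcp listener for the other DMZ hosts, and each lightweight forwarder points its output at it. Only the regular forwarder then talks to the indexer over the SSL output shown earlier, so only one host needs a firewall rule into the trusted network.

    # inputs.conf on the regular DMZ forwarder -- receive from the lightweight forwarders inside the DMZ
    [splunktcp:9997]

    # outputs.conf on each lightweight DMZ forwarder -- send everything to the regular DMZ forwarder
    [tcpout]
    defaultGroup = dmz_intermediate

    [tcpout:dmz_intermediate]
    server = dmz-fwd01.dmz.example.com:9997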

For what it's worth, PCI auditors are fine with this setup.

Lowell
Super Champion

We use a very similar setup. As far as deployment goes, we manage all of our apps on our central Splunk indexer, and then use a simple rsync process to push them from a local "deployment-apps" folder to the same folder on the DMZ forwarder instance. That instance acts as the Splunk deployment server for all DMZ Splunk clients. For a site-wide deploy, I run "splunk reload deploy-server" on the central server, then run a simple script that rsyncs the apps and uses ssh to call "splunk reload deploy-server" on the DMZ Splunk instance, and that's it.
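
As a rough illustration (paths, hostnames, and the DMZ forwarder name below are placeholders, not Lowell's actual script), the site-wide deploy could boil down to something like this, run from the central server in the trusted network:

    #!/bin/sh
    # Sketch of the central deploy flow described above.
    SPLUNK=/opt/splunk/bin/splunk
    DMZ_HOST=dmz-fwd01

    # 1. Reload the deployment server on the central instance so its own clients pick up app changes.
    $SPLUNK reload deploy-server

    # 2. Push the deployment apps to the DMZ forwarder, which acts as the
    #    deployment server for all Splunk clients in the DMZ.
    rsync -az --delete /opt/splunk/etc/deployment-apps/ \
        splunk@$DMZ_HOST:/opt/splunk/etc/deployment-apps/

    # 3. Reload the deployment server on the DMZ instance so the DMZ clients pick up the update.
    ssh splunk@$DMZ_HOST /opt/splunk/bin/splunk reload deploy-server

Note that both the rsync and the ssh connection are initiated from the trusted network into the DMZ, not the other way around.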

beaudet
Explorer

That's encouraging. Thanks.

ftk
Motivator

Really not all that hard to pull off. I would recommend setting up a lab environment with a few VMs or old servers to try it out.

beaudet
Explorer

This seems like a good compromise if the other approach doesn't work for me. It seems a little more complicated in terms of Splunk deployment than my Splunking skills allow for right now, but I especially appreciate that PCI auditors are fine with it.

gkanapathy
Splunk Employee

Not out of the box, but a Splunk forwarder can run arbitrary code, which means it can call APIs to read data, or call scripts to fetch and download files. I'm not sure what you mean specifically by "reach and grab data" from the DMZ.
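
For example, a scripted input on a Splunk instance in the trusted network could reach into the DMZ on a schedule. The stanza below is only a sketch: fetch_dmz_logs.sh is a hypothetical script that would ssh or scp into the DMZ host and write whatever it retrieves to stdout for Splunk to index, and the interval, sourcetype, and index values are placeholders.

    # inputs.conf on the trusted-side Splunk instance
    [script://$SPLUNK_HOME/etc/apps/dmz_pull/bin/fetch_dmz_logs.sh]
    interval = 300
    sourcetype = dmz:weblogs
    index = main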

beaudet
Explorer

On second thought, it would be rsync rather than scp due to potentially large log file sizes... rsync only transfers the changed blocks. The only potential gotcha could be rsync performance. I'll just have to test it.

beaudet
Explorer

by "reach and grab", I mean a network connection is initiated FROM the trusted network TO the DMZ where the logs reside. Thanks for the solution - seems like running a script on a forwarder to SCP or secure RSYNC files from the DMZ back into the trusted network would work for us. I'll give it a shot.
