Getting Data In

How do I encrypt communication between a universal forwarder and a heavy forwarder?

shaileshmali
Path Finder

I am not able to configure the heavy forwarder's inputs.conf file to receive encrypted traffic.

1) The inputs.conf configuration on the heavy forwarder is given below, but netstat -an does not show the heavy forwarder listening on port 9998.

[splunktcp-ssl:9998]
compressed = true

[SSL]
password = $1$YJqLPm4skNlFOQ==
rootCA = /opt/splunk/etc/certs/ca.pem
serverCert = /opt/splunk/etc/certs/splunk-dev.pem
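One likely reason the listener never starts: the password under [SSL] has to be entered in plaintext the first time; splunkd then rewrites it in place as an encrypted $1$... value tied to that instance's splunk.secret. A $1$... value copied in from another instance or another file cannot be decrypted, the splunktcp-ssl input fails to initialize, and nothing listens on the port. A minimal sketch of the initial file, assuming the server certificate was created with the same passphrase used on the forwarder side (testmy123 here is an assumption, not confirmed by the post):

```ini
# inputs.conf on the heavy forwarder -- a sketch, not verified against
# every Splunk version. Enter the certificate password in plaintext;
# splunkd encrypts it in place after the next restart.
[splunktcp-ssl:9998]
compressed = true

[SSL]
serverCert = /opt/splunk/etc/certs/splunk-dev.pem
rootCA = /opt/splunk/etc/certs/ca.pem
password = testmy123
```

After restarting, check splunkd.log for SSL errors and confirm the listener with netstat -an | grep 9998.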

2) On the universal forwarder I am using an app with this outputs.conf:

[tcpout]
defaultGroup = splunkssl
sendCookedData = true
dnsResolutionInterval = 300

[tcpout:splunkssl]
compressed = true
server = <heavy_forwarder_hostname>:9998
sslCertPath = C:\Program Files\SplunkUniversalForwarder\etc\apps\FW_DEV_NA_Encrypt\default\certs\splunk-dev.pem
sslPassword = testmy123
sslRootCAPath = C:\Program Files\SplunkUniversalForwarder\etc\apps\FW_DEV_NA_Encrypt\default\certs\ca.pem
sslVerifyServerCert = false
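Note that with sslVerifyServerCert = false the traffic is encrypted but the forwarder never validates the heavy forwarder's certificate, so any certificate would be accepted. Once the basic connection works, a hardened sketch might look like the following (sslCommonNameToCheck is optional and assumes the certificate's CN matches the heavy forwarder's hostname; adjust paths and the placeholder hostname for your environment):

```ini
# outputs.conf hardening sketch -- assumes ca.pem signed the heavy
# forwarder's certificate and that the cert CN is the hostname.
[tcpout:splunkssl]
compressed = true
server = <heavy_forwarder_hostname>:9998
sslCertPath = C:\Program Files\SplunkUniversalForwarder\etc\apps\FW_DEV_NA_Encrypt\default\certs\splunk-dev.pem
sslRootCAPath = C:\Program Files\SplunkUniversalForwarder\etc\apps\FW_DEV_NA_Encrypt\default\certs\ca.pem
sslPassword = testmy123
sslVerifyServerCert = true
sslCommonNameToCheck = <heavy_forwarder_hostname>
```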

m4him7
Path Finder

I had a similar issue and found that an outputs.conf file in my etc/system/local directory was being used instead of the outputs.conf in my app directory. I renamed the wrong outputs.conf file and restarted Splunk. You can tell whether your outputs.conf is the one being read: the sslPassword = testmy123 line will be rewritten as an encrypted value after Splunk reads the file for the first time. If an outputs.conf in another directory takes precedence, the forwarder may also be sending to a different port, which is another sign the wrong file is in use.
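To see which copy of outputs.conf wins without hunting through directories, btool can print the merged configuration along with the file each setting comes from. The paths below assume default install locations; adjust for your environment:

```shell
# Show the merged outputs.conf and the source file of each setting.
# On the Windows universal forwarder (assumed default install path):
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool outputs list --debug

# On a Linux heavy forwarder, the equivalent check for the SSL input:
/opt/splunk/bin/splunk btool inputs list splunktcp-ssl --debug
```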
