Getting Data In

How to forward logs from a local data center to a Splunk Enterprise Indexer in AWS

devenjarvis
Path Finder

I have been trying to get this working for a couple of weeks now with no luck. We have a Splunk Enterprise deployment in AWS with a search head, 2 indexers, and an auto-scaled group of forwarders for the CloudWatch log data we are passing in. It's working great right now. We would now like to use this existing setup to consume logs from servers that sit in our own data center (not AWS).

My thought was to simply add a Universal Forwarder on the server of choice, put an Elastic Load Balancer in front of one of the indexers in AWS (eventually we will send this data to both indexers if possible), and use Route53 in front of the load balancer to give it a domain for the UF to point at. The UF is set to forward to the Route53 domain on port 443. The load balancer takes traffic on port 443 and passes it to the indexer on port 9995, which we have set up as a receiving port on the indexer.
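
For reference, here is a minimal sketch of the forwarder and indexer configuration I described above. The hostname logs.example.com is a placeholder for our Route53 record; the ports are the ones mentioned in the post.

outputs.conf on the Universal Forwarder:

    [tcpout]
    defaultGroup = aws_indexers

    [tcpout:aws_indexers]
    # Route53 name that resolves to the load balancer (placeholder)
    server = logs.example.com:443

inputs.conf on the indexer, opening the receiving port the load balancer forwards to:

    [splunktcp://9995]
    disabled = 0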

Conceptually I think this should work, and I have verified that firewalls are open to allow this traffic, but 'splunk list forward-servers' on the UF shows the host as 'configured but inactive'. The splunkd.log file isn't especially helpful from what I can tell; the only error I see is about payload_size being too large, and searching the Answers forums shows that this can be related to a number of network issues, none of which seemed relevant or helped here. So, my question is: how can I troubleshoot what is wrong with my networking that isn't allowing logs to be forwarded to my indexer?
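
In case it helps anyone else, these are the checks I have been running on the UF host (assuming a Linux box with the default /opt/splunkforwarder install path; the hostname is a placeholder):

    # How the UF currently sees its forward targets
    /opt/splunkforwarder/bin/splunk list forward-servers

    # Recent output-processor messages in splunkd.log
    grep -i TcpOutputProc /opt/splunkforwarder/var/log/splunk/splunkd.log | tail -20

    # Confirm a raw TCP connection can be made through the load balancer
    nc -vz logs.example.com 443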

All help or ideas are appreciated!

1 Solution

devenjarvis
Path Finder

Ironically, after working on this for weeks, I finally found the answer: the load balancer listener was set to HTTP, not TCP. Making that switch fixed it. I apologize for the unnecessary question.
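
If anyone else hits this, the fix amounts to making the load balancer pass raw TCP through rather than terminating HTTP. As a rough sketch of what that looks like with the AWS CLI, assuming a Classic Load Balancer named splunk-elb (the name and ports below are placeholders matching my setup):

    # Remove the existing HTTP listener on 443
    aws elb delete-load-balancer-listeners \
        --load-balancer-name splunk-elb \
        --load-balancer-ports 443

    # Recreate it as a TCP listener that forwards to the indexer's receiving port
    aws elb create-load-balancer-listeners \
        --load-balancer-name splunk-elb \
        --listeners Protocol=TCP,LoadBalancerPort=443,InstanceProtocol=TCP,InstancePort=9995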

ppablo
Retired

No apologies needed @devenjarvis 🙂 thanks for sharing your solution with the community to close your question out.
