Splunk Cloud Platform

Issue with logs forwarding from HF to Splunk Cloud

SplunkExplorer
Contributor

Hi Splunkers, I have an issue with log forwarding from an HF to Splunk Cloud and I need some suggestions on troubleshooting it.

In this environment, some firewalls have been set up to send data to an HF, which then forwards it to Splunk Cloud. So the overall flow is:

Firewall's ecosystem -> HF -> Splunk Cloud

On the HF, a network TCP input has been configured and it works fine: all the firewalls added so far send their data to Cloud correctly. Yesterday the firewall admin configured a new one to send data to Splunk, but I cannot see its logs in our environment.
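For context, the input on the HF is just a plain TCP network input in inputs.conf, something along these lines (port, index and sourcetype below are placeholders, not our real values):

# inputs.conf on the HF - illustrative sketch only, values are placeholders
[tcp://5514]
index = firewall
sourcetype = fw:syslog
connection_host = ip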

So, first of all, I asked the network admin to check the log forwarding configuration, and everything has been set up properly.
Then I checked whether logs are actually coming from the firewall to the HF; a simple tcpdump on the configured port shows no resets or other suspicious flags. All captured packets have the [P] and [.] flags, with ACKs. So the data arrives where it is supposed to be collected.
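For reference, the capture was nothing more elaborate than something like this, run directly on the HF (interface, IP and port are placeholders for the real ones):

tcpdump -i eth0 -nn host 192.0.2.10 and tcp port 5514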
Next, I checked the _internal logs, filtering on the firewall IP; no errors are shown by this search. I get events from metrics.log and license.log (mainly from metrics), but no error messages are returned.
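That check was basically a pair of searches along these lines (the IP is a placeholder for the new firewall's address):

index=_internal "192.0.2.10"
index=_internal "192.0.2.10" log_level=ERROR

The first one returns the metrics.log and license.log events mentioned above; the second one returns nothing.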

However, when I query the configured index and sourcetype (which collect logs from the other firewalls just fine), I cannot see the new one. I used both the IP and the hostname of the firewall device, but no logs are returned. So I wondered: could it be that the data arrives at the HF but then never makes it to Cloud? In that case, though, I would expect some error logs to show up. And supposing my assumption is correct, how could I check it?
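For completeness, the searches against the destination index were roughly these (index, sourcetype, IP and hostname are placeholders):

index=firewall sourcetype=fw:syslog host="192.0.2.10"
index=firewall sourcetype=fw:syslog host="new-fw-01"

Both come back empty for the new device, while the other firewalls show up as expected.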


Richfez
SplunkTrust

It sounds like you've done pretty good basic troubleshooting already and confirmed that the data *should* be coming in.

So it very well may be coming in; could the reason you can't find it be that the time on the device is off?

Maybe it's a week or a day behind, or even worse it's set to next month. You *could* try searching for its IP address over all time just to see if this is the case. Or maybe its timezone is mis-set or unspecified, so its events always show up as being from 4 hours ago, and any search over a timeframe closer to now than 4 hours just misses it. (E.g. "now" ends up being squirreled away in Splunk as X hours ago, so "last 4 hours" never shows it.)
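An all-time sanity check along those lines could be as simple as this (index and IP are placeholders, adjust to whatever you actually use):

index=firewall "192.0.2.10" earliest=1 latest=+1y
| stats count, min(_time) as first_seen, max(_time) as last_seen
| convert ctime(first_seen) ctime(last_seen)

If the count is non-zero but first_seen/last_seen look odd (hours off, or in the future), the device clock or timezone is the likely culprit.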

 

That's my guess; give that a think and a try and see what you find.

 

Happy Splunking,

Rich

 
