Splunk Cloud Platform

Issue with log forwarding from HF to Splunk Cloud

SplunkExplorer
Contributor

Hi Splunkers, I have an issue with log forwarding from an HF to Splunk Cloud and I need some suggestions about troubleshooting.

In this environment, some firewalls have been set up to send data to an HF, and from there the data goes on to Splunk Cloud. So the overall flow is:

Firewall ecosystem -> HF -> Splunk Cloud

On the HF, a network TCP input has been configured and it works fine: all firewalls added until now send data correctly to Splunk Cloud. Yesterday the firewall admin configured a new one to send data to Splunk, but I cannot see its logs in our environment.
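For context, the input on the HF is a plain network TCP input, something along these lines (the port, index, and sourcetype below are placeholders, not necessarily what is actually configured):

    # inputs.conf on the HF -- port, index and sourcetype are placeholder values
    [tcp://9514]
    connection_host = ip
    sourcetype = fw:syslog
    index = firewall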

So, first of all, I asked the network admin to check the log forwarding configuration, and everything there has been done properly.
Then I checked whether logs are coming from the firewall to the HF: a simple tcpdump on the configured port shows no resets or other suspicious flags. All captured packets have the [P] and [.] flags, with ACKs. So the data arrives where it is supposed to be collected.
Next, I checked the _internal logs, filtering on the firewall IP; no errors are shown by this search. I get events from metrics.log and license.log (mainly from metrics), but no error messages are returned. The commands and searches I used were along the lines of the examples below.
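For reference, the capture was something like this, run on the HF (interface, IP, and port are placeholders for the real values):

    tcpdump -nn -i eth0 host 203.0.113.10 and tcp port 9514

and the _internal checks were roughly these two searches, filtering on the firewall IP (the first shows the throughput the HF reports for that host in metrics.log, the second looks for warnings or errors mentioning that IP in splunkd.log):

    index=_internal source=*metrics.log* group=per_host_thruput series="203.0.113.10"

    index=_internal source=*splunkd.log* (log_level=ERROR OR log_level=WARN) "203.0.113.10"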

However, when I query the configured index and sourcetype (which properly collect the logs from the other firewalls), I cannot see the new one. I used both the IP and the hostname of the firewall device, but no logs are returned. So I wondered: could it be that the data arrives at the HF but then doesn't make it to Splunk Cloud? In that case, though, I would expect some error logs to show up. And supposing my assumption is correct, how could I check it?
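In case it helps: my first idea for checking the HF -> Cloud leg would be something like the sketches below, run either from Splunk Cloud (assuming the HF forwards its own _internal logs, which is the default) or directly on the HF. The host name here is a placeholder. The first looks for blocked output queues in metrics.log, the second shows the tcpout connections towards Cloud, and the third looks for forwarding errors from the TcpOutputProc component in splunkd.log:

    index=_internal host=my-hf source=*metrics.log* group=queue name=tcpout*
    | timechart max(current_size) by name

    index=_internal host=my-hf source=*metrics.log* group=tcpout_connections

    index=_internal host=my-hf source=*splunkd.log* component=TcpOutputProc (log_level=ERROR OR log_level=WARN)

But I'm not sure this is the right approach, hence the question.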


Richfez
SplunkTrust

It sounds like you've done pretty good basic troubleshooting already and confirmed that the data *should* be coming in.

So it very well may be coming in, but the reason you can't find it could be that the time on the device is off.

Maybe it's a week or a day behind, or even worse, it's set to next month.  You *could* try searching for its IP address over all time just to see if this is the case.  Or maybe its timezone is mis- or unspecified, so it always shows up as being from 4 hours ago, and any search over a timeframe closer to now than 4 hours ago just misses it.  (E.g. an event that arrives "now" ends up squirreled away in Splunk as X hours ago, so "last 4 hours" never shows it.)
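Something along these lines ought to show it if that's the problem (swap in your real index and host; the ones here are just examples). It searches well into the past *and* the future and compares the event time to the time the event was actually indexed:

    index=firewall host=203.0.113.10 earliest=0 latest=+1y
    | eval event_time=strftime(_time, "%Y-%m-%d %H:%M:%S"), indexed_at=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
    | table event_time indexed_at host sourcetype _raw

If events show up there with an event_time far away from indexed_at, it's the device's clock or timezone.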

 

That's my guess; give that a think and a try and see what you find.

 

Happy Splunking,

Rich

 
