Hello. My Palo Alto firewall logs were forwarding to Splunk successfully for a while, but today I noticed that they have not been coming in for the past week. Everything on the Palo Alto side looks correctly configured. Not sure where to look first on the Splunk side; any ideas or pointers to documentation?
On the Splunk instance that is collecting the firewall logs (usually a forwarder), examine splunkd.log, which lives under $SPLUNK_HOME/var/log/splunk. Check for any errors or warnings. I would also run btool check on the forwarder to catch any typos in your Splunk configuration files.
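As a concrete starting point, those two checks might look like this on the forwarder (a sketch; the /opt/splunk path is an assumption for a default Linux install, so adjust for your environment):

```
# Validate configuration file syntax across all apps on this instance
/opt/splunk/bin/splunk btool check --debug

# Scan the forwarder's own log for recent errors or warnings
grep -iE "error|warn" /opt/splunk/var/log/splunk/splunkd.log | tail -50
```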
If the forwarder's splunkd.log indicates that it is reading the files (or listening on the port) and sending them, then check that the data is going to the proper index. Also check that the user running the search has permission to see that index.
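For example, you could search the target index directly over the gap (a sketch; `pan_logs` and the sourcetype are assumptions, so substitute whatever your inputs are actually configured to use):

```
index=pan_logs sourcetype=pan:traffic earliest=-7d
| timechart span=1d count by host
```

If you are not sure which index the events ended up in, something like `| tstats count where index=* sourcetype=pan* by index` can show you where (if anywhere) they are landing.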
You can also go to the search head (or indexer) and search the _internal index: are there any entries arriving from that forwarder? If yes, the problem is probably with the input settings for the PAN logs on the forwarder (inputs.conf or props.conf) for that specific data. If no data is coming from the forwarder at all, the problem is probably the outputs.conf settings on the forwarder or the inputs.conf settings on the receiver(s) [the indexers].
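Two quick searches along these lines, run from the search head (`my-forwarder` is a placeholder for the forwarder's actual hostname):

```
# Is the forwarder phoning home at all?
index=_internal host=my-forwarder earliest=-7d
| timechart span=1h count

# On the indexer side, are there inbound forwarder connections recorded?
index=_internal source=*metrics.log* group=tcpin_connections earliest=-24h
```

If the first search shows events right up to the present, the forwarder-to-indexer pipe is alive and the problem is on the input side; if it goes silent around the time the PAN data stopped, look at outputs.conf and the receiving port.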
Finally, if every indication is that the data is being forwarded, and the forwarder shows up in the _internal index, then check whether the indexer is discarding that data before indexing: for example, a transforms.conf rule could be routing it to the nullQueue. This would be pretty unusual, but it could be that someone made a mistake when setting up the transforms.
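The pattern to look for on the indexer is something like the following (a hypothetical example; the sourcetype and stanza names are illustrative, but this is the standard nullQueue routing shape):

```
# props.conf
[pan:traffic]
TRANSFORMS-drop = drop_pan_events

# transforms.conf
[drop_pan_events]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue
```

Any stanza like this whose REGEX matches your PAN events would silently drop them before indexing.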
I found 6 different outputs.conf files on my heavy forwarder. Which one do I look at? This is all odd because the Palo firewall logs were coming in fine until about a week ago and no changes were made to the configs...
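When multiple copies of a .conf file exist, Splunk merges them at runtime according to its configuration precedence rules, so no single file tells the whole story. btool can print the effective merged result and, with --debug, which file each setting came from (a sketch; the /opt/splunk path assumes a default install):

```
/opt/splunk/bin/splunk btool outputs list --debug
```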