All Apps and Add-ons

Why are some fields not extracted properly at Heavy Forwarder? (Connected to Splunk Cloud)

azer271
Explorer

I believe I have managed to get myself confused and would like to request assistance with field extraction.

I have a new heavy forwarder that will connect to Splunk Cloud. Before connecting it to Splunk Cloud, I first ran it as a standalone Splunk Enterprise instance. The HF has these apps installed:

Fortinet Fortigate Add-on for Splunk, 

Splunk Add-on for Palo Alto Networks, 

Splunk Add-on for Microsoft Windows,

Splunk Add-on for Checkpoint Log Exporter.

I simply installed the add-ons, created inputs in their local folders, and they were good to go on the HF. While it ran as a standalone Splunk Enterprise instance, all inputs worked fine and all fields were parsed properly: Checkpoint logs, PA logs, Windows XML logs, and Fortigate logs.

However, I then connected to Splunk Cloud: the universal forwarder credentials package was downloaded from Splunk Cloud and the app was installed on the HF. The connection is fine and logs are arriving. The weird issue is that ONLY the Checkpoint and Fortigate logs have all their fields extracted when I search in Splunk Cloud.

For some reason, the Windows logs show a surprisingly small number of extracted fields when I search in Splunk Cloud. When I search the Windows logs (old data in a test index) on the HF, they show a LOT of interesting fields (>300), which is great. The PA logs only have host, index, source, sourcetype, and _time extracted (plus defaults like linecount, punct, and splunk_server) when I search in Splunk Cloud.

I am confused because the Checkpoint and Fortigate fields are all extracted successfully while the others are not. I understand that the apps are recommended to be installed across the deployment (https://docs.splunk.com/Documentation/AddOns/released/Overview/Wheretoinstall), but I would like to know why some apps work and some do not. They are only installed on the HF, so shouldn't all the fields be extracted at the forwarder layer? Is it possible that the field extraction simply has not finished because too much data is coming in, or there is too much data in total (PA logs >10,000 events in the last 30 minutes, Windows logs >2,000 events in the last 30 minutes)?

Thanks. I appreciate your help.

0 Karma
1 Solution

ivan5593
Explorer

At the forwarding layer, the TA applies index-time settings: it extracts metadata fields such as the timestamp and parses the data according to each TA's props and transforms. Once parsed, the forwarder sends cooked data to the indexing tier. Essentially, a forwarder can act as a simple log redirector or as an indexer (in terms of parsing data), depending on your configuration.

Splunk also has other types of fields that are extracted at search time. Basically, you store the raw log and extract the fields when you run the search. This is why the TA must often also be installed at the search tier (your Cloud instance); otherwise, those search-time extractions, calculated fields, and lookup fields won't work.

Please follow the documentation of each technology add-on to see whether you need to install it on the search tier; you usually will.
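To illustrate the split described above, here is a minimal, hypothetical props.conf sketch. The stanza name, regexes, and lookup are made up for illustration and are not taken from any of the actual add-ons; the directive names themselves (LINE_BREAKER, TIME_PREFIX, KV_MODE, EXTRACT-, EVAL-, LOOKUP-) are real props.conf settings. Index-time settings only take effect where the data is first parsed (your HF), while search-time settings only take effect where the search runs (Splunk Cloud):

```ini
# --- Index-time settings: evaluated on the heavy forwarder,
# --- because that is where the raw data is first parsed.
[example:firewall]
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TIME_PREFIX = ^timestamp=
TIME_FORMAT = %Y-%m-%d %H:%M:%S

# --- Search-time settings: evaluated on the search head
# --- (Splunk Cloud). If the add-on lives only on the HF,
# --- none of these run, so the fields never appear in search.
[example:firewall]
EXTRACT-action = \baction=(?<action>\w+)
EVAL-vendor = "Example Vendor"
LOOKUP-severity = example_severity_lookup code OUTPUT severity
```

This may also explain why some sources still look extracted in Cloud without the add-on there: logs written as key=value pairs (Fortigate's format, for example) are picked up by Splunk's automatic key/value extraction at search time (KV_MODE defaults to auto), whereas add-ons like the Palo Alto one depend heavily on their own search-time extractions.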



azer271
Explorer

Thanks, I get it now!

0 Karma