Hello, is it possible with Splunk HEC from Kafka to receive raw events on a heavy forwarder (HF) in order to parse fields with add-ons?
It seems we can only receive JSON data with an "event" field, so we may not be able to extract fields with the standard add-ons?
The HEC event may also contain the target index and sourcetype.
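For reference, the payload we currently receive on the event endpoint looks roughly like the sketch below (the index, sourcetype, source and event values are placeholders):

{
  "time": 1700000000,
  "host": "source-host",
  "source": "kafka:my-topic",
  "sourcetype": "my:sourcetype",
  "index": "my_target_index",
  "event": "<original log line wrapped as a JSON string>"
}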
Thanks.
Thanks @livehybrid, so Splunk should then parse fields correctly for add-ons? Do you mean _raw will be the original event from the source host, sent to the targeted index/sourcetype?
Hi @splunkreal
If you use the raw endpoint then _raw will be whatever is sent from the source. Different Splunkbase / custom apps can perform different field extractions depending on the source of the data.
Are you sending a particular type of log, or from a specific vendor/tool via Kafka? I'd be happy to investigate whether there is an appropriate add-on for it. Note, however, that passing through Kafka may mean the data is no longer in its original format, so it might not extract correctly and might need further work.
Please let us know what format the source data is in and I'd be happy to help.
Hi @splunkreal
Are you using Splunk Connect for Kafka? If so, you should be able to set it to use the raw HEC endpoint:
"splunk.hec.raw" : "true",
For more info check out https://help.splunk.com/en/splunk-cloud-platform/get-data-in/splunk-connect-for-kafka/2.2/configure/...
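As a rough illustration, a sink connector definition posted to the Kafka Connect REST API might look like the sketch below (the connector name, topic, HEC URI, token and index/sourcetype values are placeholders; see the documentation linked above for the full option list):

{
  "name": "splunk-sink-raw",
  "config": {
    "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "splunk.hec.uri": "https://my-hec-endpoint:8088",
    "splunk.hec.token": "00000000-0000-0000-0000-000000000000",
    "splunk.hec.raw": "true",
    "splunk.indexes": "my_target_index",
    "splunk.sourcetypes": "my:sourcetype"
  }
}

With splunk.hec.raw set to true, the Kafka record value is sent to the raw HEC endpoint as-is, so _raw should be the record body rather than a JSON envelope.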
Hello @livehybrid, it looks like the JSON comes from a Vector agent writing to Kafka, which is why we end up with JSON. Is it possible to convert the JSON back to a raw log in Splunk?