
Why am I getting the following HTTP Event Collector (HEC) error?: {"text":"Invalid token","code":4}

sylim_splunk
Splunk Employee

I created hundreds of HEC tokens and put them in an app, which has been pushed down to several heavy forwarders. Most of them work fine, but strangely, several do not and return the following invalid-token error.

They are all defined in inputs.conf in the same way as the working tokens in the app, and the configurations all look okay to me.
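For reference, each token stanza in the app's inputs.conf looks roughly like the sketch below (the stanza name is made up; the token and index are the ones used in the curl test that follows):

[http://hectest_example]
disabled = 0
token = adf401c2-43ef-4689-a56c-ba47f907eca8
index = index_hectest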

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Invalid token","code":4}
1 Solution

sylim_splunk
Splunk Employee

It turned out to be an issue of duplicate stanza names in the HTTP inputs. Several stanzas, such as [http://abc], accidentally had the same name, and Splunk ignored those duplicates for that reason.
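As a hypothetical illustration, the broken part of inputs.conf looked something like this, with two stanzas accidentally sharing the name [http://abc] (tokens are placeholders):

# first definition
[http://abc]
token = 11111111-1111-1111-1111-111111111111

# accidental duplicate of the same stanza name; only one effective token
# survives per stanza name, so requests sent with the other token come back
# with {"text":"Invalid token","code":4}
[http://abc]
token = 22222222-2222-2222-2222-222222222222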

Running curl -v -k https://splunk.. gave more details about the transaction, and the page below describes the possible status codes.

http://docs.splunk.com/Documentation/Splunk/7.2.0/Data/TroubleshootHTTPEventCollector#Possible_error...

Because so many stanzas were defined at once, the duplicates were a simple human mistake. I used the command below to find which stanza names are duplicated.

$ egrep "\[http://.*\]" inputs.conf | sort | uniq -c | awk '$1 > 1 { print $1, $2 }'
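As an extra check (not part of the original fix), btool shows the effective stanzas Splunk actually loaded after merging the configuration, which makes lost duplicates easy to spot; the path assumes a standard $SPLUNK_HOME install:

$ $SPLUNK_HOME/bin/splunk btool inputs list --debug | grep "\[http://"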

After fixing the duplicate stanza names, all of the tokens work fine.

