Getting Data In

Why am I getting the following HTTP Event Collector (HEC) error: {"text":"Invalid token","code":4}?

sylim_splunk
Splunk Employee

I created hundreds of HEC tokens and put them in an app, which has been pushed down to several heavy forwarders. Most of them work fine, but strangely, several of them do not work and return the following invalid token error.

They are all defined in inputs.conf the same way as the working tokens in the app, and the configurations all look okay to me.
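For reference, each HEC token in the app is an http input stanza in inputs.conf. A minimal sketch (the stanza name is illustrative; the token, index, and sourcetype match the test below) looks like:

[http://hectest_001]
disabled = 0
token = adf401c2-43ef-4689-a56c-ba47f907eca8
index = index_hectest
sourcetype = https:hectest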

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Invalid token","code":4}
1 Solution

sylim_splunk
Splunk Employee

It turned out to be an issue of duplicate names in the http stanzas. Several stanzas, like [http://abc], had accidentally been given the same name, and Splunk was ignoring those tokens for that reason.
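For illustration only (stanza names and token values here are made up), the collision in inputs.conf looked roughly like this: two stanzas share one name, so Splunk merges them and one of the token values is effectively ignored.

[http://abc]
token = 11111111-2222-3333-4444-555555555555

[http://abc]
token = 66666666-7777-8888-9999-000000000000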

Running curl -v -k https://splunk.. gave more details about the transaction, and the page below describes the possible status codes.

http://docs.splunk.com/Documentation/Splunk/7.2.0/Data/TroubleshootHTTPEventCollector#Possible_error...

Because a large number of stanzas were defined all at once, the duplicates were a simple human mistake. I used the command below to find which stanza names are duplicated.

$ egrep "\[http://.*\]" inputs.conf | sort | uniq -c | awk '$1 > 1 { print $1, $2 }'
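With duplicate stanzas like the hypothetical [http://abc] example above, the output would look something like:

2 [http://abc]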

After the fix, all tokens are working fine.
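As a quick sanity check (assuming the same endpoint and token as in the original test), re-running the curl command should now return the HEC success response instead of the invalid token error:

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Success","code":0}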


