Getting Data In

Why am I getting the following HTTP Event Collector (HEC) error: {"text":"Invalid token","code":4}?

sylim_splunk
Splunk Employee

I created hundreds of HEC tokens and put them in an app, which has been pushed down to several heavy forwarders. Most of them work fine, but strangely, several of them do not and return the following invalid token error.

They are all defined in inputs.conf the same way as the working tokens in the app, and the configurations all look correct to me.

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Invalid token","code":4}
1 Solution

sylim_splunk
Splunk Employee

It turned out to be an issue of duplicate HTTP input stanza names. Several stanzas, such as [http://abc], had accidentally been given the same name, and Splunk was ignoring them for that reason.
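
For illustration only (the stanza name, tokens, and index below are examples, not the real values), the duplicates looked roughly like this in the app's inputs.conf: two different tokens sharing one stanza name, so only one definition takes effect and requests using the other token come back as "Invalid token".

# hypothetical example - two stanzas accidentally share the name [http://abc]
[http://abc]
token = 11111111-2222-3333-4444-555555555555
index = index_hectest

[http://abc]
token = aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee
index = index_hectest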

Running curl -v -k https://splunk.. gave more details about the transaction, and the page below describes the possible status codes.

http://docs.splunk.com/Documentation/Splunk/7.2.0/Data/TroubleshootHTTPEventCollector#Possible_error...

Because so many stanzas were defined at once, the duplicates crept in by human error. I used the command below to find which stanza names were duplicated.

$ egrep "\[http://.*\]" inputs.conf | sort | uniq -c | awk '$1 > 1 { print $1, $2 }'
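
With a duplicate like the hypothetical [http://abc] above, that command prints each duplicated stanza name along with how many times it appears, for example:

2 [http://abc]

As a cross-check (a sketch, assuming a default Splunk install), btool can show which HTTP input stanzas Splunk actually ends up loading after merging the configuration:

$ $SPLUNK_HOME/bin/splunk btool inputs list --debug | grep "\[http://"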

After the fix, all tokens are working fine.


