Getting Data In

Why am I getting the following HTTP Event Collector (HEC) error: {"text":"Invalid token","code":4}?

sylim_splunk
Splunk Employee

I created hundreds of HEC tokens and put them in an app, which has been pushed down to several heavy forwarders. Most of them work fine, but strangely several of them do not and return the following invalid token error.

They are all defined in inputs.conf the same way as the working ones in the app, and the configurations all look OK to me.

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Invalid token","code":4}
Solution

sylim_splunk
Splunk Employee

It turned out to be an issue of duplicate stanza names in the HTTP inputs. Several stanzas, like [http://abc], had the same name by mistake, and Splunk ignored those tokens for that reason.
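For illustration, the problem looked roughly like this (stanza names and tokens here are made up): two stanzas accidentally share the same name, so only one definition takes effect and the other token is never recognized.

[http://hectest_17]
token = 11111111-2222-3333-4444-555555555555
index = index_hectest

# Duplicate stanza name - this one was meant to be [http://hectest_18].
# Stanzas with identical names get collapsed, so this token never becomes
# a valid HEC token and requests using it get "Invalid token".
[http://hectest_17]
token = 66666666-7777-8888-9999-000000000000
index = index_hectest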

Running curl -v -k https://splunk.. gave more details about the transaction, and the page below documents the possible status codes.

http://docs.splunk.com/Documentation/Splunk/7.2.0/Data/TroubleshootHTTPEventCollector#Possible_error...

Because a large number of stanzas were defined all at once, the duplication was a simple human error. I used the command below to find which stanza names are duplicated.

$ egrep "\[http://.*\]" inputs.conf | sort | uniq -c | awk '{ if ($1 > 1) print $1, $2 }'
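Each output line is a count followed by a stanza name that appears more than once; with the made-up example above it would print something like "2 [http://hectest_17]". Renaming the offending stanzas so every [http://...] name is unique resolves the conflict.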

After the fix, all tokens are working fine.


