Getting Data In

Why am I getting the following HTTP Event Collector (HEC) error: {"text":"Invalid token","code":4}?

sylim_splunk
Splunk Employee

I created hundreds of HEC tokens and put them in an app, which has been pushed down to several heavy forwarders. Most of the tokens work fine, but strangely, several of them do not and return the following invalid token error.

They are all defined in inputs.conf in the same way as the working tokens in the app, and the configurations all look correct to me.

curl -k https://splunk-hec.abc.net:8088/services/collector -H 'Authorization: Splunk adf401c2-43ef-4689-a56c-ba47f907eca8' -d '{"sourcetype": "https:hectest", "event":"HEC Token Test", "index":"index_hectest"}'

{"text":"Invalid token","code":4}
1 Solution

sylim_splunk
Splunk Employee

It turned out to be an issue of duplicate HTTP input stanza names: several stanzas, such as [http://abc], had accidentally been given the same name, and Splunk ignored those stanzas for that reason.
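
As a minimal, hypothetical illustration (the second token value is made up), both stanzas below share the name [http://abc]; they collide when Splunk merges the configuration, so one of the tokens is never registered with HEC and requests using it get the "Invalid token" response.

# Both stanzas accidentally share the name "abc"; the second token value is illustrative
[http://abc]
token = adf401c2-43ef-4689-a56c-ba47f907eca8
index = index_hectest

[http://abc]
token = 9f2d6c1e-1111-2222-3333-444444444444
index = index_hectest
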

Running curl -v -k https://splunk.. showed more details about the transaction, and the page below lists the possible status codes.

http://docs.splunk.com/Documentation/Splunk/7.2.0/Data/TroubleshootHTTPEventCollector#Possible_error...

Because a large number of stanzas were defined all at once, the duplicates were introduced by human error. I used the command below to find which stanza names were duplicated.

$ egrep "\[http://.*\]" inputs.conf | sort | uniq -c | awk '$1 > 1 { print $1, $2 }'

After the fix, all tokens are working fine.
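
Not from the original post, but a complementary check: btool shows the configuration as Splunk actually merges it across all apps, so it can also catch duplicate names that are spread over several inputs.conf files rather than sitting in a single one. This assumes $SPLUNK_HOME is set on the heavy forwarder.

# Prints each effective [http://...] stanza once, prefixed with the file it was taken from
$SPLUNK_HOME/bin/splunk btool inputs list --debug | grep '\[http://'
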
