Getting Data In

Is there a way to send JSON data (via HTTPS POST) to Splunk's API without defining an index?

packet_hunter
Contributor

I am trying to send alerts from an appliance to a Splunk (HF) forwarder.
I have the appliance sending to Splunk's REST API, and I can see data pushed to TCP port 8089 on the forwarder, but the index and sourcetype are not predefined on the HF. Is there a way to send the data to the API and pass the index and sourcetype in the POST statement?

Thank you


lguinn2
Legend

If you want to send something from an appliance to a heavy forwarder, you might consider using the HTTP Event Collector (HEC) instead. HEC allows the data to include sourcetype, host, and index, or to let those default. If defaulted, the Splunk admin can specify the index, etc. HEC works well with JSON data.

Using the HEC also means that the appliance does not need a Splunk username/password to connect. Instead it uses a token which is easy to control.

Here is the Introduction to the HTTP Event Collector.
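To illustrate the point above: with HEC, the POST body itself can carry the index and sourcetype alongside the event. A minimal sketch in Python follows; the index and sourcetype names are examples (they must already exist and be allowed for the token on your HF), and the endpoint/token details in the comments are the standard HEC conventions, not values from this thread.

```python
import json

def build_hec_event(raw_event, index="fireeye_alerts", sourcetype="fe_json"):
    """Build an HEC event payload.

    "index" and "sourcetype" are set per event, overriding the token's
    defaults (assuming the token is permitted to write to that index).
    """
    return json.dumps({
        "event": raw_event,          # the JSON alert itself
        "index": index,              # example index name
        "sourcetype": sourcetype,    # example sourcetype name
    })

payload = build_hec_event({"alert": "malware-object", "severity": "major"})
# POST this body to https://<hf>:8088/services/collector/event
# with the header "Authorization: Splunk <token>".
```

The appliance only needs the token, not Splunk credentials, which is the access-control advantage mentioned above.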

packet_hunter
Contributor

Thank you for the reply.
I looked into HEC with FireEye before and we could not get it to work.
Do you know anyone who has successfully set up a FireEye NX, EX, or HX appliance with HEC?
Thank you


sunilpanda023
Path Finder

Make sure there are no network connectivity issues; it's pretty neat and simple.

Try it from the CLI - http://docs.splunk.com/Documentation/Splunk/7.1.1/Data/UseHECfromtheCLI
How to set it up - http://docs.splunk.com/Documentation/Splunk/7.1.1/Data/UsetheHTTPEventCollector
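For reference, the setup described in those docs boils down to enabling HEC and defining a token on the heavy forwarder. A minimal inputs.conf sketch is below; the token stanza name, index, and sourcetype are illustrative placeholders, not shipped defaults, and the token value is generated by Splunk when you create the input.

```
# $SPLUNK_HOME/etc/apps/splunk_httpinput/local/inputs.conf
[http]
disabled = 0

# Token stanza name is arbitrary; index/sourcetype here become the
# defaults for events that do not specify their own in the POST body.
[http://fireeye_alerts]
token = <generated-token>
index = fireeye_alerts
sourcetype = fe_json
```

After a restart (or reload), events POSTed to port 8088 with that token land in the configured index unless the event overrides it.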
