Installation

UDP data input

jmrubio
Explorer

Hello!

I am trying to send data to Splunk using UDP. I tried to set it up using the documentation and watched a few videos on how to set it up, but I can't get it right. The data comes into my HF from network devices and should then be sent to my indexers. After going through the setup I get this error message: "Search peer splunk_indexer_02 has the following message: Received event for unconfigured/disabled/deleted index=<index> with source="source::udp:514" host="host::xx.xx.xx.xx" sourcetype="sourcetype::<sourcetype>". So far received events from 2 missing index(es)." I created a new index during the setup, but there is no data to search.

JohnEGones
Path Finder

 Here are some useful references:

Get data from TCP and UDP ports - Splunk Documentation
Create custom indexes - Splunk Documentation

Note the section in the first link, copied below, where ...



Configure a UDP network input

This type of input stanza is similar to the TCP type, except that it listens on a UDP network port. If you provide <remote server>, the port that you specify only accepts data from that host. If you don't specify anything for <remote server>, the port accepts data that comes from any host.

[udp://<remote server>:<port>]
<attribute1> = <val1>
<attribute2> = <val2>
...

The following settings control how the Splunk platform stores the data:

host = <string>
Sets the host field to a static value for this stanza. Also sets the host key initial value. Splunk Cloud Platform uses this key during parsing and indexing, in particular to set the host field. It also uses the host field at search time. The <string> is prepended with host::.
Default: the IP address or fully qualified domain name of the host where the data originated.

index = <string>
Sets the index where Splunk Cloud Platform stores events from this input. The <string> is prepended with index::.
Default: main, or whatever you set the default index to.

sourcetype = <string>
Sets the sourcetype field for events from this input. Also declares the source type for this data, as opposed to letting Splunk Cloud Platform determine it. This is important both for searchability and for applying the relevant formatting for this type of data during parsing and indexing. Sets the sourcetype key initial value. Splunk Cloud Platform uses the key during parsing and indexing, in particular to set the source type field during indexing. It also uses the source type field at search time. The <string> is prepended with sourcetype::.
Default: Splunk Cloud Platform picks a source type based on various aspects of the data. There is no hard-coded default.

source = <string>
Sets the source field for events from this input. The <string> is prepended with source::. Do not override the source key unless absolutely necessary. The input layer provides a more accurate string to aid in problem analysis and investigation by recording the file from which the data is retrieved. Consider use of source types, tagging, and search wildcards before overriding this value.
Default: the input file path.

queue = parsingQueue | indexQueue
Sets where the input processor deposits the events that it reads. Set to parsingQueue to apply the props.conf file and other parsing rules to your data. Set to indexQueue to send your data directly into the index.
Default: parsingQueue

_rcvbuf = <integer>
Sets the receive buffer for the UDP port, in bytes. If the value is 0 or negative, Splunk Cloud Platform ignores the value.
Default: 1,572,864, unless the value is too large for the OS, in which case Splunk Cloud Platform halves the value repeatedly until the buffer size is acceptable.

no_priority_stripping = true | false
Sets how Splunk Enterprise handles receiving syslog data. If you set this setting to true, Splunk Cloud Platform does not strip the <priority> syslog field from received events. Depending on how you set this setting, Splunk Cloud Platform also sets event timestamps differently: when set to true, it honors the timestamp as it comes from the source; when set to false, it assigns events the local time.
Default: false (Splunk Cloud Platform strips <priority>).

no_appending_timestamp = true | false
Sets how Splunk Cloud Platform applies timestamps and hosts to events. If you set this setting to true, Splunk Cloud Platform does not append a timestamp and host to received events. Do not configure this setting if you want to append the timestamp and host to received events.
Default: false (Splunk Cloud Platform appends timestamps and hosts to events).
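
Putting that together, a minimal inputs.conf stanza on your HF for syslog over UDP 514 could look something like this (the index and sourcetype values below are just placeholders, use whatever you actually created):

[udp://514]
connection_host = ip
index = network_syslog
sourcetype = syslog

Whatever index you name there has to exist on the indexers that ultimately store the data, otherwise you get exactly the error you posted.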

@jmrubio

gcusello
SplunkTrust

Hi @jmrubio ,

if you have this message on the indexers, it seems that you forgot to create the index on the indexers, or maybe there's a difference between the index name on the indexers and the index that you configured in the inputs.conf of the HF.

If the message is on the HF, it seems that there's an issue in the forwarding configuration.
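
The error message tells you exactly which index name the events arrived with, so you can compare it against the indexes that actually exist on the peers. One quick way to list them from the search head (just a sketch, any listing method works):

| eventcount summarize=false index=* | dedup index | fields index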

Ciao.

Giuseppe

richgalloway
SplunkTrust

It sounds like the new index was created on the HF, but not on the indexers.  The index must exist on the indexers so they have a place to store the data.
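
For example, on each indexer (or via the cluster manager if they're clustered) you could add a stanza like this to indexes.conf, using whatever name your HF input sends; the name below is only an example:

[network_syslog]
homePath   = $SPLUNK_DB/network_syslog/db
coldPath   = $SPLUNK_DB/network_syslog/colddb
thawedPath = $SPLUNK_DB/network_syslog/thawedb

After a restart (or a cluster bundle push), new events arriving on udp:514 should be searchable in that index.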

---
If this reply helps you, Karma would be appreciated.

jmrubio
Explorer

@richgalloway  So I just have to create an index with the same name on the indexers?

gcusello
SplunkTrust

Hi @jmrubio ,

you mainly have to create the index on the indexers.

Then, if you like (it isn't mandatory), you can also create the index on the HF, but only so it appears in the dropdowns; that index will never be used.
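
If you prefer the CLI to editing indexes.conf by hand, the same thing on each indexer would be something like this (index name assumed, use your own):

$SPLUNK_HOME/bin/splunk add index network_syslog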

Ciao.

Giuseppe

richgalloway
SplunkTrust

Correct.

---
If this reply helps you, Karma would be appreciated.