Getting Data In

How do I parse JSON from syslog messages coming in on udp:514?

Path Finder

Hi

I want to send the same JSON-encoded structures over the HTTP Event Collector/REST API as well as syslog UDP/TCP.
One of the fields in the structure is sourcetype=JSON, and I have a proper entry for JSON in props.conf.

Yet when syslog udp:514 messages come in, they are tagged sourcetype=udp:514, and the fields don't get extracted.
I suppose I could enable JSON parsing for udp:514, but this seems wrong, since the majority of syslog data is not structured.

How can I "deflect" these specific messages to be handled as a different sourcetype?
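(Not from the thread itself, but one common way to do this "deflection" is a sourcetype override: a transform on the udp:514 input that re-tags events that look like JSON. The stanza contents below are a sketch; the `json_syslog` sourcetype name and the leading-`{` heuristic are illustrative assumptions, not existing config.)

```ini
# props.conf -- run a reroute transform on everything arriving on udp:514
[source::udp:514]
TRANSFORMS-reroute_json = force_json_sourcetype

# transforms.conf -- if the raw event starts with "{", rewrite the sourcetype
[force_json_sourcetype]
REGEX = ^\s*\{
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::json_syslog

# props.conf -- search-time JSON field extraction for the new sourcetype
[json_syslog]
KV_MODE = json
```

Plain syslog lines that don't match the regex keep their original sourcetype, so the unstructured majority of the traffic is untouched.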

rama


Path Finder

Hi plumsdaine22
(I was sure to copy-paste this...)

Thank you VERY MUCH for your speedy suggestion.

I tried dedicated syslog "data inputs" (with TCP:515, for instance), but no luck.

I ended up serializing the structures as an attribute=value list when sending to syslog, and with encode_json when sending over HTTP.
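(For anyone following the same route, a minimal sketch of that dual serialization in Python; the field names are made up, and `to_kv` is a hypothetical helper, not something from the post.)

```python
import json

def to_kv(event):
    """Serialize a flat dict as a space-separated key=value list (syslog path)."""
    return " ".join(f"{key}={value}" for key, value in event.items())

# hypothetical event structure, just for illustration
event = {"app": "billing", "action": "login", "status": "ok"}

kv_line = to_kv(event)         # key=value form for the syslog path
json_line = json.dumps(event)  # same structure, JSON-encoded, for HEC
```

The key=value form also matches Splunk's default search-time extraction, which is why the fields show up without any props.conf work.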

I'll update this question later on if I come across a good way to accomplish my initial goal, or once I become more fluent with Splunk.

Thanks again
rama


Influencer

When you say no luck, do you mean the JSON wasn't indexed properly on your new 515 input? It should work.


Path Finder

Right. I did not see fields extracted.
Since I've been "talking Splunk" for ~two weeks only, I probably did things wrong.

Since the events are now reported in a common format (and they can all be picked up by the same set of reports), I am good to go.

Thank you so much

rama


Influencer

No worries - you'll save money by using field=value instead of JSON, as it takes up less space 🙂

After you get more comfortable, have a look on Splunk Answers for other questions relating to JSON field extractions; I am sure it will make more sense to you.


Influencer

You should probably handle this with syslog itself. That is, have syslog send these messages to a different port, and create an alternative input on your Splunk indexer.

Even better, have a server that collects all your syslog messages centrally, and install a universal forwarder on that host. That way, if something happens to Splunk, you can still collect the data.
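(As a sketch of the dedicated-port idea, assuming the JSON-emitting senders are pointed at TCP 515; the port and the `json_syslog` sourcetype name are illustrative.)

```ini
# inputs.conf -- on the indexer, or on a universal forwarder
# installed on the central syslog host
[tcp://515]
sourcetype = json_syslog
connection_host = ip
```

With the incoming events tagged `json_syslog` at the input, a matching props.conf entry can do the JSON field extraction without touching the plain udp:514 traffic.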
