Stripping syslog-ng headers from Snort/idstools-u2json JSON files

BongoTheWhippet

Hello fellow Splunk community members

I've finally got a workable solution for running Snort on my home router, outputting JSON to send across to my Raspberry Pi-homed UF. It works a treat, but for one thing. If you're curious, it's dd-wrt running Entware Snort, processing u2fast logs into JSON with python3-idstools.

The Snort JSON output log on the router looks like this:

{"type": "event", "event": {"msg": "ET POLICY iTunes User Agent", "classification": "Potential Corporate Privacy Violation", "sensor-id": 0, "event-id": 354, "event-second": 1605757495, "event-microsecond": 660579, "signature-id": 2002878, "generator-id": 1, "signature-revision": 6, "classification-id": 33, "priority": 1, "sport-itype": 57226, "dport-icode": 80, "protocol": 6, "impact-flag": 0, "impact": 0, "blocked": 0, "mpls-label": null, "vlan-id": null, "pad2": null, "source-ip": "192.168.1.25", "destination-ip": "17.253.35.206"}}

It's JSON-lint-validated output too, so that's a bonus.

But then syslog-ng gets its hands on it. I've delved deep into the Balabit syslog-ng administration manual, and despite adding all of the relevant syslog-ng.conf options to stop syslog-ng adding its own header, syslog-ng can't seem to help itself! On the router sending the logs to the UF, syslog-ng.conf looks like this:

** CHOPPED FOR BREVITY **

source s_snort_json {
    file("/tmp/alerts.json" follow-freq(1) flags(no-parse));
};

destination d_tcp_splunk_forwarder {
    network("192.168.1.92" template("${MESSAGE}\n") port(1514));
};

log {
    source(s_snort_json);
    destination(d_tcp_splunk_forwarder);
};

I've tried using syslog-ng's built-in JSON parser, but it doesn't really work, and in any case I don't want syslog-ng fiddling with the JSON at all - I just want to send it to the UF as it is.
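For what it's worth, the JSON parser experiment was roughly along these lines (a sketch from memory rather than my exact config, so treat the names as approximate):

# abandoned experiment: let syslog-ng parse the JSON before forwarding
parser p_json {
    json-parser(prefix(".json."));
};

log {
    source(s_snort_json);
    parser(p_json);
    destination(d_tcp_splunk_forwarder);
};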

On the UF system, the log is received by syslog-ng again. The syslog-ng.conf on that box looks like this:

** CHOPPED FOR BREVITY **

source s_network_tcp {
    network(
        ip("0.0.0.0")
        transport("tcp")
        port(1514)
        flags(no-parse)
    );
};
destination d_snort { file("/var/log/snort.json"); };

log { source(s_network_tcp); destination(d_snort); };

Note the flags(no-parse) on the source and the template() on the sending destination, both of which appear to have no effect - syslog-ng still adds its own data!

The output now (inexplicably) looks like this in /var/log/snort.json:

Nov 19 03:44:56 192.168.1.1 {"type": "event", "event": {"msg": "ET POLICY iTunes User Agent", "classification": "Potential Corporate Privacy Violation", "sensor-id": 0, "event-id": 354, "event-second": 1605757495, "event-microsecond": 660579, "signature-id": 2002878, "generator-id": 1, "signature-revision": 6, "classification-id": 33, "priority": 1, "sport-itype": 57226, "dport-icode": 80, "protocol": 6, "impact-flag": 0, "impact": 0, "blocked": 0, "mpls-label": null, "vlan-id": null, "pad2": null, "source-ip": "192.168.1.25", "destination-ip": "17.253.35.206"}}

Syslog-ng seems to be like a stubborn child: no matter how carefully you tell it not to do something, it still does exactly what it wants!
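One thing I haven't tried yet (so this is pure guesswork on my part) is whether the date/host prefix comes from the default template of the file() destination on the UF box rather than from parsing; if so, I assume forcing a template there would suppress it:

# untested guess: write only the raw message, no date/host prefix
destination d_snort {
    file("/var/log/snort.json" template("${MESSAGE}\n"));
};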

Props.conf to the rescue here, right? On the UF, my props.conf looks like this:

[sourcetype=json]
KV_MODE = json
INDEXED_EXTRACTIONS = json
TIME_PREFIX= \"event-second\"\:
# I've tried SEDCMD-strip_prefix = s/^[^{]+// here too
SEDCMD-strip_prefix = s/^[^{]+//g
NO_BINARY_CHECK = true
disabled = false
pulldown_type = true

In Splunk, however, the syslog-ng-added header remains. I don't have a reliable way of testing the SEDCMD output either, as Splunk's sed implementation doesn't appear to be GNU-syntax compatible.
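The best I can do is sanity-check the regex itself outside Splunk, e.g. with a quick bit of Python (which obviously proves nothing about Splunk's sed dialect, and the sample line is just retyped from the log above):

import re

# one captured line from /var/log/snort.json: syslog header followed by the JSON payload
line = 'Nov 19 03:44:56 192.168.1.1 {"type": "event", "event": {"msg": "ET POLICY iTunes User Agent"}}'

# the same substitution the SEDCMD is meant to perform: strip everything before the first '{'
print(re.sub(r'^[^{]+', '', line))
# prints: {"type": "event", "event": {"msg": "ET POLICY iTunes User Agent"}}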

Does anyone have any suggestions on where I'm going wrong, either in the syslog-ng pipeline conf(s) or in the props.conf?

(I can't use rsyslog on the router BTW - opkg has no package available).

Many thanks and all the best
