Getting Data In

A better way to extract JSON?

patpro
Path Finder

Hello,

I want to get Rspamd logs into Splunk with every bit of info available. The best I could do with the Rspamd config yields this:

2023-11-03 13:02:24 #56502(rspamd_proxy) <7fcfc8>; lua; [string "return function (t...:4: METATEST {"qid":"8BC8C2F741","user":"unknown","ip":"188.68.A.B","header_from":["foo bar via somelist <somelist@baz.org>"],"header_to":["list <somelist@baz.org>"],"header_subject":["proper subject"],"header_date":["Fri, 3 Nov 2023 08:00:43 -0400 (EDT)"],"scan_time":2457,"rcpt":["me@myself.net"],"size":6412,"score":-5.217652,"subject":"proper subject","action":"no action","message_id":"4SMK7v2HQTzJrP1@spike.bar.org","fuzzy":[],"rspamd_server":"rack.myself.net","from":"somelist-bounces@baz.org","symbols":[{"score":-0.500000,"group":"composite","groups":["composite"],"name":"RCVD_DKIM_ARC_DNSWL_MED"},{"score":0,"group":"headers","groups":["headers"],"name":"FROM_HAS_DN"},{"score":0,"group":"headers","options":["somelist@baz.org","somelist-bounces@baz.org"],"groups":["headers"],"name":"FROM_NEQ_ENVFROM"},{"score":-0.010000,"group":"headers","groups":["headers"],"name":"HAS_LIST_UNSUB"},{"score":0,"group":"headers","options":["somelist@baz.org"],"groups":["headers"],"name":"PREVIOUSLY_DELIVERED"},{"score":-1,"group":"abusix","options":["188.68.A.B:from"],"groups":["abusix","rbl"],"name":"RWL_AMI_LASTHOP"},{"score":-0.100000,"group":"mime_types","options":["text/plain"],"groups":["mime_types"],"name":"MIME_GOOD"},{"score":-0.200000,"group":"headers","options":["mailman"],"groups":["headers"],"name":"MAILLIST"},{"score":1,"group":"headers","groups":["headers"],"name":"SUBJECT_ENDS_QUESTION"},{"score":-0.200000,"group":"policies","options":["+ip4:188.68.A.B"],"groups":["policies","spf"],"name":"R_SPF_ALLOW"},{"score":-1,"group":"policies","options":["list.sys4.de:s=2023032101:i=1"],"groups":["policies","arc"],"name":"ARC_ALLOW"},{"score":0,"group":"ungrouped","options":["asn:19xxxx, ipnet:188.68.A.B/20, 
country:XY"],"groups":[],"name":"ASN"},{"score":0.100000,"group":"headers","groups":["headers"],"name":"RCVD_NO_TLS_LAST"},{"score":0,"group":"headers","groups":["headers","composite"],"name":"FORGED_RECIPIENTS_MAILLIST"},{"score":0,"group":"policies","options":["baz.org:+","bar.org:-"],"groups":["policies","dkim"],"name":"DKIM_TRACE"},{"score":0,"group":"headers","groups":["headers"],"name":"REPLYTO_DOM_NEQ_FROM_DOM"},{"score":0,"group":"policies","options":["bar.org:s=dktest"],"groups":["policies","dkim"],"name":"R_DKIM_REJECT"},{"score":-2.407652,"group":"statistics","options":["97.28%"],"groups":["statistics"],"name":"BAYES_HAM"},{"score":0,"group":"headers","groups":["headers"],"name":"TO_DN_ALL"},{"score":0,"group":"composite","groups":["composite"],"name":"DKIM_MIXED"},{"score":-0.200000,"group":"policies","options":["baz.org:s=20230217-rsa"],"groups":["policies","dkim"],"name":"R_DKIM_ALLOW"},{"score":0,"group":"headers","options":["3"],"groups":["headers"],"name":"RCVD_COUNT_THREE"},{"score":-0.600000,"group":"rbl","options":["188.68.A.B:from","188.68.A.B:received","168.100.A.B:received"],"groups":["rbl","dnswl"],"name":"RCVD_IN_DNSWL_MED"},{"score":-0.100000,"group":"rbl","options":["188.68.A.B:from"],"groups":["rbl","mailspike"],"name":"RWL_MAILSPIKE_GOOD"},{"score":0,"group":"policies","options":["baz.org"],"groups":["policies","dmarc"],"name":"DMARC_NA"},{"score":0,"group":"headers","options":["1"],"groups":["headers"],"name":"RCPT_COUNT_ONE"},{"score":0,"group":"mime_types","options":["0:+"],"groups":["mime_types"],"name":"MIME_TRACE"},{"score":0,"group":"headers","groups":["headers","composite"],"name":"FORGED_SENDER_MAILLIST"},{"score":0,"group":"headers","groups":["headers"],"name":"TO_EQ_FROM"},{"score":0,"group":"headers","options":["foo@bar.org"],"groups":["headers"],"name":"HAS_REPLYTO"}]}

 

Currently I’m extracting JSON with a props.conf & a transforms.conf:

props.conf

[rspamd]
KV_MODE = json
TRANSFORMS-json_extract_rspamd = json_extract_rspamd

transforms.conf

[json_extract_rspamd]
SOURCE_KEY = _raw
DEST_KEY = _raw
LOOKAHEAD = 10000
#REGEX = ^([^{]+)({.+})$
REGEX = ^(\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d) (#\d+)\(([^)]+)\) ([^;]+); lua[^{]+{(.+})$
FORMAT = {"date":"$1","ida":"$2","process":"$3","idb":"$4",$5
CLONE_SOURCETYPE = _json
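
For reference, applying that REGEX/FORMAT to the sample event above should rewrite the cloned event's _raw into something roughly like this (reconstructed by hand, heavily truncated):

{"date":"2023-11-03 13:02:24","ida":"#56502","process":"rspamd_proxy","idb":"<7fcfc8>","qid":"8BC8C2F741","user":"unknown","ip":"188.68.A.B", … ,"symbols":[{"score":-0.500000,"group":"composite","groups":["composite"],"name":"RCVD_DKIM_ARC_DNSWL_MED"}, … ]}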

I end up with this in Splunk:

[Screenshot 2023-11-03 at 13.47.57.png: the resulting event in Splunk]

From here, I have 2 problems.

1st problem: contrary to native JSON events (from my Amavis logs, for example), Splunk does not extract the fields or let me run basic stats on them unless I explicitly extract them… That's quite a pain.
Is there a way / config setting to instruct Splunk to automagically extract every field?
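
To illustrate what I mean by extracting explicitly: at search time something along these lines does the job (a sketch only, assuming the cloned events land in the _json sourcetype with the rewritten JSON in _raw, and that the event doesn't exceed spath's default extraction length limit), but I'd like Splunk to do it on its own:

sourcetype=_json
| spath
| stats count by action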

2nd problem: this JSON is poorly structured for my needs. Every object in "symbols[]" looks like this in Splunk:
[Screenshot 2023-11-03.png]

It’s almost unusable, as it prevents me from linking the name of a symbol to its score and its options.

Is there a parsing option / function I could use to reliably transform this into something I can work with?

A good result could be

turning 
{
   group: abusix
   groups: [
     abusix
     rbl 
   ]
   name: RWL_AMI_LASTHOP
   options: [
    A.B.C.D:from
   ]
   score: -1 
} 

into 

RWL_AMI_LASTHOP: {
   group: abusix
   groups: [
     abusix
     rbl 
   ]
   name: RWL_AMI_LASTHOP
   options: [
    A.B.C.D:from
   ]
   score: -1 
}
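
The closest I can think of is doing it at search time by expanding the symbols array and tabulating by name — again just a sketch, assuming the cloned _json events carry the full JSON in _raw:

sourcetype=_json
| spath path=qid
| spath path=symbols{} output=symbol
| mvexpand symbol
| spath input=symbol
| table qid, name, score, options{}

But that's a per-search workaround; I'd rather fix the structure at parsing time so the symbol names become usable keys.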

 

I’m open to suggestions. I’ve been working for years with the great JSON logs of Amavis (perfect parsing and usability), so this problem is new to me…
