Getting Data In

Kaspersky Syslog Data Field Extraction

MoienABO
Loves-to-Learn Lots

Recently, I changed the Kaspersky Security Center log format to syslog (because of the limitations of CEF), and we're receiving these logs in Splunk. However, I found there is no suitable TA for such logs, so I decided to create transforms.conf and props.conf files to parse this log format. Here is my sample log:

Jul 26 16:44:56 172.31.0.254 1 2023-07-25T03:43:00.000Z comuter1 KES|11.0.0.0 - 000000g1 [event@23448 et="000000g1" tdn="Protection" etdn="Protection components are disabled" hdn="COMPUTER1" hip="172.24.7.139" gn="GPA" kscfqdn="something.root.holdings"] Event type: Protection components are disabled\r\nName: test.exe\r\nApplication path: C:\Program Files (x86)\Kaspersky Lab\KES.12.0.0\r\nProcess ID: 18446744073709551615\r\nUser: COMPUTER1\Administrator (Active user)\r\nComponent: Protection

and here are my props.conf and transforms.conf:

props.conf:

 

[kasperskylab:securitycenter:syslog]
SHOULD_LINEMERGE = false
KV_MODE = none
REPORT-outer_fields = get_outer_fields, get_inner_fields

 

transforms.conf:

 

[get_outer_fields]
REGEX = ^(?<timestamp>[a-zA-Z]{3}\s+\d{1,2}\s+\d{1,2}:\d{1,2}:\d{1,2})\s+(?<device>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\s+\S+\s+\d+-\d+-\d+\w+:\d+:\d+\.\d+\w+\s+(?<src>\S+)\s+(?<app>[^\s]*)\s+-\s+\S+\s+.*?hip="(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"\s+gn="(?<gn>[^"]+)"\s+kscfqdn="(?<fqdn>[^"]+)"]\s+(?<key_value_list>.*)

[get_inner_fields]
SOURCE_KEY  = key_value_list
DELIMS = "\\r\\n", ":"

 

But it seems that only the first part (get_outer_fields) works and nothing happens in the second part (get_inner_fields). I also changed my configs to replace "\r\n" with ";". Here are my changes:

props.conf:

 

[kasperskylab:securitycenter:syslog]
SHOULD_LINEMERGE = false
KV_MODE = none
SEDCMD-event_cleaner = s/\\r\\n/;/g
REPORT-outer_fields = get_outer_fields, get_inner_fields

 

transforms.conf:

 

[get_outer_fields]
REGEX = ^(?<timestamp>[a-zA-Z]{3}\s+\d{1,2}\s+\d{1,2}:\d{1,2}:\d{1,2})\s+(?<device>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\s+\S+\s+\d+-\d+-\d+\w+:\d+:\d+\.\d+\w+\s+(?<src>\S+)\s+(?<app>[^\s]*)\s+-\s+\S+\s+.*?hip="(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"\s+gn="(?<gn>[^"]+)"\s+kscfqdn="(?<fqdn>[^"]+)"]\s+(?<key_value_list>.*)

[get_inner_fields]
SOURCE_KEY  = key_value_list
DELIMS = ";", ":"

 

But the result has not changed. Any idea?
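One alternative I'm considering (untested, and I'm not sure DELIMS even supports a multi-character separator) is to drop DELIMS and extract the inner pairs with a regex-based transform, assuming the key_value_list field captured by get_outer_fields is usable as SOURCE_KEY here:

[get_inner_fields]
SOURCE_KEY = key_value_list
# each pair starts at the beginning of the string or after a literal "\r\n" sequence;
# the key runs up to the first colon, the value up to the next "\r\n" or the end
REGEX = (?:^|\\r\\n)([^:]+):\s*(.*?)(?=\\r\\n|$)
FORMAT = $1::$2
MV_ADD = true

With CLEAN_KEYS left at its default, keys like "Application path" should come out as Application_path. Would something like this be a better direction?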

Any help is greatly appreciated. 


MoienABO
Loves-to-Learn Lots

Hi @gcusello,

Thanks for your answer. I use the second transformation because the second part of the log contains useful fields such as Application, User, and so on, and I need to extract them.

I also know about defining knowledge objects related to this add-on and will define them later, but right now field extraction is my concern, and it did not work even with your configuration. Maybe you're right and I should use the Add-On Builder app.


gcusello
SplunkTrust

Hi @MoienABO,

At first: why do you want to use the second transformation? Isn't the first one sufficient to extract all the fields?

Anyway, did you try to put the transformations in two different rows?

[kasperskylab:securitycenter:syslog]
SHOULD_LINEMERGE = false
KV_MODE = none
REPORT-outer_fields1 = get_outer_fields
REPORT-outer_fields2 = get_inner_fields
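You can also check your regexes directly in a search with the rex command before putting them into the conf files; for example, a quick test for one of the inner fields (the index name is only a placeholder) could be:

index=your_index sourcetype=kasperskylab:securitycenter:syslog
| rex field=_raw "Process ID: (?<process_id>\d+)"
| table _time host process_id

If the field appears here but not with your transforms, the problem is in the conf files and not in the regex.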

At least, when you create a custom add-on, use the Add-On Builder (https://splunkbase.splunk.com/app/2962) to normalize your data so you have a CIM-compliant add-on.

In a few words, you also have to (a rough example follows the list):

  • create some eventtypes and tags,
  • create aliases for your fields,
  • create calculated fields to normalize some field values.
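
For example (the names below are only placeholders and the tags depend on the CIM data model you map to, so adapt them to your data), the add-on could contain something like:

eventtypes.conf:

[kaspersky_protection_disabled]
search = sourcetype=kasperskylab:securitycenter:syslog etdn="Protection components are disabled"

tags.conf:

[eventtype=kaspersky_protection_disabled]
malware = enabled
operations = enabled

props.conf:

[kasperskylab:securitycenter:syslog]
FIELDALIAS-kaspersky_dest = hdn AS dest
EVAL-vendor_product = "Kaspersky Security Center"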

Ciao.

Giuseppe
