
Trend Micro Deep Security for Splunk - sourcetype rewrite in distributed environment?

akadmin
Observer

According to the documentation, I should be able to configure Trend Micro Deep Security to forward all module events to Splunk via syslog, write them to a file, and monitor that file as a Splunk input, with the sourcetype set to "deepsecurity".

 

On the app's "details" page on Splunk's website, it states:

"It is highly recommended that you follow the Splunk best practices for syslog, and that you configure rsyslog or syslog-ng to write syslog output to a file which can then be collected by a Splunk forwarder and sent to the Splunk server. You need to ensure that the Splunk forwarder sets the sourcetype to deepsecurity when forwarding events to a Splunk receiver."
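That recommendation can be sketched roughly like this (the file path, listening port, and index name are my assumptions, not from the app docs):

```ini
# /etc/rsyslog.d/deepsecurity.conf -- simplified sketch: receive syslog
# and write it to a file the forwarder can monitor
module(load="imudp")
input(type="imudp" port="514" ruleset="ds")
ruleset(name="ds") { action(type="omfile" file="/var/log/deepsecurity/deepsecurity.log") }

# $SPLUNK_HOME/etc/apps/my_trend_inputs/local/inputs.conf (on the forwarder)
[monitor:///var/log/deepsecurity/deepsecurity.log]
sourcetype = deepsecurity
index = trend
```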

 

I have an on-prem HF and a Splunk Cloud Search Head.  The app is installed on the Search Head, and the HF is tagging the Trend data with the "deepsecurity" sourcetype.  Then, as I understand it, the app is supposed to rewrite the sourcetype field to the appropriate module.

 

Ex:  deepsecurity -> deepsecurity-antimalware
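A rewrite like that is normally an index-time sourcetype transform in Splunk, which has to run on the first parsing tier the data passes through. A plausible sketch of what such a transform looks like (the stanza name and regex are illustrative, not copied from the app):

```ini
# props.conf
[deepsecurity]
TRANSFORMS-module = ds_antimalware

# transforms.conf -- rewrite the sourcetype when an Anti-Malware event is seen
[ds_antimalware]
REGEX = Anti-Malware
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::deepsecurity-antimalware
```

Because this runs at index time, it would have no effect if it only exists on a Search Head downstream of where the events are parsed.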

 

The problem

--------------------------

The Search Head receives the data, but the app never rewrites the sourcetype, so everything shows up as "deepsecurity" and none of the dashboard panels populate.

 

What am I doing wrong?  Do I need the Trend app on my HF as well?  The HF doesn't index anything.


maaneeel
Explorer

Hello,

 

I have the same problem in my Splunk environment (all servers are on-prem). I installed the app on the Heavy Forwarders, Indexers, and Search Heads, and I'm receiving all the logs, but every event is tagged as deepsecurity instead of deepsecurity-antimalware, deepsecurity-firewall, etc.

At the beginning I had the app on the Indexers and Search Heads, and the Heavy Forwarders had a simple app I created myself that just monitored the file written by rsyslog. I have now deployed the Deep Security app there and copied my inputs.conf into it.

 

What could I check?

 

Thanks

 


akadmin
Observer

Just FYI - I had absolutely no luck finding any support for this, so I just set up separate syslog listeners for each module and stored each stream under the sourcetype name the app was requesting.  Fortunately, Trend lets you configure it that way in the admin console (a different syslog stream per module), so it wasn't a tough setup; it just sucks that it doesn't work as advertised and there's no support.


maaneeel
Explorer

So, you deployed a different input on the Heavy Forwarders for each sourcetype and then set the corresponding sourcetype on each input, right?

 

Thanks for your quick reply!


akadmin
Observer

Yeah, so I'm just configuring syslog inputs on separate ports into different sourcetypes, one for each Trend module.

Ex:  I configure Trend to send syslog for the "Anti-Malware" module to my HF on port 5140. On the HF, I configure a syslog input on that port that feeds into a sourcetype called "deepsecurity-antimalware". Then I use port 5141, and so on, for the rest of the modules.
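In inputs.conf on the HF, that per-port setup might look like the following (the index name is an assumption, and the second stanza is just to show the pattern — use the sourcetype names the app's dashboards actually search for):

```ini
# $SPLUNK_HOME/etc/apps/my_trend_inputs/local/inputs.conf
# One syslog listener per Deep Security module
[udp://5140]
sourcetype = deepsecurity-antimalware
index = trend

[udp://5141]
sourcetype = deepsecurity-firewall
index = trend

# ...and so on for the remaining modules, one port each
```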

If you do that the app picks up the data in the proper dashboards on the search head.

 

Also, you may want to consider having syslog write to text files in /var/log on your HF, then configuring the HF to monitor each file into its sourcetype that way.  For some reason, syslog doesn't seem to be reliable when you feed it directly into the HF as a Splunk network input.
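A minimal sketch of that file-based variant, assuming rsyslog (the paths and ruleset names are hypothetical):

```ini
# /etc/rsyslog.d/deepsecurity.conf -- one listener per module, each to its own file
module(load="imudp")
input(type="imudp" port="5140" ruleset="ds_am")
ruleset(name="ds_am") { action(type="omfile" file="/var/log/trend/antimalware.log") }

# inputs.conf on the HF -- monitor each file into its module sourcetype
[monitor:///var/log/trend/antimalware.log]
sourcetype = deepsecurity-antimalware
index = trend
```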


maaneeel
Explorer

Hello,

 

A few minutes ago, I discovered that my problem was caused by another log management platform we have between Deep Security and Splunk. I thought it was sending the logs to the Heavy Forwarders, but it was actually sending them directly to the indexers, so the transformation was not being applied.

 

After sending the logs directly to the Heavy Forwarders, the sourcetype is being rewritten properly. Another change I made was to modify the files in the app's default directory and add the name of the index I'm using to the searches that only filtered on sourcetype.
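For example, a panel search that only filtered on sourcetype would change like this (the index name is illustrative):

```
Before: sourcetype=deepsecurity-antimalware | timechart count
After:  index=trend sourcetype=deepsecurity-antimalware | timechart count
```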

 

Thanks!
