Splunk Enterprise Security

Why are OSSEC logs being sent to Splunk but not being processed by Splunk's IDS Datamodel?

DaMushroomCloud
Engager

Hello Splunk Community,

History of problem:
I was recently updating our OSSEC agents, and some needed to be reinstalled to fix them. While working through that list, the main OSSEC server was accidentally targeted and reinstalled as an agent, which meant reconfiguring everything from scratch because we did not have a backup. After a weekend of configuration recovery, all 100+ agents are reconnected and authenticated with new keys, and both the server and agent configs are updated. Everything is connected and communicating, alerts are being written to alerts.log, and local_rules.xml is up to date. We use the Splunk Universal Forwarder to ship the logs, and the log data does arrive in Splunk, but it does not look like it is being processed correctly. That is our issue.

Problem and End-Result Needed:
Splunk is receiving data from the OSSEC server, but the events are not being picked up by the IDS (Intrusion Detection) data model. The sourcetype assigned to the logs seems to be the root of the issue. We need either the data model to process these events or the Splunk Add-on for OSSEC to be properly configured. The add-on is installed on the Splunk server at the path below, but it is not fully configured:

/opt/splunk/etc/apps/Splunk_TA_ossec/
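For context, here is a quick way to check which sourcetypes the forwarded OSSEC data is actually landing under (the source path is from our environment; adjust the index and path as needed):

index=* source="/var/ossec/logs/alerts/alerts.log" | stats count by index, sourcetype

And to check whether anything is making it into the Intrusion Detection data model at all (assuming the root dataset is named IDS_Attacks, as in recent CIM versions):

| datamodel Intrusion_Detection IDS_Attacks search | head 20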

What we noticed in Splunk search:

Before:
sourcetype=alerts

Currently (needs fixing):
sourcetype=alerts-4 and sourcetype=alerts-5

The IDS (Intrusion Detection) data model needs to process the logs by severity and update the dashboard accordingly (a quick check for this is sketched right after this list)

The Splunk Add-on for OSSEC is not properly configured
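Our understanding is that sourcetypes like alerts-4 and alerts-5 appear when the monitor input does not set an explicit sourcetype, so Splunk auto-learns one from the file name and appends a number for each variant it encounters. A quick way to confirm whether any of these events carry the tags the Intrusion Detection data model is built on (tag=ids and tag=attack; the index name is an assumption from our environment):

index=ids tag=ids tag=attack | stats count by sourcetype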

Some screenshots of the issue are attached.
---------------------
Resources
---------------------

https://docs.splunk.com/Documentation/AddOns/released/OSSEC/Setup

https://docs.splunk.com/Documentation/CIM/5.0.1/User/IntrusionDetection

https://docs.splunk.com/Documentation/AddOns/released/OSSEC/Sourcetypes

Attachments: Alert_Log_Information2.png, sourcetype issue.png

1 Solution

DaMushroomCloud
Engager

We found the solution to this issue. However, our Splunk dashboard is still having difficulty understanding the severity level. Over time we will fix this and update this post; one possible stopgap is sketched at the end of this answer.

The problem was that we needed to edit the inputs.conf file on the universal forwarder to explicitly set the "alerts" sourcetype:

Path: /opt/splunkforwarder/etc/apps/Splunk_TA_nix_ossec/local/inputs.conf

Added:

[monitor:///var/ossec/logs/alerts/alerts.log]
disabled = false
index = ids
sourcetype = alerts
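
In case it helps anyone else: the change takes effect once the forwarder is restarted (/opt/splunkforwarder/bin/splunk restart), and the new sourcetype only applies to events indexed after that, so the old alerts-4 / alerts-5 events keep their original sourcetype. A quick sanity check after the restart (index name as in the stanza above):

index=ids sourcetype=alerts | head 10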

More information:
https://uit.stanford.edu/service/ossec/install-source
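
On the remaining severity problem: OSSEC writes the rule level into each alert (for example "Rule: 5712 (level 10) -> ..."), so as a stopgap, and purely a sketch rather than how the official add-on handles it, a search-time extraction in a local props.conf on the search head could pull that number into a field:

# location is an assumption; any app on the search head would work
# /opt/splunk/etc/apps/Splunk_TA_ossec/local/props.conf
[alerts]
# pull the numeric OSSEC rule level out of "(level N)" into a field we are calling severity_id
EXTRACT-ossec_level = \(level (?<severity_id>\d+)\)

The field name severity_id is our choice, not something the add-on defines; mapping the numeric level onto the severity values the data model expects (for example low/medium/high/critical) would still be needed before the dashboard understands it.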
