Splunk Enterprise

Help with a basic question about data ingestion in Splunk ES

jip31
Motivator

Hi

Unless I am mistaken, Splunk ES contains a collection of add-ons. In combination, these add-ons provide the dashboards, searches, and tools that summarize the security posture of the enterprise, allowing users to monitor and act on security incidents and intelligence.

Does this mean that Splunk ES works without any forwarder? How is the correlation done between these add-ons and the enterprise infrastructure? Is it automatic?

Is the data sent to the indexers, like with Splunk Enterprise, or just to a search head?

Sorry for these questions, but I am a rookie with Splunk ES and I need to understand how the security events are ingested.

Thanks


PickleRick
SplunkTrust

Enterprise Security is a "pack" of apps which provides you with additional features for tracking security events across assets and users, assessing risk, and so on. But it's still based on the data you have. So it's up to you to get the security-related events into Splunk. ES on its own just generates its own "internal" events based on searches performed on your "normal" data from your indexes.
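To give an idea of what that looks like in practice, an ES correlation search is essentially a scheduled search, typically over an accelerated CIM data model, whose results are turned into notable events by the notable alert action. A minimal sketch (the Authentication data model fields are real CIM fields, but the threshold and the use case are just made up for illustration):

  | tstats summariesonly=true count from datamodel=Authentication
      where Authentication.action="failure"
      by Authentication.src, Authentication.user
  | where count > 10

The raw authentication events themselves stay in your indexes; only the "internal" notable events generated from results like these live in ES's own index.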

It does use some additional tools to populate lookups which it uses for tracking assets, users, and threat intel, but this data is not stored as events in indexes.

Anyway, the good practice, which simplifies work with Splunk (and with third-party content like searches and whole apps), regardless of whether it's ES-related, security-related in general, or something else, is to make the data you're ingesting CIM-compliant.

You can always try Security Essentials first (https://splunkbase.splunk.com/app/3435) - this is an app which contains a huge library of use cases and lets you verify the quality of your data against what you need for particular use cases. That's a good start before attempting to implement ES.

gcusello
SplunkTrust

Hi @jip31,

Splunk ES is a Premium App, but it is still a Splunk app.

This means that it uses the Splunk infrastructure for its features.

In fact, in a correct ES implementation project, Splunk Professional Services recommends, as a best practice, installing ES after data ingestion is complete.

Anyway, data ingestion is performed by Splunk in the usual way (forwarders, syslog, etc.) using the usual TAs.
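Just as an illustration, the inputs are defined on the forwarders exactly as in any Splunk Enterprise deployment; the paths, sourcetypes, and index names below are only placeholders:

  # inputs.conf on a Universal Forwarder (placeholder paths, sourcetypes, indexes)
  [monitor:///var/log/secure]
  sourcetype = linux_secure
  index = os_logs

  # network input for syslog sources
  [udp://514]
  sourcetype = syslog
  index = network_logs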

This is a point of attention, because ES uses the CIM data models, so all the TAs used for data ingestion must be at least CIM 4.x compliant; if you have custom data sources, you have to develop a custom TA and check its CIM compliance.
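To give an idea, in a custom TA the CIM compliance is mostly a matter of props.conf knowledge objects that rename or calculate the CIM fields; the sourcetype and source field names here are hypothetical:

  # props.conf in a custom TA (hypothetical sourcetype and source fields)
  [my_firewall:traffic]
  FIELDALIAS-cim_src = source_address AS src
  FIELDALIAS-cim_dest = destination_address AS dest
  EVAL-action = if(fw_action="deny", "blocked", "allowed")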

Splunk data is usually extracted by ES and stored in the Data Models created by the CIM app (part of ES).
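Once a data model is populated (and accelerated), ES and your own searches read it with tstats instead of searching the raw events; a simple example against the CIM Network_Traffic data model:

  | tstats summariesonly=true count from datamodel=Network_Traffic
      where All_Traffic.action="blocked"
      by All_Traffic.src, All_Traffic.dest_port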

Ciao.

Giuseppe


jip31
Motivator

Thanks Giuseppe, I have some more questions, precisely about the data models.

Are the data models automatically included in Splunk ES, so we have nothing to install, contrary to the TAs that we need to install? Or are some TAs pre-installed in Splunk ES?

If the TA used for data ingestion is CIM 4.x compliant, is all the data ingested (for example by a forwarder or syslog) automatically stored in the data model?

Last thing: in the example below, what is `bogonlist_src_dest_subsearch'All_Traffic`? A macro, an eventtype stored in the data model, or something external to the data model?

| tstats count(sourcetype) from datamodel=Network_Traffic.ALL_Traffic where All_Traffic.action="allowed" AND `bogonlist_src_dest_subsearch'All_Traffic` by.....

 Thanks


gcusello
SplunkTrust

Hi @jip31,

ES is composed of many modules:

  • Domain Add-Ons (DA): views, UI components;
  • Tech Add-Ons (TA): input normalization;
  • Supporting Add-Ons (SA): searches, macros, data models, utilities.

Data models are in one of these SAs, called "Splunk Common Information Model" (https://splunkbase.splunk.com/app/1621), which you can also install outside ES.

Then in your ES settings, you have to choose which of the available Data Models you want to accelerate, depending on the data you have available.
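This is normally done from the ES/CIM setup UI, and it boils down to acceleration settings in datamodels.conf; a minimal sketch, where the summary ranges are just examples:

  # datamodels.conf (acceleration settings; the ranges are examples)
  [Authentication]
  acceleration = true
  acceleration.earliest_time = -30d

  [Network_Traffic]
  acceleration = true
  acceleration.earliest_time = -7d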

The only components that best practices suggest installing separately are the TAs, which could be more up to date than the ones in the ES package.

Anyway, the components to install are exactly described in the installation procedure: https://docs.splunk.com/Documentation/ES/7.1.0/Install/InstallEnterpriseSecurity

About CIM 4.x compliance: yes, it's mandatory, otherwise your data doesn't populate the Data Models. In fact, there's a problem when you must use data without a CIM 4.x compliant TA: you have to normalize it manually, using the Splunk Add-on Builder (https://splunkbase.splunk.com/app/2962) or the Splunk CIM Validator (https://splunkbase.splunk.com/app/2968), following the documentation at https://dev.splunk.com/enterprise/docs/developapps/testvalidate/appinspect or https://dev.splunk.com/enterprise/docs/developapps/testvalidate
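Keep in mind that, besides the field names, the data models select events by tags, so a manually normalized source also needs an eventtype with the right CIM tags. A sketch with a hypothetical sourcetype (network and communicate are the tags used by the Network Traffic data model):

  # eventtypes.conf (hypothetical sourcetype)
  [my_firewall_traffic]
  search = sourcetype="my_firewall:traffic"

  # tags.conf - tags that route the eventtype into the Network Traffic data model
  [eventtype=my_firewall_traffic]
  network = enabled
  communicate = enabled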

About using macros to filter Data Models: yes, it's possible, but remember that with Data Models you don't have the raw data; you have something like a database table with its fields, so the macros must use one of the Data Model fields and not the raw data.
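For example, such a filtering macro would be defined in macros.conf against the All_Traffic.* fields; the macro name and the subnets below are just placeholders:

  # macros.conf (hypothetical macro, filtering on a data model field only)
  [internal_src_only]
  definition = All_Traffic.src_ip IN ("10.*", "192.168.*")

and it can then be used inside the tstats where clause, for example:

  | tstats count from datamodel=Network_Traffic.All_Traffic
      where All_Traffic.action="allowed" AND `internal_src_only`
      by All_Traffic.src_ip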

Tell me if you need more help; otherwise, please accept one answer for the other people of the Community.

Ciao and happy splunking

Giuseppe

P.S.: Karma Points are appreciated 😉
