Getting Data In

Sending Specific Events to Separate Indexes

jamie1
Communicator

Hi There,

I am currently trying to set up specific events to be sent to a separate index.

The documentation on how to do this was quite confusing for me.

I assume I am making a very obvious mistake.

I can provide any necessary information,

Any help would be appreciated,

Jamie

0 Karma

richgalloway
SplunkTrust

That method can be confusing. The typical use case, however, is to discard events that match a regular expression.  I haven't seen it used for changing the index name.

Try using INGEST_EVAL, instead.  See https://docs.splunk.com/Documentation/Splunk/9.1.0/Admin/Transformsconf#transforms.conf.spec for details.

INGEST_EVAL = index=if(EventCode==7036, "risk", "foo")

You also can use Ingest Actions.  See https://docs.splunk.com/Documentation/Splunk/latest/Data/DataIngest for more about that.
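For context, here is a minimal sketch of how INGEST_EVAL would be wired up (the sourcetype stanza and transform name are placeholders, and since EventCode may not exist as a field at index time, the sketch tests the raw event text instead):

props.conf:

[WinEventLog]
TRANSFORMS-route_risk = route_risk

transforms.conf:

[route_risk]
# Send events containing "EventCode=7036" to the "risks" index; leave everything else in its original index.
INGEST_EVAL = index=if(match(_raw, "EventCode=7036"), "risks", index)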

 

---
If this reply helps you, Karma would be appreciated.

gcusello
SplunkTrust

Hi @jamie1,

Let me understand: you want to override the default index definition on a per-event basis, is that correct?

If this is your need, on your Indexers or (if present) on your Heavy Forwarders, you have to:

on props.conf

[your_sourcetype]
TRANSFORMS-index = overrideindex

on transforms.conf

[overrideindex]
DEST_KEY = _MetaData:Index
REGEX = <your_regex>
FORMAT = my_new_index

The main problem is defining a regex that identifies the events whose index should be overridden, with respect to the original index defined in inputs.conf.
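For example, assuming the events to move contain a literal string such as EventCode=7036 (a placeholder; substitute the event IDs you actually need):

REGEX = EventCode=7036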

Ciao.

Giuseppe

0 Karma

jamie1
Communicator

I want events to be put in either the main index or an index called "risks" based on Windows event ID.

I have created an index named "risks" but am not sure how to filter the events there.

I attempted to implement the commands you recommended, but searching index="risks" shows nothing.

Jamie

0 Karma

gcusello
SplunkTrust

Hi @jamie1 ,

This means that you should configure index=main in inputs.conf, even if I don't like to use the main index; I always prefer to use an index other than main.

Then, on the Indexers or (if present) on the first Heavy Forwarder that the logs pass through, you have to add:

on props.conf (if wineventlog is the sourcetype of this data source):

[wineventlog]
TRANSFORMS-index = overrideindex

on transforms.conf (if the EventCodes to send to the risks index are 4624 or 4625 or 4634):

[overrideindex]
DEST_KEY = _MetaData:Index
REGEX = EventCode=(4624|4625|4634)
FORMAT = risks

Obviously, adapt the regex to your requirements and your logs; in other words, insert the EventCodes you need and check whether there are spaces inside this string (between EventCode and = and between = and the values). A space-tolerant variant is sketched below.
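For example, a variant that tolerates optional spaces around the equals sign (the EventCodes here are still only examples):

REGEX = EventCode\s*=\s*(4624|4625|4634)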

Pay attention to the location of these files: you must analyze your Splunk architecture and place them on the first full Splunk instance (not a Universal Forwarder) that the data source passes through.

Ciao.

Giuseppe

0 Karma

jamie1
Communicator

Hi Giuseppe,

I am using Splunk Cloud; would this make a difference to this process?

Jamie

0 Karma

gcusello
SplunkTrust

Hi @jamie1 ,

If you are using Splunk Cloud and the events of this data source are on premises, I suppose that your data passes through one or (better) two Heavy Forwarders acting as concentrators (to avoid opening routes from all of your systems to Splunk Cloud).

In this case, you can put these conf files on those Heavy Forwarders.

If you send logs directly from your Universal Forwarders to Splunk Cloud (which I don't recommend!), the only solution is to create an add-on containing these two files and upload it to Splunk Cloud; a minimal layout is sketched below.
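As a sketch, the add-on is just a small app directory containing the two conf files (the app name here is only an example), packaged and submitted to Splunk Cloud:

TA-index-override/
    default/
        app.conf
        props.conf
        transforms.conf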

Ciao.

Giuseppe

0 Karma

jamie1
Communicator

I do indeed use Universal Forwarders, not heavy forwarders.

0 Karma

gcusello
SplunkTrust

Hi @jamie1 ,

As I said, I always use one or (better) two Heavy Forwarders as concentrators to avoid opening connections between each UF and Splunk Cloud, and I suggest considering this evolution of your architecture.

Anyway, for now in your case the only solution is to create an add-on, containing the two conf files, and upload it to Splunk Cloud.

Ciao.

Giuseppe

0 Karma

jamie1
Communicator

From what you're saying, setting up the Heavy Forwarders seems to be the best long-term solution. In your opinion, which sort of device would be most suited to being a Heavy Forwarder?

Thanks for your help so far,

Jamie

0 Karma

gcusello
SplunkTrust

Hi @jamie1 ,

A Heavy Forwarder is a full Splunk Enterprise instance configured to forward all logs to Splunk Cloud.

It should be a normal Splunk server (12 CPUs and 12 GB RAM), but if you don't have too many events, you can also use fewer resources (8 CPUs and 8 GB RAM).

You should install on the HF the add-on that you downloaded from Splunk Cloud.

Then you have to configure all your Universal Forwarders to send their logs to the HF; a minimal outputs.conf sketch is below.

In this way you don't need to open routes between all UFs and Splunk Cloud; in addition, you can use this HF for syslog and, if you don't have too many UFs (fewer than 50), also as a Deployment Server.

The best approach is to have two HFs to avoid Single Points of Failure.

In this way you can make transformations before sending logs to Splunk Cloud.
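For reference, a minimal outputs.conf sketch for the Universal Forwarders (the hostnames are examples; 9997 is the conventional Splunk receiving port):

[tcpout]
defaultGroup = hf_group

[tcpout:hf_group]
server = hf1.example.com:9997, hf2.example.com:9997

With two servers listed, the UF automatically load-balances between them. On the HFs you would enable receiving on the same port, for example with a [splunktcp://9997] stanza in inputs.conf.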

Ciao.

Giuseppe

0 Karma

richgalloway
SplunkTrust

I did not provide any commands.  I offered an example transforms.conf setting that you would have to load on your indexer/Heavy Forwarder, restart, and then ingest new data.  I'm surprised you could do all of that in 3 minutes.

INGEST_EVAL = index=if(EventCode==7036, "risks", "main")
---
If this reply helps you, Karma would be appreciated.
0 Karma

jamie1
Communicator

Hi Rich,

That reply was not to your message; I have yet to try your solution.

Thanks for the advice though.

Jamie

0 Karma