Data coming into Splunk Enterprise

rahulkumar
Path Finder

Hi All

My issue is that I have Logstash data coming into Splunk. The sourcetype is httpevent and the logs arrive in JSON format. I need to know how I can use this data to find something meaningful. Also, with Windows forwarders we get event codes, so I can block unwanted event codes that give repeated information; what can I do with Logstash data if I want to do something similar? How do I extract information that we can use in Splunk?
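For reference, the closest Splunk equivalent of blocking Windows event codes is routing unwanted events to the nullQueue at index time. A minimal sketch, assuming the sourcetype is httpevent and that a regex (the DEBUG pattern here is just a placeholder) identifies the noisy events:

```
# props.conf -- attach the filter to the incoming sourcetype (assumed name)
[httpevent]
TRANSFORMS-drop_noise = drop_noisy_events

# transforms.conf -- events matching REGEX are discarded before indexing
[drop_noisy_events]
REGEX = DEBUG\s+Stability run result
DEST_KEY = queue
FORMAT = nullQueue
```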


rahulkumar
Path Finder

Hi, I checked and the fields are correct, but I want to know whether using spath for extraction from the JSON and using props and transforms give the same result or not. I am getting the same message value, still unstructured as before, after using the statement below in the conf file:
 [securelog_override_raw]
INGEST_EVAL = message := json_extract(_raw, "message")

The value in message is still unstructured, with lots of messy data like the example below. Do I have to separate this myself with search queries in this case?

2025-02-11 20:20:46.192 [com.bootstrapserver.runtim] DEBUG Stability run result : com.cmp.bootstrapserver.runtime.internal.api.RStability@36464gf
2025-02-11 20:20:46 [com.bootstrapserver.runtim] DEBUG Stability run result :com.cmp.bootstrapserver.runtime.interndal.api.RStability@373638cgf

After spath the same message came from the message field, and now using the conf file with props and transforms it's still the same. Will it only extract like this?


gcusello
SplunkTrust

Hi @rahulkumar ,

the results using INDEXED_EXTRACTIONS=JSON or spath should be the same.

The advantage of the first one is that it's automatic: you don't need to use the spath command in every search.

The problem with not transforming message into _raw is that the standard add-ons usually don't work with this data structure, because it's different from the one they expect.
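For example, index-time JSON extraction is switched on with a single props.conf attribute. A minimal sketch, assuming a dedicated sourcetype name of your choosing:

```
# props.conf -- hypothetical sourcetype for the Logstash HTTP events
[logstash:json]
INDEXED_EXTRACTIONS = JSON
# avoid extracting the same fields a second time at search time
KV_MODE = none
AUTO_KV_JSON = false
```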

Ciao.

Giuseppe


rahulkumar
Path Finder

So, if I am getting the data below:

This is using spath on the message field:

2025-02-11 20:20:46.192 [com.bootstrapserver.runtim] DEBUG Stability run result
com.cmp.bootstrapserver.runtime.internal.api.RStability@36464gf

2025-02-11 20:20:46 [com.bootstrapserver.runtim] DEBUG Stability run result :com.cmp.bootstrapserver.runtime.interndal.api.RStability@373638cgf

This is using props and transforms on the message field:

2025-02-11 20:20:46.192 [com.bootstrapserver.runtim] DEBUG Stability run result
com.cmp.bootstrapserver.runtime.internal.api.RStability@36464gf

2025-02-11 20:20:46 [com.bootstrapserver.runtim] DEBUG Stability run result :com.cmp.bootstrapserver.runtime.interndal.api.RStability@373638cgf

Both ways got the same data, so is that correct or wrong? Or should it be different after using props and transforms? I need to know this first, and if it's wrong, then what?


gcusello
SplunkTrust

Hi @rahulkumar ,

as I said, the message field contains the original _raw field, in other words the original event.

So you have to restore the original event by deleting the additional fields of the JSON structure; otherwise the standard add-ons don't read the events correctly.

The configurations I hinted at perform this restore: they extract the metadata from the JSON fields and put the original event back in _raw.

You cannot use spath because the parsers work on the _raw field; for this reason you have to configure the restore of the original event using props.conf and transforms.conf.
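To illustrate the difference: spath only creates a search-time field and leaves _raw untouched, so the add-on parsers never see the extracted value. A sketch of the search-time approach, using the index and sourcetype mentioned in this thread:

```
index="mycloud" sourcetype="httpevent"
| spath input=_raw path=message output=message
| table _time message
```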

Ciao.

Giuseppe


rahulkumar
Path Finder

Can you please check whether the syntax and everything else is correct?

I have used the same thing in my terminal, and after this I am running the following search in the Search & Reporting search bar:
index="mycloud" sourcetype="httpevent" | table message

props.conf


[source::http: LogStash]
sourcetype = httpevent
TRANSFORMS-00 = securelog_set_default_metadata
TRANSFORMS-01 = securelog_override_raw


transforms.conf

[securelog_set_default_metadata]
INGEST_EVAL = host = json_extract(_raw, "host.name")

[securelog_override_raw]
INGEST_EVAL = _raw = json_extract(_raw, "message")


gcusello
SplunkTrust

Hi @rahulkumar ,

the configurations seem to be correct, but the only effective confirmation is yours: do they work?

If your search gives you the events without JSON and with the correct host metadata, they are correct.

Just one additional piece of information: I see that you still have the original sourcetype httpevent, which is not useful for your parsing rules, so I suggest adding another rule to assign the correct sourcetype, again starting from the JSON fields.

E.g. if you have a field called, say, path.file.name, and these are Linux secure logs whenever it contains the value "/var/log", you could use these configurations:

props.conf

[source::http: LogStash]
sourcetype = httpevent
TRANSFORMS-00 = securelog_set_default_metadata
TRANSFORMS-01 = securelog_override_sourcetype
TRANSFORMS-02 = securelog_override_raw

transforms.conf

[securelog_set_default_metadata]
INGEST_EVAL = host = json_extract(_raw, "host.name")

[securelog_override_sourcetype]
# assign the target sourcetype (the name here is an example) when the JSON path matches
INGEST_EVAL = sourcetype := if(json_extract(_raw, "path.file.name")=="/var/log", "linux_secure", sourcetype)

[securelog_override_raw]
INGEST_EVAL = _raw = json_extract(_raw, "message")

In this way you assign the correct sourcetype to your logs.

Obviously, you have to analyze your logs, identifying all the different types of logs and the rules that identify each of them; then you can insert these rules into the case() of the second transformation.
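As a sketch of what such a multi-type rule could look like (the path values and target sourcetype names here are illustrative assumptions, not taken from your data):

```
# transforms.conf -- one case() branch per recognized log type
[securelog_override_sourcetype]
INGEST_EVAL = sourcetype := case(json_extract(_raw, "path.file.name")=="/var/log/secure", "linux_secure", json_extract(_raw, "path.file.name")=="/var/log/messages", "linux_messages_syslog", true(), sourcetype)
```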

It's important that all the transformations that use JSON fields run before the final transformation of _raw.

Ciao.

Giuseppe


rahulkumar
Path Finder

Yes, the fields are correct, but the values coming out are the same as what I was getting with the spath statement. Is there any difference I will get if I use props.conf and transforms.conf?


gcusello
SplunkTrust

Hi @rahulkumar ,

I worked on a project in which we were receiving logs collected and exported in Logstash format.

The problem is that you cannot use the normal add-ons because the format is different.

You have two choices:

modify all the parsing rules of the add-ons you use;

convert your Logstash logs to the original format: it isn't a simple job, and it's a long one!

In a few words, you have to extract the metadata from the JSON using INGEST_EVAL and then put the original log field back into _raw.

For more info see https://conf.splunk.com/files/2020/slides/PLA1154C.pdf

Ciao.

Giuseppe
