Getting Data In

Detect/handle parsing error and log format change

ikulcsar
Communicator

Hi,

I have been asked about log parsing and parser error detection in Splunk.

In general, the questions are:
- How can and should I detect parsing errors in Splunk (e.g. a new version of the log source is deployed without notifying the Splunk admins)?
- How should I handle a new log format? There is already data in the index with the old sourcetype. If I modify the sourcetype definition, won't that break the search-time field extractions for the old data? Should I clone and modify the sourcetype?

I can't find a guide or best practice for this in the docs...

Thanks,
István

1 Solution

gcusello
SplunkTrust

Hi ikulcsar,
to answer your questions:

  1. You can detect parsing errors by identifying, for each sourcetype, one or more control fields (e.g. two or three fields with a limited set of expected values), storing the expected values in one or more lookups, and checking them periodically (e.g. once a day); in other words: pick a field, put all of its correct values in a lookup, and check whether events contain values that are not in the lookup. If they do, there may be a parsing error to investigate manually (see the search sketch after this list).
  2. To detect new sourcetypes you can use the same method: put all the expected sourcetypes in a lookup and run a scheduled search against it.
  3. To handle a modified sourcetype you can take different approaches depending on your situation: a. create a new sourcetype for the modified logs and manage the two versions with eventtypes (it is good practice to use eventtypes in searches anyway); b. keep the old field extractions, add new extractions for the new format, and define a calculated field using the coalesce function (eval my_field=coalesce(new_field,my_field)) so searches see a single field regardless of the log version (see the configuration sketch after this list).
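
As an illustration of point 1, here is a minimal scheduled-search sketch. All names are examples: it assumes a web access sourcetype and a lookup definition called expected_status_values with a column named status; adapt them to your own data.

    index=web sourcetype=access_combined
    | stats count by status
    | lookup expected_status_values status OUTPUT status AS known_value
    | where isnull(known_value)

Any row returned is a status value that is not in the lookup, which may indicate a format change. A companion search such as index=web sourcetype=access_combined NOT status=* can also count events where the control field did not extract at all, which is usually the clearest sign that the extraction broke.

For point 2, the same pattern applied to sourcetypes, assuming a lookup definition called expected_sourcetypes with a column named sourcetype:

    | tstats count where index=* by sourcetype
    | lookup expected_sourcetypes sourcetype OUTPUT sourcetype AS known_sourcetype
    | where isnull(known_sourcetype)

For point 3, a rough configuration sketch of both options; the sourcetype, eventtype, field names and regexes are only placeholders:

    # eventtypes.conf - option a: one eventtype covering old and new sourcetypes
    [my_app_logs]
    search = sourcetype=my_app OR sourcetype=my_app_v2

    # props.conf - option b: keep the old extraction, add one for the new format,
    # and let a calculated field return whichever is present
    [my_app]
    EXTRACT-user_old = user=(?<my_field>\S+)
    EXTRACT-user_new = "user":"(?<new_field>[^"]+)"
    EVAL-my_field = coalesce(new_field, my_field)

With option a, searches and dashboards reference eventtype=my_app_logs instead of a single sourcetype; with option b, existing searches keep using my_field and do not need to know which log version produced the event.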

I hope this is helpful for you.

Bye.
Giuseppe


ikulcsar
Communicator

Hi Giuseppe,

Thank you, we will try these tips.

Regards,
István
