Getting Data In

Indexed Extractions not working properly

klowke_svbz
Observer

Hi all,

We collect some JSON data from a logfile with a Universal Forwarder.
Most of the time the events are indexed correctly with the fields already extracted, but for a few events the fields are not extracted automatically.

If I reindex the same events, the indexed extraction also works fine.

I did not find any entries in splunkd.log indicating a problem.

The following props.conf is on the Universal Forwarder and the Heavy Forwarder (maybe someone could explain which parameters are needed on the UF and which on the HF):

[svbz_swapp_task_activity_log]
CHARSET=UTF-8
SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
INDEXED_EXTRACTIONS=json
KV_MODE=none
category=Custom
disabled=false
pulldown_type=true
TIMESTAMP_FIELDS=date_millis
TIME_FORMAT=%s%3N


The following props.conf is on the Search Head:

[svbz_swapp_task_activity_log]
KV_MODE=none

The first time, when it was indexed automatically, it looks like this:

[Screenshot: klowke_svbz_2-1713950346225.png]

When I reindex the same event into another index, it looks fine:

[Screenshot: klowke_svbz_1-1713950230552.png]

In the last 7 days it worked correctly for about 32,000 events, but for 168 events the automatic field extraction did not work.
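
For context, I counted the affected events with a search roughly like this (the index name is a placeholder; I simply checked for a field that should always be present, e.g. task_id):

index=my_index sourcetype=svbz_swapp_task_activity_log earliest=-7d
| eval indexed_extraction=if(isnull(task_id), "missing", "ok")
| stats count by indexed_extraction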

Here is an example event:

{"task_id": 100562, "date_millis": 1713475816310, "year": 2024, "month": 4, "day": 18, "hour": 23, "minute": 30, "second": 16, "action": "start", "step_name": "XXX", "status": "started", "username": "system", "organization": "XXX", "workflow_id": 14909, "workflow_scheme_name": "XXX", "workflow_status": "started", "workflow_date_started": 1713332220965, "workflow_date_finished": null, "escalation_level": 0, "entry_attribute_1": 1711753200000, "entry_attribute_2": "manual_upload", "entry_attribute_3": 226027, "entry_attribute_4": null, "entry_attribute_5": null}

Does someone have an idea why it sometimes works and sometimes does not?

If I now changed KV_MODE on the Search Head, the fields would be shown correctly for these 168 events, but for all the other events the fields would be extracted twice. Using spath with the same field names would extract them only once.
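
To illustrate what I mean by the spath workaround, a search sketch along these lines (the index name is a placeholder; field names are taken from the sample event above):

index=my_index sourcetype=svbz_swapp_task_activity_log
| spath
| table _time task_id action step_name status username workflow_id workflow_status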

What is the best workaround for already indexed events to get proper search results?

Thanks and kind regards
Kathrin
