Getting Data In

Unexpected results with field values - Splunk Enterprise

akarivaratharaj
Communicator

With some events, we are seeing query results in an unexpected format. In the raw events themselves there is no issue at all, and each field shows its own value. But when the data is queried and the results are displayed in the Statistics section, the values of a few fields are displayed incorrectly.

Usually the search results show clean key-value pairs. But for some events the results appear as "fieldname1=fieldname1=value", and in some cases as "fieldname1=fieldname3=value".

Example1: Request_id=Request_id=12345

(Expected to be -> "Request_id=12345")

Example2: Parent_id=message_id=456

(Expected to be -> "Parent_id=321")

Example3: Parent_id=category=unknown

(Expected to be -> "Parent_id=321")

Is this related to the parser or something else? We are unable to work out what the issue could be here.

Could anyone please help us fix this issue at the earliest?


akarivaratharaj
Communicator

This issue was resolved after making a few changes to props.conf, where the field extraction is set.
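For anyone who lands here with the same symptom: the exact change is not detailed above, but as a minimal sketch, one common way to define the extraction so that only the value (and not the key name) is captured looks like this (the sourcetype name and regexes are placeholders, not our actual configuration):

# props.conf (search-time) - illustrative placeholders only
[my_custom_sourcetype]
# Keep the key name outside the capture group so only the value is extracted
EXTRACT-request_id = Request_id=(?<Request_id>\d+)
EXTRACT-parent_id = Parent_id=(?<Parent_id>\d+)
# Or rely on automatic key=value extraction for clean pairs
KV_MODE = auto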


ITWhisperer
SplunkTrust

It looks to me like you either have a problem with your data (raw events), your ingest config (e.g. transforms.conf), or your search query. Unfortunately, since you have shared none of these, it is rather difficult to offer anything more constructive.
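For what it's worth, output shaped like "fieldname1=fieldname1=value" often comes from an extraction whose capture group includes the key name as well as the value. A run-anywhere illustration, purely hypothetical and using the field names from your examples:

| makeresults
| eval _raw="Request_id=12345 Parent_id=321 message_id=456 category=unknown"
| rex field=_raw "(?<Request_id>Request_id=\d+)"
| table Request_id

Here the capture group swallows the literal "Request_id=" prefix, so the table shows Request_id as "Request_id=12345", the same shape as your Example 1.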


akarivaratharaj
Communicator

When I run a search there is no issue with the raw events. In the Events tab everything looks correctly formatted, so I can't say there is a data quality issue in the events themselves.

It is only when the results are viewed in the Statistics tab that I see this, and it happens only with some of the events in the result set. I have attached screenshots of the normal results and of the results with the data quality issue.

Expected results with Request Id and other fields.

akarivaratharaj_0-1716459479128.png

But this is what is being displayed (refer to the highlighted rows):

akarivaratharaj_3-1716460402285.png

 

Here is the event for one of the request IDs where the key-value pairs are in the expected format:

akarivaratharaj_4-1716460839107.png

ITWhisperer
SplunkTrust

What is the search?


ITWhisperer
SplunkTrust

How are the fields extracted?


akarivaratharaj
Communicator

I am using just the table command

index=main host=* sourcetype=* source=* | table _time, Request_id, Future_id


ITWhisperer
SplunkTrust

So it looks like it is to do with how the fields are extracted. Please can you share these details?
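In the meantime, a quick check you could run to isolate the affected events is to re-extract the value from _raw and compare it with the field your existing extraction produced (the regex below is only a guess based on your examples, so adjust it to your data):

index=main host=* sourcetype=* source=*
| rex field=_raw "Request_id=(?<Request_id_raw>\d+)"
| where Request_id!=Request_id_raw
| table _time, Request_id, Request_id_raw, _raw

If the two columns differ, the problem is in the existing extraction rather than in the raw data.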


ITWhisperer
SplunkTrust

Also you shared (a picture of) an event which works, but not one which doesn't. Please can you share the raw text of a "failing" event in a code block (rather than a picture) - you can obfuscate any sensitive details as appropriate.


akarivaratharaj
Communicator

Actually, I did share a picture of the raw event for the failing ones (I just masked the confidential fields). They look similar to the other events which work.


ITWhisperer
SplunkTrust

So it looks like it is to do with how the fields are extracted. Please can you share these details?


akarivaratharaj
Communicator

I have observed one more thing with these failing events. In the event section, the default fields like host, sourcetype, etc. are usually appended and displayed at the end of each event.

In addition to those default fields, I can see that the Request_ID field is also displayed in that section after each event, and there its value appears in the unexpected format.

Please check the screenshot below (after the field CT=1, the section of default fields is shown).

akarivaratharaj_0-1716463332683.png
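If it helps, this can also be quantified with a quick search that counts how many events end up with an "=" inside the extracted value (based on the same table search I posted above; purely illustrative):

index=main host=* sourcetype=* source=*
| where like(Request_id, "%=%")
| stats count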

ITWhisperer
SplunkTrust

This is showing that the fields have been extracted incorrectly.

I will ask again: please could you share the configuration which is being used to extract the fields for this sourcetype - this is likely to be where your problem lies, so if you want a resolution, you are going to have to give us more information.
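If you are not sure where the extraction is defined, btool will show the effective settings and which .conf file each one comes from (replace <your_sourcetype> with the actual sourcetype name):

$SPLUNK_HOME/bin/splunk btool props list <your_sourcetype> --debug
$SPLUNK_HOME/bin/splunk btool transforms list --debug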
