Splunk Enterprise

Splunk Fields Extraction - OpenTelemetry Collector

mattt
Loves-to-Learn

Good morning,

I’m experiencing an issue with the following log:

2025-05-13 15:41:41,340 DEBUG [org.jbo.res.rea.cli.log.DefaultClientLogger] (vert.x-eventloop-thread-1) requestId=31365aee-0e03-43bc-9ccd-fd465aa7a4ca Request: GET http://something.com/something/else Headers[Accept=application/json If-Modified-Since=Tue, 13 May 2025 04:00:27 GMT User-Agent=Quarkus REST Client], Empty body
2025-05-13 15:41:39,970 DEBUG [org.jbo.res.rea.cli.log.DefaultClientLogger] (vert.x-eventloop-thread-1) requestId=95a1a839-2967-4ab8-8302-f5480106adb6 Response: GET http://something.com/something/else, Status[304 Not Modified], Headers[access-control-allow-credentials=true access-control-allow-headers=content-type, accept, authorization, cache-control, pragma access-control-allow-methods=OPTIONS,HEAD,POST,GET access-control-allow-origin=* cache-control=no-cache server-timing=intid;desc=4e7d2996fd2b9cc9 set-cookie=d81b2a11fe1ca01805243b5777a6e906=abae4222185903c47a832e0c67618490; path=/; HttpOnly]

A bit of context that may be relevant: these logs are shipped using Splunk OTEL collectors.


In the _raw logs, I see the following field values:

Field      Value
requestID  95a1a839-2967-4ab8-8302-f5480106adb6 Response: GET http://something.com/something/else
requestID  requestId=31365aee-0e03-43bc-9ccd-fd465aa7a4ca Request: GET http://something.com/something/else

 

What I want is for the requestId and the Request or Response parts to be extracted into separate fields.

I’ve already added the following to my props.conf:

[sourcetype*]
EXTRACT-requestId = requestId=(?<field_request>[a-f0-9\-]+)
EXTRACT-Response = Response:\s(?<field_response>([A-Z]+)\s([^\s,]+(?:[^\r\n]+)))
EXTRACT-Request = Request:\s(?<field_request>([A-Z]+)\s([^\s,]+(?:[^\r\n]+)))

I verified on regex101 that the regex matches correctly, but it's not working in Splunk.

Could the issue be that the logs show Response: instead of Response=, so Splunk doesn’t treat it as a proper field delimiter? Unfortunately, I’m unable to modify the source logs.

What else can I check? Do I need to modify the .yml configuration for the Splunk OTEL collector, or should I stick to using props.conf and transforms.conf?
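For reference, the search-time equivalent I would expect to behave the same way as the props.conf extractions above looks roughly like this (the sourcetype is a placeholder for my real one, and I have given the requestId capture a distinct name here so the three test fields don't overlap):

index=* sourcetype=my_sourcetype
| rex field=_raw "requestId=(?<test_request_id>[a-f0-9\-]+)"
| rex field=_raw "Request:\s(?<test_request>[A-Z]+\s[^\s,]+)"
| rex field=_raw "Response:\s(?<test_response>[A-Z]+\s[^\s,]+)"
| table _time test_request_id test_request test_response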

 

Thank you in advance,

Best Regards.

Matteo


mattt
Loves-to-Learn

Hi @livehybrid, sorry for the late response. Unfortunately, the answer you provided does not work.


Still no extraction and the same behaviour.
Can you help me split the value and create the correct Splunk fields?
Thanks in advance,

Matt


livehybrid
SplunkTrust

Looking back, it looks like I might have pasted in the wrong bit, as I didn't add the "in <field>" part.

How about this?

EXTRACT-requestId = (?<field_requestId>[a-f0-9\-]{36}) in requestId
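If you want to sanity-check that pattern in isolation before touching props.conf again, a throwaway search along these lines (with one of your sample values pasted into a makeresults event) should show whether it fires:

| makeresults
| eval requestID="95a1a839-2967-4ab8-8302-f5480106adb6 Response: GET http://something.com/something/else"
| rex field=requestID "(?<field_requestId>[a-f0-9\-]{36})"
| table requestID field_requestId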



livehybrid
SplunkTrust

Hi @mattt 

I'm a little confused as to why requestId= is still present in the second event example.

If you want to run the regex extraction against the "requestID" field, then you need to add "in <fieldName>" to your extract:

<regex> in <src_field>

See the props.conf documentation for details.

For example:

EXTRACT-requestId = (requestId=)?(?<field_requestId>[a-f0-9\-]{36})
EXTRACT-Response = Response:\s(?<field_response>([A-Z]+)\s([^\s,]+(?:[^\r\n]+)))
EXTRACT-Request = Request:\s(?<field_request>([A-Z]+)\s([^\s,]+(?:[^\r\n]+)))

