Splunk Cloud Platform

Transforming error logs

vickyingle
Engager

I'm trying to transform an error log.

Below is a sample log (nginx_error)

2024/11/15 13:10:11 [error] 4080#4080: *260309 connect() failed (111: Connection refused) while connecting to upstream, client: 210.54.88.72, server: mpos.mintpayments.com, request: "GET /payment-mint/cnpPayments/v1/publicKeys?callback=jQuery360014295356911736334_1731369073329&X-Signature=plkb810sFSSSIbASLb818BMXxgtUM76QNvhI%252FBA%253D&X-Timestamp=1731368881376&X-ApiKey=CSSSAPXXXXXXPxmO7kjMi&X-CompanyToken=d1111e8lV1mpvljiCD2zRgEEU121p&_=1731369073330 HTTP/1.1", upstream: "https://10.20.3.59:28076//cnpPayments/v1/publicKeys?callback=jQuery360014295356911736334_17313690733...", host: "test.mintpayments.com", referrer: "https://vicky9.mintpayments.com/testing??asd

We are trying to ensure the following (expected result shown below):
1) GET query parameters are not logged
2) the referrer does not contain the query string
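
For example, after sanitizing, the two affected fields of the sample line should end up roughly like this (my assumption of the desired output):

request: "GET /payment-mint/cnpPayments/v1/publicKeys HTTP/1.1"
referrer: "https://vicky9.mintpayments.com/testing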

I have updated my config as below

[04:59 PM] [root@dev-web01 splunkforwarder]# cat ./etc/system/local/props.conf
[source::///var/log/devops/nginx_error.log]
TRANSFORMS-sanitize_referer = remove_get_query_params, remove_referer_query

[04:59 PM] [root@dev-web01 splunkforwarder]# cat ./etc/system/local/transforms.conf
[remove_get_query_params]
REGEX = (GET|POST|HEAD) ([^? ]+)\?.*
FORMAT = $1 $2
DEST_KEY = _raw
REPEAT_MATCH = true

[remove_referer_query]
REGEX = referrer: "(.*?)\?.*"
FORMAT = referrer: "$1"
DEST_KEY = _raw
REPEAT_MATCH = true

I verified that the regex is correct, and when I run the commands below to list the configuration, the changes are present:
/opt/splunkforwarder/bin/splunk btool transforms list --debug
/opt/splunkforwarder/bin/splunk btool props list --debug

Still, I can see no transformation in the logs. What could be the issue here?
We are using a custom splunkforwarder in our environment.

richgalloway
SplunkTrust

FWIW, REPEAT_MATCH is ignored when DEST_KEY=_raw.  I believe DEST_KEY is not needed here since FORMAT says where the capture groups go.

---
If this reply helps you, Karma would be appreciated.

isoutamo
SplunkTrust

Is this custom forwarder a Heavy Forwarder instead of Universal Forwarder?
You can use transforms.conf like this only on a HF.

Your sample didn't contain the ending " which your REGEX is expecting.

Shouldn't those regexes be more like https://regex101.com/r/iDjLlJ/1 and https://regex101.com/r/kuIxoI/1, since you are basically replacing _raw in both cases with your matching groups?

(.*)(GET|POST|HEAD) ([^? ]+)\?([^\"]+)(\".*)
=> $1$2 $3$5
(.*referrer: ")([^\?]+\?)\?([^"]+)(")
=> $1$2$4
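
Wired into your existing stanzas that would look roughly like this (an untested sketch built from the regex101 links above, so verify it against your real events first):

[remove_get_query_params]
REGEX = (.*)(GET|POST|HEAD) ([^? ]+)\?([^\"]+)(\".*)
FORMAT = $1$2 $3$5
DEST_KEY = _raw

[remove_referer_query]
REGEX = (.*referrer: ")([^\?]+\?)\?([^"]+)(")
FORMAT = $1$2$4
DEST_KEY = _raw

And again, this only takes effect on a HF, not on a UF.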

 

vickyingle
Engager

Is there any way I can transform these logs once I receive them in Splunk (Cloud)?
These are nginx error logs which contain sensitive data, and in nginx we cannot sanitize the error_log output.
Any suggestions will be highly appreciated.


isoutamo
SplunkTrust
The easiest way is to set up a HF on your own site to do it.
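
On that HF you can use the same props.conf/transforms.conf pair as above, or mask with SEDCMD in props.conf instead of TRANSFORMS. A rough sketch (my own untested regexes based on your sample, so check them against real events first):

[source::///var/log/devops/nginx_error.log]
SEDCMD-remove_get_query_params = s/(GET|POST|HEAD) ([^? ]+)\?[^" ]*/\1 \2/g
SEDCMD-remove_referer_query = s/(referrer: "[^?"]+)\?[^"]*/\1/g

Point the UF output at that HF and let the HF forward the masked events to Splunk Cloud.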

vickyingle
Engager

I'm using a universal forwarder, hence the transforms are not working. Appreciate your response.
