Splunk Search

How to filter out logs during indexing using regex?

anandhalagaras1
Contributor

Hi Team,


I want to filter out logs at index time itself, i.e. if an event arrives in the format below, containing GET / - 111, it should not be ingested into Splunk.

Kindly help with the props.conf and transforms.conf settings for this.

Sample Event:

2020-07-22 12:53:53 xx.xxx.xx.xx GET / - 111 - xx.xxx.x.xxx - - xxx x x xx

 


to4kawa
Ultra Champion

https://docs.splunk.com/Documentation/Splunk/8.0.5/Forwarding/Routeandfilterdatad

transforms.conf

[null]
REGEX = GET\s/\s-\s111
DEST_KEY = queue
FORMAT = nullQueue
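For the [null] transform to take effect, it also needs to be referenced from props.conf on the parsing tier (the heavy forwarder in this setup). A minimal sketch, assuming your data arrives under a sourcetype named your_sourcetype (a placeholder; replace it with the actual sourcetype of these events):

props.conf

[your_sourcetype]
TRANSFORMS-null = null

The stanza name after TRANSFORMS- (here "null") must match the stanza name in transforms.conf.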

 


anandhalagaras1
Contributor

@to4kawa,

Even though I used your props and transforms, I can still see the logs being ingested into Splunk. My setup: the UF is installed on the client machines, its outputs point to the HF, and from the HF the data reaches the Splunk Cloud indexers. I placed the props and transforms on the HF as you mentioned, but I can still see the logs in the search head.

Sample log format for reference. If "GET / - 111" appears in an event, that event should not be ingested into Splunk. Kindly help with the props and transforms.

2020-07-24 02:27:38 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 67
2020-07-24 02:27:34 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 67
2020-07-24 02:27:29 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 6
2020-07-24 02:27:29 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 6
2020-07-24 02:27:23 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 67
2020-07-24 02:27:19 09.876.543.21 GET / - 111 - 12.345.6.789 - - 123 4 5 67


gcusello
SplunkTrust

Hi @anandhalagaras1,

Try adding a backslash before the slash in the regex suggested by @to4kawa; in other words, try this regex:

GET\s\/\s-\s111

As you can see at https://regex101.com/r/kloFhs/1, if you don't escape the slash, the regex doesn't match.
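A quick way to sanity-check this regex against the sample event outside Splunk is with any PCRE-compatible engine. A minimal sketch in Python (the pattern is from this thread; the IP addresses are made-up placeholders, since the originals were masked):

```python
import re

# Regex from the accepted answer: escaped slash between \s tokens
pattern = re.compile(r"GET\s\/\s-\s111")

# Sample event in the thread's format (IPs are placeholders)
dropped = "2020-07-22 12:53:53 10.0.0.1 GET / - 111 - 10.0.0.2 - - 100 0 0 15"

# This event matches, so Splunk would route it to the nullQueue and drop it
print(bool(pattern.search(dropped)))  # True

# An event with a different path/status does not match and is kept
kept = "2020-07-22 12:53:53 10.0.0.1 GET /index.html - 200 - 10.0.0.2 - - 100 0 0 15"
print(bool(pattern.search(kept)))  # False
```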

Ciao.

Giuseppe


anandhalagaras1
Contributor

Thank you. It worked like a charm.

0 Karma

gcusello
SplunkTrust

Hi @anandhalagaras1,

Good, happy Splunking!

Ciao, and see you next time.

Giuseppe

P.S.: Karma Points are appreciated 😉
