Getting Data In

Splitting of sourcetype

rahulg
Explorer

Hello there

I am monitoring files using inputs.conf and defining the source and sourcetype there.

I am trying to split this sourcetype into multiple sourcetypes.

 

inputs.conf

[monitor:///opt/splunk/etc/apps/out/bin/out/.../*.gz]
disabled=0
index=security_abc_index
sourcetype=abd_s3
source=abd
interval=60

 

This is props.conf, where I am doing the parsing:

[abd_s3]
LINE_BREAKER = "{"
NO_BINARY_CHECK = 1
TRUNCATE = 0
SHOULD_LINEMERGE = false
TRANSFORMS-splitsourcetype = event1,  event2, event3,  event4

 

And this is transforms.conf. event2, event3, and event4 have the regexes for the sourcetypes I want to assign; everything that does not match any of those regexes should go to event1:

[event1]
DEST_KEY = MetaData:Sourcetype
REGEX = .
FORMAT = sourcetype::event1

[event2]
DEST_KEY = MetaData:Sourcetype
REGEX = \{\"AgentLoadFlags\".*
FORMAT = sourcetype::event2


[event3]
DEST_KEY = MetaData:Sourcetype
REGEX = \{\"GatewayIP\".*
FORMAT = sourcetype::event3


[event4]
DEST_KEY = MetaData:Sourcetype
REGEX= \{\"ComputerName\".*
FORMAT = sourcetype::event4

 

In the index, all the output I am getting is in sourcetype event1 (the one meant for events not matching the regexes).

Whatever matches the regexes is not getting monitored, not even indexed. Am I doing anything wrong?

1 Solution

richgalloway
SplunkTrust

The regex in the [event1] transform matches everything.  That's why everything is in that sourcetype.  Try changing the order in the TRANSFORMS attribute.

TRANSFORMS-splitsourcetype = event2, event3, event4, event1
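
In context, that would look roughly like this in the existing [abd_s3] stanza in props.conf (a minimal sketch; the other settings and the event2/event3/event4 transforms from your original post stay unchanged):

[abd_s3]
# specific transforms first, catch-all event1 last;
# transforms in this list are applied in the order they are listed
TRANSFORMS-splitsourcetype = event2, event3, event4, event1

These are index-time settings, so only data indexed after the change will pick up the new sourcetypes.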
---
If this reply helps you, Karma would be appreciated.


rahulg
Explorer

Thank you
