
How to approach custom field extractions for new source with deployment server

Explorer

New to Splunk Enterprise. I'm confused about the best approach for configuring multiple field extractions for a new sourcetype in a multi-host deployment. I'm looking for search-time extractions on a log like the one below. We have a deployment server and non-clustered search heads. I can build field extractions using the built-in GUI tool, but I was told I need to migrate these to props.conf/transforms.conf files to get them working across my SH deployment. I've read through some of these files included in the TAs we've onboarded, but I'm not sure I'm following the mapping correctly.

  1. What files need what content to get a new key/value pair mapped out?
  2. I can use the built-in tool to extract more than one field in a single regular expression. Should I break these out into separate extractions or use one consolidated expression?
  3. For search-time extractions, do I need to deploy the config anywhere other than the search heads?

Example log:
INFO | jvm 3 | 2017/03/12 22:07:13 | ERROR [QueryExecutor[sourcesystem] ] [22:07:13,198]: No historical information could be found for any of the specified paths. Check that paths are correct. Paths: [example/path]

Existing props.conf file, put in place by PS to handle the wonky jvm log format:

[widget]
SHOULD_LINEMERGE=false
NO_BINARY_CHECK=true
CHARSET=UTF-8
MAX_TIMESTAMP_LOOKAHEAD=50
TRUNCATE=99999
disabled=false
TIME_FORMAT=%Y/%m/%d %H:%M:%S
TIME_PREFIX=^.*jvm\s1\s+|\s
LINE_BREAKER=([\r\n]+).*\d{4}/\d{2}/\d{2}\s\d{2}:\d{2}:\d{2}\s| \w


Path Finder

Hi uhaba,

The props.conf you show here contains settings that are not related to search-time field extractions; they control how Splunk should parse and index the data (line breaking, timestamp recognition, and so on).

The Splunk docs for props.conf and transforms.conf are very good, and the same goes for the deployment server docs.
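To sketch what the mapping could look like for your sample event (the field names log_level, component, and message, and the extraction class names, are assumptions for illustration; I'm reusing your [widget] sourcetype stanza): a search-time extraction can live entirely in props.conf via an EXTRACT- setting, or in props.conf plus transforms.conf via REPORT-. Either way, one regex with several named capture groups can pull out multiple fields at once.

```ini
# props.conf -- search-time settings, deployed to the search heads
[widget]
# Option 1: inline extraction; each named capture group becomes a field.
# Against the sample event this yields log_level=ERROR,
# component=QueryExecutor[sourcesystem], and message=the trailing text.
EXTRACT-widget_fields = \|\s+(?<log_level>\w+)\s+\[(?<component>.+?)\s*\]\s+\[[^\]]+\]:\s+(?<message>.+)

# Option 2: reference a stanza in transforms.conf instead
# REPORT-widget_fields = widget_extract
```

```ini
# transforms.conf -- only needed if you use the REPORT- option above
[widget_extract]
REGEX = \|\s+(?<log_level>\w+)\s+\[(?<component>.+?)\s*\]\s+\[[^\]]+\]:\s+(?<message>.+)
```

On your questions 2 and 3: a single consolidated regex with multiple named groups is perfectly valid and avoids running several regexes per event, though separate EXTRACT- settings are easier to maintain when fields change independently. Search-time extractions only need to reach the search heads; package the files in an app and have the deployment server push it to the SHs via a server class in serverclass.conf.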
