Getting Data In

Why is the KV pair extraction with custom delimiters not working?

Communicator

Hello all,

I'm trying to get field extraction working on a log with dynamic key-value pairs.

I've tried the following without success (and I'm open to other approaches entirely).

Ideally the output should be:

Thread=5\=/blah/blah
Method=GET
URI=/
Protocol=HTTP/1.1
IP=1.2.3.4
Port=54809
Referer=https://referrer
field=value
.
.
.
field=value

props.conf

[testsourcetype_log]
CHARSET=UTF-8
KV_MODE=none
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
category=Testing
description=Test KV log sourcetype
disabled=false
pulldown_type=true
REPORT-kv=kv_extraction
EXTRACT-status=^(\d{4}\-\d{2}\-\d{2}T\d{2}:\d{2}:\d{2})\s\[(?<status>\w+)
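As a side note, the EXTRACT-status regex can be sanity-checked outside Splunk. A quick Python sketch (note that Python's `re` needs `(?P<...>)` for named groups, where Splunk's PCRE accepts `(?<...>)`; sample line taken from the log snip below):

```python
import re

# Same pattern as EXTRACT-status above, with the named group in Python syntax
pattern = r'^(\d{4}\-\d{2}\-\d{2}T\d{2}:\d{2}:\d{2})\s\[(?P<status>\w+)'
line = '2019-03-01T09:42:01 [status] [Thread: 5=/blah/blah] [Method: GET]'

m = re.search(pattern, line)
print(m.group(1))         # the timestamp
print(m.group('status'))  # the bracketed status word
```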

transforms.conf

[kv_extraction]
DELIMS = "]", ":"
MV_ADD=true   
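For what it's worth, DELIMS = "]", ":" asks Splunk to split the event into pairs on `]` and each pair into key/value on `:`. A rough Python simulation of that splitting (an assumption about the mechanics, not actual Splunk code) shows why it can't yield clean field names on this data:

```python
# Rough simulation of DELIMS = "]", ":" (assumed mechanics, not Splunk code):
# split the event into pairs on ']', then each pair into key/value on ':'
line = '2019-03-01T09:42:01 [status] [Thread: 5=/blah/blah] [Method: GET]'

pairs = {}
for chunk in line.split(']'):
    if ':' in chunk:
        key, _, value = chunk.partition(':')
        pairs[key] = value

# Keys come out as ' [Thread', ' [Method', etc. -- the stray '[' and
# leading spaces stay glued on, and the timestamp's ':' gets split too
print(pairs)
```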

log snip:

2019-03-01T09:42:01 [status] [Thread: 5=/blah/blah] [Method: GET] [URI: /blah/blah]  [Protocol: HTTP/1.1] [IP: 1.2.3.4] [Port: 54809] [Referer: https://referrer] [..] ... [..] text string here

References:
https://www.splunk.com/blog/2008/02/12/delimiter-based-key-value-pair-extraction.html
https://answers.splunk.com/answers/170826/set-delimiter.html

Thanks in advance

UPDATE 6/25:
I've tried combinations from @FrankVl, @VatsalJagani, and @woodcock, but none of them seem to work.

Naturally, I've restarted Splunk after each change. Here is the output from btool to show that I'm not going insane:

/opt/splunk/bin/splunk cmd btool props list
[testsourcetype_log]
ADD_EXTRA_TIME_FIELDS = True
ANNOTATE_PUNCT = True
AUTO_KV_JSON = true
BREAK_ONLY_BEFORE =
BREAK_ONLY_BEFORE_DATE = True
CHARSET = UTF-8
DATETIME_CONFIG = /etc/datetime.xml
DEPTH_LIMIT = 1000
HEADER_MODE =
KV_MODE = none
LEARN_MODEL = true
LEARN_SOURCETYPE = true
LINE_BREAKER_LOOKBEHIND = 100
MATCH_LIMIT = 100000
MAX_DAYS_AGO = 2000
MAX_DAYS_HENCE = 2
MAX_DIFF_SECS_AGO = 3600
MAX_DIFF_SECS_HENCE = 604800
MAX_EVENTS = 256
MAX_TIMESTAMP_LOOKAHEAD = 128
MUST_BREAK_AFTER =
MUST_NOT_BREAK_AFTER =
MUST_NOT_BREAK_BEFORE =
NO_BINARY_CHECK = true
SEGMENTATION = indexing
SEGMENTATION-all = full
SEGMENTATION-inner = inner
SEGMENTATION-outer = outer
SEGMENTATION-raw = none
SEGMENTATION-standard = standard
SHOULD_LINEMERGE = false
TRANSFORMS =
TRANSFORMS-kv = kv_extraction
TRUNCATE = 10000
category = Testing
description = Test KV log sourcetype
detect_trailing_nulls = false
disabled = false
maxDist = 100
priority =
pulldown_type = true
sourcetype =

/opt/splunk/bin/splunk cmd btool transforms list
[kv_extraction]
CAN_OPTIMIZE = True
CLEAN_KEYS = True
DEFAULT_VALUE =
DEPTH_LIMIT = 1000
DEST_KEY =
FORMAT = $1::$2
KEEP_EMPTY_VALS = False
LOOKAHEAD = 4096
MATCH_LIMIT = 100000
MV_ADD = true
REGEX = \[([^:[]+):\s+([^\]]+)]
SOURCE_KEY = _raw
WRITE_META = False

Updated props.conf

[testsourcetype_log]
CHARSET=UTF-8
KV_MODE=none
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
category=Testing
description=Test KV log sourcetype
disabled=false
pulldown_type=true
TRANSFORMS-kv=kv_extraction

updated transforms.conf

[kv_extraction]
REGEX = \[([^:[]+):\s+([^\]]+)]
FORMAT = $1::$2
MV_ADD=true
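As a sanity check outside Splunk, the REGEX above can be run against the log snip with Python's `re` (same pattern; sample line from the post):

```python
import re

# Same pattern as the transform's REGEX
pattern = r'\[([^:[]+):\s+([^\]]+)]'
line = ('2019-03-01T09:42:01 [status] [Thread: 5=/blah/blah] [Method: GET] '
        '[URI: /blah/blah] [Protocol: HTTP/1.1] [IP: 1.2.3.4] [Port: 54809] '
        '[Referer: https://referrer] text string here')

# [status] has no colon, so it is skipped; every "[key: value]" pair matches
pairs = dict(re.findall(pattern, line))
print(pairs)
```

So the regex itself is fine; the pairs fall out cleanly in plain PCRE-style matching.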

UPDATE 6/27:

Using a clean Splunk Docker image, I:

  • recreated indexers, inputs, props, transforms on the docker instance (external volume)
  • stripped those files to a bare minimum
  • renamed the sourcetype (to be sure that Splunk is reading props/transforms)
  • moved the configs from being inside an app to system/local/*.conf
  • checked the knowledge object existence via the gui (new/renamed transform is listed)
  • checked the knowledge object permissions (global)
  • and restarted after each change

nada: the log is being ingested but no new fields are created (except from the value of Thread, which comes out as field 5 with value /blah/blah)

current config:

$ cat system/local/inputs.conf
[default]
host = 7278c011e1e0

[monitor:///opt/splunk/var/log/testlogs/*.log]
disabled=false
sourcetype=blahblah
index = testindex

$ cat system/local/props.conf
[blahblah]
CHARSET=UTF-8
KV_MODE=none
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
category=Testing
description=Test KV log sourcetype access
disabled=false
pulldown_type=true
TRANSFORMS-blahkv=blahkvextraction
#TRANSFORMS-replace_source = replacedefaultsource2

$ cat system/local/transforms.conf
[blahkvextraction]
FORMAT = $1::$2
MV_ADD = 1
#REGEX = \[([^:[]+):\s+([^\]]+)]
REGEX = \[([^:\]]+):\s+([^\]]+)\]

BTW @FrankVl, @VatsalJagani, @woodcock: thanks. I have used iterations of each of your suggestions and strongly believe that they work. I've run variations of the search below to prove it, and it does work (I get one instance of field1=Thread, field2=value):

index=testindex sourcetype=blahblah
| rex field=_raw "\[(?<field1>[^:\]]+):\s+(?<field2>[^\]]+)\]"
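(As far as I know, `rex` only returns the first match by default; `| rex max_match=0 ...` would return all pairs as multivalue fields.) The equivalent check in Python, which shows the same single-match behaviour (named-group syntax is `(?P<...>)` there):

```python
import re

pattern = r'\[(?P<field1>[^:\]]+):\s+(?P<field2>[^\]]+)\]'
line = '2019-03-01T09:42:01 [status] [Thread: 5=/blah/blah] [Method: GET]'

# First match only, mirroring rex's default behaviour
m = re.search(pattern, line)
print(m.group('field1'), m.group('field2'))

# All pairs, mirroring rex with max_match=0
all_pairs = re.findall(pattern, line)
print(all_pairs)
```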

Communicator

Firstly, thanks to @VatsalJagani, @woodcock for your answers and a special mention to @FrankVl for persisting.

There were 3 main problems:

  1. The regex was incorrect; all three contributors provided working solutions (I upvoted all three).
  2. Knowledge object sharing: the inputs/props/transforms are part of an app, and @FrankVl pointed out that permissions/sharing could be incorrect, which turned out to be the case. Object sharing was not set, so it defaulted to app-only instead of global.
  3. The field extraction should have been REPORT- but was incorrectly set to TRANSFORMS- somewhere between changes. I believe this was because the original app was pushed down to the server, overwriting my changes; I had to revert to the original app at one stage to restore the defaults and missed this setting. This was the main reason I built a new server: to avoid pushing the application multiple times while troubleshooting.
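For anyone landing here later, a minimal sketch of the working search-time config implied by the three points above (sourcetype and transform names taken from the 6/27 update; not copied verbatim from the thread):

```ini
# props.conf -- search-time extraction uses REPORT-, not TRANSFORMS-
[blahblah]
KV_MODE = none
REPORT-blahkv = blahkvextraction

# transforms.conf
[blahkvextraction]
REGEX = \[([^:\]]+):\s+([^\]]+)\]
FORMAT = $1::$2
MV_ADD = true
```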

View solution in original post


Builder

In all of this long explanation and request, I don't see any sample data, and the answers being provided don't seem to match the first example at the top. Can you share 2 or 3 actual events with dynamic key/value data (sanitized, of course)?

Communicator

In short, the answers/extractions work. The root cause is that my config changed somewhere during testing, so I was using TRANSFORMS- instead of REPORT- in props.conf.

A log snip (granted, not actual events) is there; search for 'log snip' in the original post.

I'm now working backwards to see whether any other issues contributed to this problem.

Builder

Ok, great. You should accept the answer that you worked off of for your solution.

Communicator

I will update once I retrace my steps.

Ultra Champion

On the 6/27 update: "nada, log is being ingested but no new fields created (except for thread)"

So the Thread field is extracted correctly, and just the other fields are missing? That's even stranger...

You are executing this in smart or verbose mode, right? What Splunk version are you on?

I'm running 7.3 here and just ingested a test file with the sample you shared. Set up the REPORT extraction with the regex I suggested and it works like a charm. All fields in [] are being extracted. So conceptually the solutions provided are fine. No clue why in your specific case it is not working...

Communicator

Sorry, I need to clarify what I meant about Thread; I'll correct the update.
Splunk extracts from the value of Thread, not Thread itself, because of the = in the value.

using the example

[Thread: 5=/blah/blah]

Splunk extracts

 5=/blah/blah
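A hypothetical illustration of that symptom: if the extracted text is at some later stage re-parsed as key=value pairs (an assumption about what's happening, not confirmed Splunk behaviour), the `=` inside the value wins:

```python
# Hypothetical: if "5=/blah/blah" is re-parsed as key=value text,
# the '=' splits it into field "5" with value "/blah/blah"
text = '5=/blah/blah'
field, _, value = text.partition('=')
print(field, value)
```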

Other questions:
- yes, verbose mode
- the Docker instance is 7.3.0

The difference at the moment is that in props.conf the transform is set with TRANSFORMS- and not REPORT-.

Ultra Champion

Oh, haha, we've all been overlooking that (probably because you had it as REPORT at the start) 😄

It must be REPORT-. TRANSFORMS- is for index-time extractions, and then this $1::$2 method doesn't work, I believe.

Communicator

...yes, sorry, I changed it a few times but let me try again quickly

Communicator

face palm

Communicator

I have no idea when/why it changed...

ok, I'm going to slowly migrate this back to the other server and see if all of this works

Communicator

thanks @FrankVl for your persistence

Esteemed Legend

Mine definitely works on your sample data set; see here:
https://regex101.com/r/s0ACKL/1

Ultra Champion

Commenting on the update:
Have you deployed the config on the search head(s)? Did you confirm from the GUI (Settings -> Fields -> ...) that the TRANSFORMS-kv extraction is indeed enabled and shared globally?

Communicator

I initially installed this on a standalone SH, and the btool output that I posted shows that the config is loaded.

Having said that, the props/transforms are part of an app, not system-wide, so that may be the problem.

I'm currently standing up a clean SH/indexer instance to retest and check the perms. I'll report back shortly.

Ultra Champion

Be aware: btool says nothing about runtime-loaded config; it just shows the merged version of all config files on disk. It also doesn't show you whether the config is visible in the app where you perform the search.

Communicator

I've been restarting after each props/transforms change

Esteemed Legend

Try this in transforms.conf:

[kv_extraction]
REGEX = \[([^:[]+):\s+([^\]]+)]
FORMAT = $1::$2
MV_ADD = true

Communicator

I believe you 😉 I'm currently standing up a clean SH/indexer instance to retest. I'll report back shortly.
