My question is similar to others around extracting new fields, but the answers I've tried to date haven't worked.
When I click Extract New Fields, the Select Sample Event screen ends up selecting roughly 20 actual log lines and reads them as a single sample event instead of about 20 separate events (one per log line).
The log message at the end of each line sometimes contains a very large single-line JSON string or a typical multi-line Java exception.
The log pattern is as follows:
time:stamp LOGTYPE [java-thread-id-1234][JavaClass:LineNumber] Log message goes here. Usually is a short message. Sometimes includes *very* large single-line JSON strings. Sometimes includes a multi-line Java exception.
A practical example would be as follows:
08:33:09,372 INFO [http-bio-8080-exec-4687][ServicesController:125] JSON returned={"succeeded":true,"data":{"example1":[],"example2":"","example3":null},"message":""}
Or:
09:47:13,215 INFO [http-bio-8080-exec-4678][ServicesController:125] Example log message goes here.
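For what it's worth, here is the props.conf I'm considering to force one event per timestamped line, so the multi-line exceptions stay attached to the event they belong to. This is only a sketch based on the event-breaking docs; the sourcetype name my_slf4j_app is made up, and I'm assuming every event starts with the HH:MM:SS,mmm timestamp shown above:

[my_slf4j_app]
# Break events explicitly instead of letting Splunk merge lines.
SHOULD_LINEMERGE = false
# Break before any line that starts with the HH:MM:SS,mmm timestamp.
LINE_BREAKER = ([\r\n]+)(?=\d{2}:\d{2}:\d{2},\d{3}\s)
# Timestamp sits at the start of the event, e.g. 08:33:09,372.
TIME_PREFIX = ^
TIME_FORMAT = %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 12
# Raised from the 10000-byte default so the huge JSON lines aren't cut off.
TRUNCATE = 100000

If I understand the docs correctly, SHOULD_LINEMERGE = false plus a timestamp-anchored LINE_BREAKER should keep the Java stack traces glued to their event, and raising TRUNCATE should keep the large JSON lines intact, but I'd appreciate confirmation.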
When I set up the forwarder, the Source Type was set to Automatic, not log4j. We're using slf4j for our logger. Does Splunk understand slf4j? I'm assuming it does, but if it doesn't, do I need to find an app that adds support for slf4j?
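My current guess is that Splunk only sees the rendered text, so slf4j vs. log4j shouldn't matter as long as the events are parsed correctly, but please correct me if that's wrong. If it helps, here is the inputs.conf sketch I'm planning on the forwarder to replace the Automatic setting (the monitor path is just a placeholder, and my_slf4j_app refers to the props.conf sketch above):

[monitor:///var/log/myapp/application.log]
# Placeholder path for illustration only.
sourcetype = my_slf4j_app
index = main
disabled = false

Is assigning an explicit sourcetype like this on the forwarder the right way to go?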
Bottom line: is it possible to extract these fields, including the large JSON strings and multi-line Java exceptions?
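To make the question concrete, this is the kind of search-time extraction I'm hoping for, sketched with rex against the sample lines above (the field names are mine, and I haven't been able to test it yet because of the event-breaking issue):

sourcetype=my_slf4j_app
| rex field=_raw "(?s)^(?<log_time>\d{2}:\d{2}:\d{2},\d{3})\s+(?<log_level>\w+)\s+\[(?<thread_id>[^\]]+)\]\[(?<java_class>[^:\]]+):(?<line_number>\d+)\]\s+(?<message>.*)"
| rex field=message "JSON returned=(?<json_payload>\{.*\})"
| spath input=json_payload

The first rex pulls out the timestamp, level, thread, class, and line number and leaves everything after them (including a multi-line exception, thanks to (?s)) in message; the second rex and spath are only meant to expand the JSON payloads when they appear. Does that look like a reasonable approach, or would these be better as EXTRACT statements in props.conf?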