Getting Data In

How to use Splunk to detect a custom log pattern when using Sleuth for traces?

volijaadu
New Member

Sample log file output

2018-01-29 17:46:35.341  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] io.undertow.servlet                       Initializing Spring FrameworkServlet 'dispatcherServlet'
2018-01-29 17:46:35.342  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] o.s.web.servlet.DispatcherServlet         FrameworkServlet 'dispatcherServlet': initialization started
2018-01-29 17:46:35.456  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] o.s.web.servlet.DispatcherServlet         FrameworkServlet 'dispatcherServlet': initialization completed in 114 ms
2018-01-29 17:46:35.523  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] com.demo.services.web.Controller          Received request
2018-01-29 17:46:42.009  INFO [hello-service,f3f5b0389dcdd2e9,f3f5b0389dcdd2e9,false] 9404 ---  [  XNIO-2 task-2] com.demo.services.web.Controller          Received request
2018-01-29 17:46:46.534  INFO [hello-service,7f2fed9d81715a4e,7f2fed9d81715a4e,true] 9404 ---  [  XNIO-2 task-3] com.demo.services.web.Controller          Received request

LOG_PATTERN used in the logback-spring.xml configuration file:

    "%d{yyyy-MM-dd HH:mm:ss.SSS}
    ${LOG_LEVEL_PATTERN:-%5p} ${PID:- }
    [%15.15t] %-40.40logger{39}  %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"

I am using Spring Cloud Sleuth to add the traceId to the logs, with the above pattern in the logback configuration.

When I import the log file into Splunk, I want Splunk to recognize the fields timestamp, loglevel, [serviceId, traceId, segmentId, exportToZipkin], processId, and so on, corresponding to the pattern in the LOG_PATTERN.

I want Splunk to recognize these fields because I intend to do aggregate searches in Splunk based on the serviceId, traceId, and segmentId fields logged in the file.

Does Splunk automatically recognize this log pattern, or is there a way to teach Splunk this format so that it indexes the new data accordingly and provides options to search by the respective fields?
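
For example, the kind of aggregate search this should enable (sleuth_app below is only a placeholder sourcetype name, and the field names are the ones listed above):

    sourcetype=sleuth_app serviceId=hello-service
    | stats count AS requests dc(segmentId) AS spans BY traceId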


p_gurav
Champion

Hi volijaadu,

After indexing the data, you can use Splunk's automatic field extractor or write a regex to extract the required fields.
Use this link for the Splunk field extractor:
https://docs.splunk.com/Documentation/Splunk/7.0.1/Knowledge/FXSelectSamplestep
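
For illustration, such a regex could be wired up as a search-time extraction in props.conf. The sourcetype name sleuth_app and the exact regex below are assumptions sketched against the sample lines in the question, not tested configuration:

    # props.conf -- search-time field extraction for the assumed sourcetype
    [sleuth_app]
    EXTRACT-sleuth = ^(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+(?<loglevel>\w+)\s+\[(?<serviceId>[^,]*),(?<traceId>[^,]*),(?<segmentId>[^,]*),(?<exportToZipkin>[^\]]*)\]\s+(?<processId>\d+)

The thread, logger, and message portions can be captured the same way with additional named groups, or the whole regex can be built interactively with the field extractor linked above.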


volijaadu
New Member

I do not want to do it at search time because of the performance impact. I would like to do the extraction at index time so that searches are faster.
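
If index-time extraction is really what is needed, the usual route is a transform with WRITE_META plus matching fields.conf entries, roughly as sketched below. All sourcetype, stanza, and field names are placeholders based on the question, and note that Splunk's documentation generally favours search-time extraction, since indexed fields grow the index and cannot be changed without re-indexing:

    # props.conf -- assumed sourcetype for these application logs
    [sleuth_app]
    TRANSFORMS-sleuth = sleuth_index_fields

    # transforms.conf -- pull the bracketed Sleuth block and the PID out of _raw
    [sleuth_index_fields]
    REGEX = \[([^,]*),([^,]*),([^,]*),([^\]]*)\]\s+(\d+)\s+---
    FORMAT = serviceId::$1 traceId::$2 segmentId::$3 exportToZipkin::$4 processId::$5
    WRITE_META = true

    # fields.conf -- tell search time that these are indexed fields
    [serviceId]
    INDEXED = true
    [traceId]
    INDEXED = true
    [segmentId]
    INDEXED = true
    [exportToZipkin]
    INDEXED = true
    [processId]
    INDEXED = true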
