Getting Data In

How do I get Splunk to recognize a custom log pattern when using Sleuth for traces?

volijaadu
New Member

Sample log file output

2018-01-29 17:46:35.341  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] io.undertow.servlet                       Initializing Spring FrameworkServlet 'dispatcherServlet'
2018-01-29 17:46:35.342  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] o.s.web.servlet.DispatcherServlet         FrameworkServlet 'dispatcherServlet': initialization started
2018-01-29 17:46:35.456  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] o.s.web.servlet.DispatcherServlet         FrameworkServlet 'dispatcherServlet': initialization completed in 114 ms
2018-01-29 17:46:35.523  INFO [hello-service,ca62f5d265c65e37,ca62f5d265c65e37,true] 9404 ---  [  XNIO-2 task-1] com.demo.services.web.Controller          Received request
2018-01-29 17:46:42.009  INFO [hello-service,f3f5b0389dcdd2e9,f3f5b0389dcdd2e9,false] 9404 ---  [  XNIO-2 task-2] com.demo.services.web.Controller          Received request
2018-01-29 17:46:46.534  INFO [hello-service,7f2fed9d81715a4e,7f2fed9d81715a4e,true] 9404 ---  [  XNIO-2 task-3] com.demo.services.web.Controller          Received request

LOG_PATTERN used in logback-spring.xml configuration file

    "%d{yyyy-MM-dd HH:mm:ss.SSS}
    ${LOG_LEVEL_PATTERN:-%5p} ${PID:- }
    [%15.15t] %-40.40logger{39}  %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"

I am using Spring Cloud Sleuth to generate the traceId in the logs using the above pattern in logback configuration.

When I import the log file into Splunk, I want Splunk to recognize the fields
timestamp, loglevel, [serviceId, traceId, segmentId, exportToZipkin], processId, and so on, corresponding to the LOG_PATTERN above.

I want Splunk to recognize these fields because I intend to do aggregate searches in Splunk based on the serviceId, traceId, and segmentId fields logged in the file.

Does Splunk automatically recognize this log pattern, or is there a way to teach Splunk this format so that it indexes new data accordingly and lets me search by the respective fields?


p_gurav
Champion

Hi volijaadu,

After indexing the data, you can use Splunk's field extractor or write a regex to extract the required fields.
See this page on using the Splunk field extractor:
https://docs.splunk.com/Documentation/Splunk/7.0.1/Knowledge/FXSelectSamplestep
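
As a rough example, a search-time extraction with the rex command could look like this (the source name and the extracted field names are only placeholders, adjust them to your data):

    source="hello-service.log"
    | rex field=_raw "^(?<log_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+(?<log_level>\w+)\s+\[(?<serviceId>[^,]*),(?<traceId>[^,]*),(?<segmentId>[^,]*),(?<exportToZipkin>[^\]]*)\]\s+(?<processId>\d+)"
    | stats count by serviceId, traceId

You can also save the same regex as an EXTRACT- entry in props.conf so it is applied automatically at search time.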


volijaadu
New Member

I do not want to do this at search time, as there will be a performance impact. I would like to do it at index time so that searches are faster.
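
What I have in mind is something along the lines of Splunk's index-time field extractions in transforms.conf, props.conf, and fields.conf, roughly like this sketch (the stanza and sourcetype names are only illustrative):

    # transforms.conf -- pull the Sleuth fields out of the raw event at index time
    [sleuth_fields]
    REGEX = \[([^,]*),([^,]*),([^,]*),([^\]]*)\]\s+(\d+)
    FORMAT = serviceId::$1 traceId::$2 segmentId::$3 exportToZipkin::$4 processId::$5
    WRITE_META = true

    # props.conf -- apply the transform to the sourcetype of this log file
    [hello-service-log]
    TRANSFORMS-sleuth = sleuth_fields

    # fields.conf -- mark the extracted fields as indexed (repeat for the other fields)
    [traceId]
    INDEXED = true

The field names above just mirror the ones from my log pattern.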
