I have the following logback configuration, and I am using it in a simple Java application that does nothing but log one line. When I uncomment the Splunk appender line, the application won't exit even though it has finished. Is there a way to terminate all the logging threads so that the main application can exit?
logback.xml
<appender name="SPLUNK" class="com.splunk.logging.HttpEventCollectorLogbackAppender">
<url>${splunkUrl}</url>
<token>${splunkToken}</token>
<source>${projectName}</source>
<host>${COMPUTERNAME}</host>
<sourcetype>batch_application_log:json</sourcetype>
<disableCertificateValidation>true</disableCertificateValidation>
<!--<messageFormat>json</messageFormat>-->
<!--<retries_on_error>1</retries_on_error>-->
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>"%msg"</pattern>
</layout>
</appender>
<root level="INFO">
<!--<appender-ref ref="SPLUNK"/>--> <!-- if I uncomment this line the application never exits -->
</root>
Here is the Java code:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Main {
    public static void main(String[] args) {
        final Logger logger = LoggerFactory.getLogger(Main.class);
        logger.info("******");
    }
}
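One workaround (not from this thread, so treat it as a sketch) is to stop logback's `LoggerContext` explicitly before `main()` returns, which shuts down the appenders' worker threads. The snippet below simulates the hang without the Splunk dependency: a non-daemon worker thread, standing in for the HEC appender's background sender (a hypothetical analogy), keeps the JVM alive until it is stopped explicitly.

```java
// Minimal, dependency-free sketch of why the JVM hangs: a library that
// starts a non-daemon worker thread keeps the process alive after main()
// returns. Stopping the worker explicitly is what LoggerContext.stop()
// does for logback appenders.
public class ShutdownDemo {

    // Starts a non-daemon worker (analogous to the appender's sender
    // thread), then stops it explicitly; returns true once it has died.
    public static boolean runAndStop() {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Thread.sleep(100); // simulates a background send loop
                }
            } catch (InterruptedException e) {
                // interrupted by the shutdown call below: let the thread die
            }
        });
        worker.setDaemon(false); // a non-daemon thread blocks JVM exit
        worker.start();

        worker.interrupt(); // explicit shutdown, like LoggerContext.stop()
        try {
            worker.join(5000);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return !worker.isAlive();
    }

    public static void main(String[] args) {
        System.out.println("worker stopped: " + runAndStop()); // prints "worker stopped: true"
    }
}
```

Without the `interrupt()`/`join()` step (or, in the real application, without stopping the logback context), the JVM would stay alive indefinitely after `main()` returns.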
I also tried the same thing and am facing the same issue. Initially I thought the application was not terminating because of an exception, but the same thing happens without any exception in a Hello World application. Is there any configuration to prevent this?
Hi vijay_iiita
This question was posted almost 5 months ago. If the comments and the answer weren't able to help you with your question, please post a new question so you can get maximum exposure and help.
Thanks
Hi asiddique_splunk,
The only answer suggested here is to add custom middleware that checks for completion of all events. That doesn't seem to be a standard solution; it's more of a workaround. Instead, I would expect the exiting behavior to be part of the appender configuration if blocking exit is intentional. Non-exiting behavior may be acceptable for a web application, but it is not suitable for applications that run on demand, such as an AWS Lambda or a Spark job. I am trying to upload logs using the Splunk HEC appender for a Spark job.
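For a job-style application, a sketch of the explicit-shutdown approach (assuming `logback-classic` is on the classpath; `JobMain` is a hypothetical name, while `LoggerContext` and its `stop()` method are logback's actual API for halting all configured appenders, including any background sender threads they started):

```java
import ch.qos.logback.classic.LoggerContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class JobMain {
    public static void main(String[] args) {
        Logger logger = LoggerFactory.getLogger(JobMain.class);
        logger.info("job finished");

        // Stop the logback context so appender worker threads shut down
        // and the JVM can exit once main() returns.
        LoggerContext ctx = (LoggerContext) LoggerFactory.getILoggerFactory();
        ctx.stop();
    }
}
```

The cast is safe only when logback is the bound SLF4J backend; whether `stop()` also flushes events the HEC appender has batched but not yet sent depends on the appender version, so flush behavior should be verified before relying on this in production.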
Thanks
Here is the answer to your question.
I can't see any answer, maybe you forgot to add a link or something.
Any resolution on this?
I haven't found any solution yet. Had to move on with the Universal Forwarders for now.