
Logging to Job Inspector from Custom Search Command


Hi folks.
I have a custom search command, and I am using self.logger to log messages from the command. Please see my logging.conf below.

[loggers]
# root is mandatory.
keys = root, CustomSearchCommand

[handlers]
keys = ConsoleHandler

[formatters]
keys = SimpleFormatter

[logger_root]
level = INFO
handlers = ConsoleHandler

[logger_CustomSearchCommand]
level = INFO
handlers = ConsoleHandler
# qualname is mandatory.
qualname = CustomSearchCommand
# propagate is disabled in order not to log same events twice.
propagate = 0

[handler_ConsoleHandler]
class = StreamHandler
# sys.stdout causes weird errors.
args = (sys.stderr,)
level = INFO
formatter = SimpleFormatter

[formatter_SimpleFormatter]
format = %(asctime)s - %(process)s - %(name)s - %(levelname)s - %(message)s
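To sanity-check the file itself, the config loads cleanly with the stdlib's logging.config.fileConfig and emits each record exactly once (so propagate = 0 is doing its job). A minimal, self-contained sketch — the temp file and message are just for the demo:

```python
import io
import logging.config
import os
import sys
import tempfile

# The logging.conf from above, inlined so this snippet is self-contained.
CONF = """\
[loggers]
keys = root, CustomSearchCommand

[handlers]
keys = ConsoleHandler

[formatters]
keys = SimpleFormatter

[logger_root]
level = INFO
handlers = ConsoleHandler

[logger_CustomSearchCommand]
level = INFO
handlers = ConsoleHandler
qualname = CustomSearchCommand
propagate = 0

[handler_ConsoleHandler]
class = StreamHandler
args = (sys.stderr,)
level = INFO
formatter = SimpleFormatter

[formatter_SimpleFormatter]
format = %(asctime)s - %(process)s - %(name)s - %(levelname)s - %(message)s
"""

# Capture stderr so we can inspect what the handler emits.  fileConfig
# evaluates args = (sys.stderr,) at load time, so the redirected stream
# is the one the handler ends up holding.
buf = io.StringIO()
original_stderr = sys.stderr
sys.stderr = buf
try:
    with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as f:
        f.write(CONF)
        conf_path = f.name
    logging.config.fileConfig(conf_path, disable_existing_loggers=False)
    logging.getLogger("CustomSearchCommand").info("hello from the command")
finally:
    sys.stderr = original_stderr
    os.unlink(conf_path)

output = buf.getvalue()
print(output, end="")
```

The record appears exactly once on stderr, formatted as configured.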

First of all, I don't understand why Splunk doesn't allow us to use sys.stdout in ConsoleHandler; it just keeps failing with a very strange error message:

11-18-2019 16:49:38.169 ERROR ChunkedExternProcessor - Failed attempting to parse transport header: 2019-11-18 16:49:38,168 - 23427 - CustomSeachCommand - INFO - <ORIGINAL LOG MESSAGE>
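My current guess: with chunked = true, the command's stdout is reserved for the chunked protocol itself. Splunk expects every chunk on stdout to begin with a transport header, so any stray log bytes there make header parsing fail, which would explain the error above. The framing, as I understand it (my own sketch, not official SDK code), looks roughly like this:

```python
def frame_chunk(metadata: bytes, body: bytes) -> bytes:
    """Frame one chunk the way the chunked (v2) transport expects it:
    a header line 'chunked 1.0,<metadata length>,<body length>',
    followed by the metadata JSON and the payload."""
    header = b"chunked 1.0,%d,%d\n" % (len(metadata), len(body))
    return header + metadata + body

# A log record written to stdout would land in front of the next header,
# so ChunkedExternProcessor tries to parse the log line itself as a
# header and reports "Failed attempting to parse transport header".
frame = frame_chunk(b'{"action": "execute"}', b"payload")
```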

At the same time, using sys.stderr in ConsoleHandler works fine, though the output doesn't look good:

11-18-2019 16:40:46.346 ERROR ChunkedExternProcessor - stderr: 2019-11-18 16:40:46,345 - 22025 - CustomSearchCommand - INFO - <ORIGINAL LOG MESSAGE>

So why does Splunk always log messages from child processes as errors? Each final log record clearly consists of two parts: "11-18-2019 16:40:46.346 ERROR ChunkedExternProcessor - stderr:" (from the parent process, I guess) and "2019-11-18 16:40:46,345 - 22025 - CustomSearchCommand - INFO - <ORIGINAL LOG MESSAGE>" (from the child process executing my custom search command).

By the way, my command inherits from StreamingCommand and uses chunked = true.
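For completeness, the command is registered in commands.conf along these lines (the stanza name and script filename are placeholders for my actual ones):

```
[customsearchcommand]
filename = customsearchcommand.py
chunked = true
```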
