I'm currently trying to set up streaming of Kubernetes / Docker logs into Splunk.
As you might know, Docker stores its container logs in files with a JSON syntax:
{log: "this is one log line", stream: "stdout", time: "2017-10-30T21:30:19.379796735Z"}
{log: "this is another log line", stream: "stdout", time: "2017-10-30T21:30:19.45Z"}
I have set up the Splunk Universal Forwarder to ingest those log files and send them to the indexers, but the feedback I have received from the developers is that the result is completely unreadable, and I tend to agree with them.
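For reference, the forwarder-side input is roughly the following sketch (the sourcetype and index names are placeholders I chose; the path is the default location used by Docker's json-file log driver):

# inputs.conf on the Universal Forwarder (sourcetype and index are placeholders)
[monitor:///var/lib/docker/containers/*/*-json.log]
sourcetype = docker_json
index = kubernetes
disabled = false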
I've tried everything I can think of in search queries to reshape the event object and discard everything but the log field of the Docker JSON, without success.
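The closest I get at search time is something along these lines (docker_json and kubernetes being the placeholder names from above), but that only changes what the results table shows, not the underlying event:

index=kubernetes sourcetype=docker_json
| spath input=_raw
| table _time stream log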
Another problem I have is that mixed sources (from different software) are written to those logs, so different formats can end up in the log field: raw text, JSON (escaped by Docker), etc.
The first thing I'd like to do is extract the log field of the Docker JSON and send only that to Splunk.
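From what I have read, something like the following props.conf entry on the indexers (or on a heavy forwarder, since the Universal Forwarder does not parse events) might strip the wrapper at index time, but I'm not sure this is the right approach; the regex is only a rough sketch against the format shown above:

# props.conf on the indexers / heavy forwarder (sketch only)
[docker_json]
# keep the content of the log field, drop the surrounding JSON wrapper
SEDCMD-strip_docker_wrapper = s/^\{"log":"(.*)","stream".*$/\1/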
Then I'd like to apply the correct sourcetype to the log data, e.g. JSON, access combined or anything else.
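My rough idea, which may well be wrong, is a set of index-time transforms that look at the content inside the Docker wrapper and override the sourcetype per event; the regexes below are only guesses at what embedded JSON or access-combined lines would look like, and I'm not sure how this would interact with stripping the wrapper above:

# props.conf (sketch)
[docker_json]
TRANSFORMS-route_sourcetype = docker_log_is_json, docker_log_is_access_combined

# transforms.conf (sketch)
[docker_log_is_json]
# embedded JSON: the log field starts with an opening brace
REGEX = ^\{"log":"\{
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::_json

[docker_log_is_access_combined]
# embedded access log: the log field starts with a client IP address
REGEX = ^\{"log":"\d{1,3}(\.\d{1,3}){3}
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::access_combined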
Regards.