Getting Data In

How do I send journald logs to my Splunk indexer?

ctjd81
New Member

I would like to separate these logs into units (ie - etcd.service, kube-apiserver.service, kube-controller-manager.service, etc)
I'd then like to send those different logs to Splunk.
Do I have to force these logs to a file first, then move them?
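For reference, journald can already filter by unit with `journalctl -u`, so one way to get separate per-unit streams into files (unit names and destination paths below are just examples matching the question) is:

```shell
# Sketch only: tail each unit's journal into its own text file.
# -o json keeps structured fields; --no-pager keeps output stream-friendly.
for unit in etcd kube-apiserver kube-controller-manager; do
    journalctl -u "${unit}.service" --no-pager --follow -o json \
        >> "/var/log/journald-export/${unit}.log" &
done
```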

0 Karma
1 Solution

renjith_nair
Legend

Since Splunk can't read journald's default binary format, you should write the logs out to a text file and then forward that file (I don't remember where, but I read it somewhere).

There is a blog post that covers this in detail for different flavors; it might be useful for you.

http://blogs.splunk.com/2015/04/30/integrating-splunk-with-docker-coreos-and-journald/

---
What goes around comes around. If it helps, hit it with Karma 🙂
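As a minimal sketch of the file-then-forward approach described above, assuming the units have been dumped to text files and a universal forwarder runs on the host, a standard monitor stanza would pick them up (the path, index, and sourcetype names here are illustrative, not from the original answer):

```
# inputs.conf on the forwarder -- index/sourcetype are example names
[monitor:///var/log/journald-export/*.log]
index = kube
sourcetype = journald_json
disabled = false
```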


ctjd81
New Member

This got me a long way to the answer, thanks!!!

0 Karma

alastor
Path Finder

This is a terrible answer. Splunk should put this into the TA for *NIX. Expecting each customer to figure out some crap method for this is BS.

tgurantz_splunk
Splunk Employee

Agreed... I'm going through these older journald posts for other reasons, but it looks like no one has updated the responses here to note that there are better ways now. Starting in Splunk 8.1, there is native journald input support (separate from any TA for *NIX):

https://docs.splunk.com/Documentation/Splunk/latest/Data/CollecteventsfromJournalD
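The docs linked above describe the native input as an inputs.conf stanza along these lines; this is only a sketch, and the exact setting names and values should be checked against the documentation for your Splunk version:

```
# inputs.conf -- native journald input (Splunk 8.1+)
# Stanza name and index are example values; the unit filter
# follows the journalctl field-match syntax shown in the docs.
[journald://kube]
index = kube
journalctl-filter = _SYSTEMD_UNIT=etcd.service
```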

0 Karma