I tried googling it without any luck, and I also searched the official mailing lists, but found no information.
Is it possible to install the Splunk Universal Forwarder on the OpenShift platform?
Maybe by creating our own custom cartridge?
As a POC, I have been able to get the Splunk forwarder working, but it requires a lot of tweaking (definitely not enterprise-ready).
Basically I followed this guide: https://docs.splunk.com/Documentation/Forwarder/7.2.6/Forwarder/Makeauniversalforwarderpartofahostim...
In order for this to work, here are the things I did:
Note: I am running this as a sidecar solution with the app in the same project. The app writes its logs to a persistent volume that is shared with the splunk-forwarder pod, and splunk-forwarder is configured to read from that persistent volume.
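For anyone trying to reproduce something similar, here is a minimal sketch of that layout using the official kubernetes Python client. The forwarder image, PVC name, mount path, and project name are placeholders I made up, and the forwarder image still needs its inputs.conf/outputs.conf baked in as described in the guide above:

```python
# Sketch only: create a splunk-forwarder pod that mounts the same PVC the app
# writes its logs to. All names below are placeholders for illustration.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster

# The persistent volume claim the application already writes its logs to.
logs_volume = client.V1Volume(
    name="app-logs",
    persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
        claim_name="app-logs-pvc"  # placeholder PVC name
    ),
)

forwarder = client.V1Container(
    name="splunk-forwarder",
    image="splunk/universalforwarder:latest",  # placeholder image/tag
    volume_mounts=[
        client.V1VolumeMount(name="app-logs", mount_path="/var/log/app", read_only=True)
    ],
)

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="splunk-forwarder"),
    spec=client.V1PodSpec(containers=[forwarder], volumes=[logs_volume]),
)

# Create the pod in the same project (namespace) as the application.
client.CoreV1Api().create_namespaced_pod(namespace="my-project", body=pod)
```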
Going forward, we're going to look at Fluentd for forwarding logs to Splunk.
We have developed an application for monitoring OpenShift clusters, including forwarding logs (containers, hosts, and OpenShift components) and monitoring stats (CPU, memory, I/O, etc.) at the level of processes, containers, pods, and hosts. You can get the application from Splunkbase at https://splunkbase.splunk.com/app/3836/ and find installation instructions at https://www.outcoldsolutions.com/docs/monitoring-openshift/
Hi MicTech,
OpenShift v3 is based on Kubernetes and includes the same default logging layer (Fluentd). Fluentd can send messages to Splunk via community-built plugins, which need to be configured to send to the Splunk API, the HTTP Event Collector, or a TCP receiver.
Documentation about logging in the latest version of OpenShift (v3), including configuring a specific output, can be found here:
https://docs.openshift.com/enterprise/3.1/install_config/aggregate_logging.html
Treasure Data also offers Fluentd Enterprise, which has supported Splunk plugins for the HTTP Event Collector and for TCP. More information can be found at https://fluentd.treasuredata.com, or email me at a @ treasuredata.com
Thanks,
Anurag
Given that OpenShift v3 now supports Docker - https://blog.openshift.com/openshift-v3-platform-combines-docker-kubernetes-atomic-and-more/ -
the blog post "Collecting docker logs and stats with Splunk" - http://blogs.splunk.com/2015/08/24/collecting-docker-logs-and-stats-with-splunk/ - may be a good starting point for you, as it covers how to install a Splunk forwarder in Docker.
There is also the recently released "Splunk logging driver" for Docker - https://docs.docker.com/engine/admin/logging/splunk/ - which uses the HTTP Event Collector and may be an alternative for you. A blog post introducing it can be found at http://blogs.splunk.com/2015/12/16/splunk-logging-driver-for-docker/
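For reference, here is a rough sketch of what the driver options look like when starting a container via the Docker SDK for Python (the same options map to --log-opt flags on docker run); the HEC URL, token, and image name below are placeholders:

```python
# Sketch: run a container whose stdout/stderr is shipped to Splunk HEC by
# the Docker daemon's splunk logging driver. Values are placeholders.
import docker
from docker.types import LogConfig

client = docker.from_env()

splunk_logging = LogConfig(
    type="splunk",
    config={
        "splunk-url": "https://splunk.example.com:8088",         # placeholder HEC endpoint
        "splunk-token": "00000000-0000-0000-0000-000000000000",  # placeholder HEC token
        "splunk-insecureskipverify": "true",                     # only for self-signed certs
    },
)

container = client.containers.run(
    "my-app:latest",  # placeholder image
    log_config=splunk_logging,
    detach=True,
)
```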
msivill, I have tried to configure the Splunk logging driver for Docker as you suggested. It worked like a charm, but it stops working as soon as OpenShift gets added to the mix. Reading the OpenShift source code ... it got complex and "hairy in a hurry."
If anyone has any real-world implementations of the Splunk app suggested by outcoldman, I would love to hear your thoughts. PM is ok.
Y.
OpenShift is a container platform, much like Docker. I would not recommend trying to install a forwarder in OpenShift; instead, use Splunk's HTTP Event Collector. Rather than trying to write to a log file, have your application send its logs directly to the HTTP Event Collector. You could also set up a queueing service like ZeroMQ or SQS that your application logs to directly, and have Splunk poll the queue. This is great for ephemeral services and servers.
Of course, this will require some dev time.
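To make that concrete, here is a minimal sketch of posting an event straight to HEC with Python and the requests library. The HEC URL and token are placeholders, and if you put a queue such as SQS in between, a small consumer would do this same POST on the application's behalf:

```python
# Sketch: send a single event to the Splunk HTTP Event Collector.
# The endpoint and token are placeholders, not real values.
import json
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder token

def send_to_hec(event, source="my-openshift-app", sourcetype="_json"):
    """Post one log event to HEC using token authentication."""
    payload = {"event": event, "source": source, "sourcetype": sourcetype}
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": "Splunk " + HEC_TOKEN},
        data=json.dumps(payload),
        timeout=5,
    )
    resp.raise_for_status()

send_to_hec({"level": "INFO", "msg": "application started"})
```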
Shouldn't it work by installing from the tarball instead of the RPM?
Copy the tarball to your OpenShift server, extract it, change the permissions, and run splunk...
Forgive me, I have zero experience with OpenShift. I do run Splunk on RHEL in AWS often, though.
I too am looking for a Splunk forwarder for OpenShift.
Is there any solution for this?
I also have been trying to do this and I would appreciate any insights.