I have installed the Universal Forwarder on my Solaris Global server, but no data is getting to my Indexer. Looking at the Splunk log file, I can see that the Forwarder is trying to read the current day's log file (the not_terminated one), but it can't because the audit log is not a text file. Is there a way around this?
Splunk will read any plaintext file with ease; I do this fairly often on my Solaris systems. Have you checked that all permissions are correct and that traffic is allowed between the instances?
Thanks for the quick response. I'm very new to Splunk. I am forwarding to an Indexer run by a third party, so it's difficult to tell what is and isn't working. The Forwarder can access the Solaris audit log file, so based on the Splunk log I figured the problem is that the Solaris audit log is not text. I used the praudit utility to write the current audit log out as a text file in the same directory as the audit log, but nothing showed up on the Indexer (RHEL and Windows hosts forward with no problems). Ideas?
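For reference, the conversion I ran was along these lines (file names are just examples; the active trail file on your system ends in not_terminated and has a timestamped name):

```shell
# Replay a binary Solaris audit trail as one text line per record.
# The input file name below is an example, not the real one.
praudit -l /var/audit/20240101120000.not_terminated.myhost > /var/audit/audit.txt
```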
Hi,

Here are instructions for converting the audit log (at least partially) to syslog; you can then collect those events via syslog, or read the syslog output file directly as text:
https://docs.oracle.com/cd/E23823_01/html/816-4557/audittask-44.html
Since you are already exporting those logs to text files, I propose exporting them to a separate directory (rather than the real audit directory) and giving the UF user enough access to read them. Then you must define a new monitor in inputs.conf so Splunk reads those files.
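A minimal monitor stanza for that setup might look like this (the path, index, and sourcetype are placeholders; use whatever your environment and the third-party indexer expect):

```
[monitor:///var/audit/text/*.log]
sourcetype = solaris:audit
index = main
disabled = false
```

Restart the UF after editing inputs.conf, and check splunkd.log on the forwarder to confirm the monitor has picked up the files.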
r. Ismo