I am interested in using Splunk's app for monitoring AS/400 logs. Does anyone have examples of how they collect log data on the AS/400 and forward it to Splunk? Thanks.
Here's a link to the app:
http://splunk-base.splunk.com/apps/24097/splunk-for-as400-iseries
Thanks for the help.
Is any filtering possible in the control scripts running on the AS/400 server?
Production is currently outputting close to 24 GB a day!
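Whether or not the collection scripts can filter at the source, one way to cut indexed volume is to route unwanted events to Splunk's nullQueue before indexing. Here is a minimal sketch; the sourcetype name and the regex matching the audit entry types you want to drop are assumptions to adapt to your data:

# props.conf (on the indexer or heavy forwarder); 'as400_audit' is an assumed sourcetype name
[as400_audit]
TRANSFORMS-drop_noise = as400_null_queue

# transforms.conf; the ZR/ZC pattern is only an example -- adjust it to where
# the journal entry type appears in your export format
[as400_null_queue]
REGEX = ^(ZR|ZC)
DEST_KEY = queue
FORMAT = nullQueue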
It says the file is binary?
I asked the AS/400 admin to set OUTFILFMT to *TYPE5. Below is the error in Splunk:
04-08-2015 15:22:40.029 +0530 WARN FileClassifierManager - The file 'D:\AS400\sample logs\auditdta.txt' is invalid. Reason: binary
04-08-2015 15:22:40.029 +0530 INFO TailingProcessor - Ignoring file 'D:\AS400\sample logs\auditdta.txt' due to: binary
Got it! Now the app is populating many reports and the data is being ingested.
If Splunk identifies a non-ASCII character in any event, it will flag the file as binary and log an event in splunkd.log as follows:
10-22-2012 17:53:21.734 +0000 INFO TailingProcessor - Ignoring file '/usr/local/rex/azkaban/logs/azkaban.log' due to: binary
To identify non-ASCII characters, you can use the following Linux command line. The non-ASCII characters will be highlighted.
grep --color='auto' -P -n -r "[\x80-\xff]" azkaban.log
Solution:
Use the "Binary file configuration" in props.conf as presented in the previous answer.
NO_BINARY_CHECK = [true|false]
* When set to true, Splunk processes binary files.
* Can only be used on the basis of [&lt;sourcetype&gt;], or [source::&lt;source&gt;], not [host::&lt;host&gt;].
* Defaults to false (binary files are ignored).
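As a concrete example, a minimal stanza might look like the following; the sourcetype name is an assumption, and a [source::...] stanza pointed at the monitored file would work just as well:

# props.conf; 'as400_audit' is an assumed sourcetype name
[as400_audit]
NO_BINARY_CHECK = true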
It should be fairly straightforward to make the QAUDJRN data work with the app, without having to rework any of the searches. All the searches in the app make the assumption that the data was collected in the 'iseries' index. All the fields in use by the app are reported on the intro page when you enter the app. If your fields are named differently, just use FIELDALIAS or rename them to match those used by the app. In this way, it should be an almost drop-in solution. Note that the app was built around QAUDJRN data, so QHST and QSYSOPR won't appear in the canned reports. Having the data come in via syslog will make it very easy to parse with Splunk.
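For instance, if your syslog feed names the user field differently from what the app expects, a search-time alias in props.conf could map it over; the sourcetype and both field names below are assumptions, so check the app's intro page for the actual field list:

# props.conf on the search head; names below are examples only
[as400_syslog]
FIELDALIAS-as400_user = src_user AS user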
Thanks Ron, I'll take a whack at configuring it in our test environment prior to indexing on production.
Elliot, we are using a product to stream our QAUDJRN, QHST, and QSYSOPR messages off the iSeries to our central log collector via syslog.
Ron, here is a question for you. Our plan is to import the data into Splunk. Do you think there will be any issues adapting it to this app?
Thanks
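For what it's worth, if the syslog feed is received directly by a Splunk instance, a minimal network input pinned to the 'iseries' index should keep the app's searches working; the port and sourcetype below are assumptions to adjust for your environment:

# inputs.conf on the receiving forwarder or indexer; port and sourcetype are assumptions
[udp://514]
index = iseries
sourcetype = as400_syslog
connection_host = ip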
CL has just been added to the app. The scripts should help you with FTP automation, as well as QAUDJRN exports.
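If the exports are pulled over by the FTP scripts rather than arriving via syslog, the Splunk-side counterpart would be a monitor input on the landing directory; the path and sourcetype below are assumptions:

# inputs.conf; path and sourcetype are examples only
[monitor://D:\AS400\exports\*.txt]
index = iseries
sourcetype = as400_audit
disabled = false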