Is anyone collecting Audit and Activity events from the CCURE 9000 application? The logs are stored in a SQL database, so I assume using the DBConnect2 app is the way to go. I'd appreciate any advice on what to collect and how, as well as any information on the performance impact that collecting the data has on the application.
Thanks,
Ken
Our DB query (for the sake of comparison):
SELECT MessageUTC, MessageType, PrimaryObjectName, PrimaryObjectIdentity, SecondaryObjectName, SecondaryObjectIdentity, XmlMessage
FROM "SWHSystemJournal"."dbo"."JournalLog"
WHERE MessageUTC > ?
AND MessageType <> 'OperatorActivity'
AND MessageType <> 'SystemActivity'
AND MessageType <> 'ObjectChangedState'
AND MessageType <> 'LogMessage'
AND MessageType <> 'DeviceActivity'
ORDER BY MessageUTC ASC
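For context, the ? is DB Connect's rising-column checkpoint placeholder; on each run it is replaced with the stored checkpoint (the last MessageUTC value already ingested). The MessageType filter can also be written with NOT IN, which behaves the same way; this is just a tidier variant of the query above:
SELECT MessageUTC, MessageType, PrimaryObjectName, PrimaryObjectIdentity,
       SecondaryObjectName, SecondaryObjectIdentity, XmlMessage
FROM "SWHSystemJournal"."dbo"."JournalLog"
WHERE MessageUTC > ?   -- DB Connect substitutes the last checkpoint value here
  AND MessageType NOT IN ('OperatorActivity', 'SystemActivity',
                          'ObjectChangedState', 'LogMessage', 'DeviceActivity')
ORDER BY MessageUTC ASC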
Thank you! Finally an example of someone using a rising column and the associated SQL!! Much appreciated.
DB performance is only a consideration on initial setup, and then only if you have a lot of historical data.
That said, we brought in some 10,000-15,000 transactions from CCURE9000 and it only took a few seconds to ingest once we set up DB Connect. We have ours configured as a rising-column input against MessageUTC, set to run every 15 minutes.
We have not experienced any performance hits.
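For anyone wiring that rising-column input up by hand rather than through the UI, it ends up as a db_inputs.conf stanza roughly like the one below. This is a sketch only, assuming a DB Connect 3.x-style configuration; the connection, index, and sourcetype names are placeholders, and the exact key names can vary between DB Connect versions, so check the dbx docs for your release.
[ccure_journal]
# connection, index, and sourcetype are placeholders for your environment
connection = ccure_sql
index = physical_security
sourcetype = ccure:journal
# rising-column mode, checkpointing and timestamping on MessageUTC
mode = rising
tail_rising_column_name = MessageUTC
input_timestamp_column_name = MessageUTC
# run every 15 minutes
interval = 900
query = SELECT MessageUTC, MessageType, PrimaryObjectName, PrimaryObjectIdentity, SecondaryObjectName, SecondaryObjectIdentity, XmlMessage FROM "SWHSystemJournal"."dbo"."JournalLog" WHERE MessageUTC > ? ORDER BY MessageUTC ASC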
I'm having the same problem getting data from the CCure DB into Splunk. I saw a CCure add-on that interacts with the DB Connect app (https://splunkbase.splunk.com/app/4333/), but it still lacks any information on how to create a SQL statement that will work with the CCure DB out of the box.
The problem with DB Connect is the input methods. You can't use batch input because it creates duplicates after every SQL run. You're left with the rising input, which is also problematic because there is no column that is a good candidate for a rising column, since that column should be incremental. The closest thing is a calculation across a few columns to build a proper timestamp, which Splunk does not recommend anyway due to a number of issues per the DB Connect docs. On top of that, DB Connect's rising mode requires the rising column name in the WHERE clause, and since the timestamp has to be calculated and given an alias, MSSQL won't accept it there (aliases are not valid in a WHERE clause). Sigh...
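For completeness, one way around the alias-in-WHERE limitation is to do the timestamp math inside a derived table (or CTE) and filter on the computed column in the outer query, where the alias is visible. This is a rough sketch only; the DATEADD expression and the TimeOffsetSeconds/BaseDate columns are stand-ins, not actual CCure journal columns:
SELECT CalcUTC, MessageType, XmlMessage
FROM (
    -- build the timestamp once, give it an alias inside the derived table
    SELECT DATEADD(SECOND, j.TimeOffsetSeconds, j.BaseDate) AS CalcUTC,
           j.MessageType,
           j.XmlMessage
    FROM "SWHSystemJournal"."dbo"."JournalLog" AS j
) AS calc
WHERE CalcUTC > ?      -- alias is usable here because it comes from the inner query
ORDER BY CalcUTC ASC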
MessageUTC has worked for us... it's not IDEAL, but it works well.
We are getting them via DB Connect. We have the DBA create a table for us to read from (to make sure we are not going to negatively affect DB performance).
The logs come in pretty clean, but we need some props/transforms to adjust fields, etc. No biggie.
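For what it's worth, those adjustments are nothing exotic. Here is a sketch of the sort of props.conf entries involved; the sourcetype name and field mappings are placeholders for illustration, not what CCure actually hands you:
[ccure:journal]
# rename a DB column to the field name our searches expect (placeholder mapping)
FIELDALIAS-ccure_user = PrimaryObjectName AS user
# derive a normalized action field from the journal's MessageType column
EVAL-action = lower(MessageType)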