I have read Splunk MySql docs, but I have a question:
Do we have to install a universal forwarder on the MySQL machine to get the MySQL general and error logs, or is it enough to install the add-on on the indexer and enable DB Connect inputs?
You do not have to install anything on your DB server; instead, follow the approach below:
Step 1. Install the DB Connect app on a heavy forwarder
Step 2. Create Identities - https://docs.splunk.com/Documentation/DBX/3.2.0/DeployDBX/Createandmanageidentities
Step 3. Create Connections - https://docs.splunk.com/Documentation/DBX/3.2.0/DeployDBX/Createandmanagedatabaseconnections
Step 4. Create inputs - https://docs.splunk.com/Documentation/DBX/3.2.0/DeployDBX/Createandmanagedatabaseinputs
NOTE - You can configure inputs to send your DB events to any index/sourcetype you choose.
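If you prefer to manage the input in configuration files rather than the UI, a DB Connect input ultimately lives as a stanza in db_inputs.conf on the heavy forwarder. A minimal sketch follows - the connection name, query, index, and sourcetype are illustrative assumptions, so check the db_inputs.conf spec file shipped with your DB Connect version for the exact parameter names it supports:

```ini
# db_inputs.conf on the heavy forwarder (illustrative values only)
[mysql_processlist_input]
connection = my_mysql_connection   # connection name created in Step 3
query = SELECT * FROM information_schema.PROCESSLIST
mode = batch                       # batch = re-run the full query each interval
interval = 300                     # seconds between query executions
index = mysql
sourcetype = mysql:processlist
```

Creating the input once through the UI and then inspecting the generated db_inputs.conf is a safe way to confirm the stanza format for your version.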
From my research it appears you do need to install a heavy forwarder or single-instance Splunk Enterprise directly on the MySQL machine to get the general and error logs (and the other local server logs). This is an omission in the Splunk Add-on for MySQL documentation which I will correct now. 🙂
Because the configuration is best performed from the Splunk Web UI (add-on setup page) and the add-on uses Python scripts, universal and light forwarders are not supported.
Thanks Hjauch, I agree with you.
So I need to install a heavy forwarder and install the MySQL add-on on it.
Yes, I believe that is correct. I am checking with the developer and will update this information if it is not correct.
So the architecture should be as follows:
- heavy forwarder on the MySQL server with the Splunk MySQL add-on installed on it
- DB Connect on the indexer (right?)
My understanding is you need to have DB Connect and MySQL add-on installed on search heads and heavy forwarders only. Not indexers.
From the documentation (http://docs.splunk.com/Documentation/AddOns/latest/MySQL/Hardwareandsoftwarerequirements#DB_Connect_...):
You must have Splunk DB Connect v2 installed to your heavy forwarders to collect MySQL performance and configuration logs with this add-on. The Splunk Add-on for MySQL works with DB Connect version 2.0 and above. See "Deploy and Use Splunk DB Connect" in the Splunk DB Connect manual for information.
I want to state up front that I do not have experience in the field, but I confirmed with the developer of the app that MySQL add-on needs to be installed on heavy forwarders and the heavy forwarder with the add-on must be installed on the MySQL box if you are collecting local logs.
My deployment is simple, so I don't have a search head.
I will install the MySQL add-on on a heavy forwarder, but what about DB Connect installed on my indexer?
Please see the "Deploy and Use Splunk DB Connect" section of doc for single-instance and distributed deployments: http://docs.splunk.com/Documentation/DBX/2.1.3/DeployDBX/Singleserverdeployment
Installing DB Connect on an indexer is not recommended, but it can work for small customers. For instance, if you've got one indexer and don't plan to add more hardware, it would work. The catch is that installing it on an indexer means that data gathering competes with indexing and search for resources.
Another option could be to install a universal forwarder on the database server to get the files. However, you would have to configure the DB Connect inputs manually in the .conf files; you would not be able to use the UI from a UF.
You don't have to install anything locally if you don't want to.
All you need to do is deploy a full Splunk Enterprise instance (a universal forwarder won't work), install DB Connect, and configure it.
The Splunk Enterprise instance could perform one of the following roles in your deployment: Heavy Forwarder, Search Head, Indexer.
It all depends on your architecture and your needs.
Hope that helps
EDIT (as I missed part of your question):
The DB Connect add-on will give you access to whatever information you've got in your SQL tables. If that includes the logs you are talking about, then great.
If not, you could install a heavy forwarder locally and perform both tasks from there: monitor the local files and use DB Connect to retrieve SQL data. The same applies to a remote one, but you will need to share those log directories, if any.
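For the file-monitoring half of that setup, the heavy forwarder on the MySQL host just needs ordinary monitor stanzas in inputs.conf. A minimal sketch - the log paths and sourcetype names below are assumptions, so adjust them to where your MySQL instance actually writes its logs and to the sourcetypes defined by the Splunk Add-on for MySQL:

```ini
# inputs.conf on the heavy forwarder running on the MySQL host
# (paths and sourcetypes are illustrative -- adjust to your environment)
[monitor:///var/log/mysql/error.log]
sourcetype = mysql:errorLog
index = mysql

[monitor:///var/log/mysql/general.log]
sourcetype = mysql:generalQueryLog
index = mysql
```

Note that the general query log must be enabled in MySQL itself (it is off by default) before there is anything to monitor.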