I’m testing the Splunk App for Nextcloud.
I installed a Splunk Enterprise server and a Splunk universal forwarder (my Nextcloud instance and the Splunk server are on different hosts).
It looks like it's working, and I do collect data from my Nextcloud instance; however, not all categories of data are coming in.
Shortlist of what IS collected:
Shortlist of what is NOT being retrieved (and not displayed in the Splunk web pages), mainly some usage data:
I would welcome ideas about what is left to configure, or what I’m doing wrong with the setup.
Thanks in advance!
Jean-Claude
Hi! Which data did you insert in NCSERVER in the file TA-nextcloud.config?
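For reference, a rough sketch of what that setting might look like; the exact format of TA-nextcloud.config may differ in your version, and the hostname below is only a placeholder:

```shell
# TA-nextcloud.config (sketch - exact format may differ in your version)
# NCSERVER should point at the Nextcloud instance the scripts will query:
NCSERVER="nextcloud.example.com"   # placeholder hostname - use your own
```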
The missing data corresponds to the data pulled in by the Nextcloud Add-on (https://splunkbase.splunk.com/app/3397/). The Add-on contains two scripts that pull data from some Nextcloud REST API endpoints. The install guide (linked from the Splunkbase entry for the Nextcloud App) explains how to configure the Add-on so the scripts can pull the information from your Nextcloud server.
Once the Add-on is installed (or if it is already installed), you may also wish to consult the "Other > App debugging" dashboard, which contains information useful for understanding why the scripts may not be working correctly.
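As a rough sketch of the kind of calls the two scripts make (the hostname and credentials below are placeholders, and the exact URLs may differ between Nextcloud versions; the serverinfo endpoint requires an admin account):

```shell
#!/bin/sh
# Sketch of the endpoints the Add-on scripts query (placeholders - adjust to your setup).
NC_HOST="nextcloud.example.com"   # placeholder: your Nextcloud host
NC_ADMIN="admin"                  # placeholder: an admin account

# nextcloud-status.sh roughly fetches the public status endpoint (no auth needed):
STATUS_URL="https://$NC_HOST/status.php"

# nextcloud-info.sh roughly fetches the serverinfo/monitoring endpoint, which
# reports user, storage and share counts (admin credentials required):
INFO_URL="https://$NC_HOST/ocs/v2.php/apps/serverinfo/api/v1/info"

echo "nextcloud-status.sh roughly runs: curl -s $STATUS_URL"
echo "nextcloud-info.sh roughly runs: curl -s -u $NC_ADMIN:PASSWORD $INFO_URL"
```

Running either curl command by hand on the forwarder host is a quick way to check that the scripts' prerequisites (curl itself, network access, credentials) are in place.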
Again, thank you for your reply.
Of course, curl was not installed on the Splunk server (looks like I can't read 🙂).
Looks like things are getting better. I do get some data that was previously missing.
However, I still don't understand why I get 0 active users, and no results in defined users. Mind you, I'm not sure I understand what an "active" or a "defined" user is...
As for the shares and storage results, the charts are there but they show no numbers or curves at all in either category, i.e. shares, number of files, free disk space, storages. Granted, my test instance of NC is very small, but I still have a few users, several dozen files, and several shares. Is there something left to configure for that matter?
Anyway, thanks again for your time. Much appreciated.
Assuming this is a "standard" NC installation, the missing dashboards should "light up" after a certain amount of data has been retrieved. The default time period for the dashboards is 7 days; you could try choosing a shorter time period and see if the dashboards then populate. I know that some data doesn't populate if, for example, single sign-on is configured, but if that is not the case and you still don't see the dashboards light up, do get in touch.
"Defined users" is the number of users you have created on the NC server (assuming you are using the built-in NC authentication methods), and "active users" is (I assume) the number of users that have been active on the NC server during a given time period. You can see the same data (but only in real time, with no historical data) if you log in as the NC admin on the NC server, then click the icon representing that user in the upper-right corner. Choose "Settings"; a menu will appear on the left side, then click "System".
Thanks a lot for your reply, much appreciated.
Well, I double-checked the installation and setup steps, strictly following the install guide (I might have misconfigured something, though).
The add-on is installed, and both scripts are there.
As for the app debugging dashboard:
- Last status message of nextcloud-status script
- Last status message of nextcloud-info script
- Any curl error codes?
-> Search did not return any events
Log messages in the app debugging dashboard:
- Several occurrences of "The maximum number of concurrent historical searches on this instance has been reached."
- A few "ERROR ExecProcessor - message from "/opt/splunk/etc/apps/TA-nextcloud/bin/nextcloud-info.sh" /opt/splunk/etc/apps/TA-nextcloud/bin/nextcloud-info.sh: line 9: curl: command not found" (and the same for nextcloud-status.sh).
Is this info of some help?
TIA,
jean-claude
Based on the error message, my guess is that either the "curl" command isn't installed on the host where the scripts run, or the scripts can't find the curl command. So check for curl first. If it does exist, find its location (the "whereis" command is useful for that) and modify the scripts so the full path precedes the curl command.
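A minimal sketch of that check (the package-install hints in the message are assumptions about your distribution; adjust to your package manager):

```shell
#!/bin/sh
# Check whether curl is reachable from this shell, and report its full path.
CURL_BIN=$(command -v curl)

if [ -n "$CURL_BIN" ]; then
    echo "curl found at: $CURL_BIN"
    # In the Add-on scripts you could then replace the bare 'curl' with this
    # full path (e.g. /usr/bin/curl) so PATH issues can't break the scripts.
else
    echo "curl not found - install it first (e.g. 'apt-get install curl' or 'yum install curl')"
fi
```

Run this as the same user the Splunk forwarder runs under, since that user's PATH is what the scripts see.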