Hi @Gururaj1  Just to check - apart from the errors you mentioned, does Splunk install correctly? Those errors don't necessarily mean there is an issue - it's likely that the part of the Debian preinst script which calls a "temp_splunk-preinstall" file is looking in those locations to do *something* - "find" is usually used to locate a file or folder and then act on it, for example updating permissions. If the find returns no files, the script simply continues, finally finishing with "complete" - at this point I'd expect your install to be complete. Since your splunk validate command returned a success, I think these "errors" are benign. Their existence is either a mistake in that script - OR - it could be for something we aren't necessarily aware of. Either way, if the files get installed then I'm confident this isn't an issue.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
Oh, thanks for the confirmation.
Hi @wm  Given that you are using the same connection details and the servers should be set up the same, it feels like the issue could be either network-related or authentication-related, as you have proven Splunk to be working. Try to establish a telnet connection between your Splunk host and the non-working database host on the relevant DB port. If this works, it demonstrates that the firewall is allowing the connection. Double-check the authentication credentials - are you using different passwords for the two databases, or are they using a domain account? If a domain account, does it have the relevant permissions on the non-working server? Are there any logs you can see on the database server around the connection?
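As a supplement to the telnet suggestion above, the same TCP reachability check can be scripted. This is a minimal sketch; the host name and port in the usage comment are placeholders, not values from this thread:

```python
import socket

def check_db_port(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout.

    This only proves network reachability (firewall/port/DNS); it says nothing
    about authentication, which has to be checked separately.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder host/port - substitute your DB server and its port,
# e.g. 1433 for MS SQL Server):
# check_db_port("dbserver.example.com", 1433)
```

If this returns False for the primary but True for the secondary, the problem is almost certainly on the network path rather than in Splunk itself.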
Hi @tech_g706  The last version listed as supporting Windows 2008 was 7.2.10 (see https://docs.splunk.com/Documentation/Splunk/7.2.10/Installation/Systemrequirements) - this lists it as deprecated, and it was removed in the next version. UF 7.2.10 actually went out of Splunk support in April 2021 (with P3 support ending in 2023). The following matrix of supported forwarder-to-indexer combinations might also be useful: https://docs.splunk.com/Documentation/VersionCompatibility/current/Matrix/Compatibilitybetweenforwardersandindexers Unfortunately, it looks like you might already be on the latest version known to work with Windows 2008. A version not being listed doesn't necessarily mean it won't work with 2008 - but it won't be supported, and there may be issues we aren't aware of.
I wouldn't expect any modern UF to be supported on that OS, or even to work at all. That Windows release is so far out of support that I'd be more concerned with the OS itself than with the UF version.
Hi @tech_g706 , there isn't any 8.x version certified as compatible with Windows 2008/R2 and supported by Splunk. The 9.4.x version probably runs on Windows 2008/R2, but it isn't certified or supported by Splunk. The only ones who can formally answer your question are Splunk Support. Ciao. Giuseppe
Although it looks like you should be able to use the "move" link in "Lookup definitions", you cannot. The best way to achieve this is to create a new KVstore in the desired location and then copy the KVstore data there:

| inputlookup my_old_kvstore
| outputlookup my_new_kvstore

I did create a Splunk Idea to resolve this issue. Please vote for it here.
After moving a KVstore to a new application, Splunk can no longer render the lookup. When moving a KVstore using the "move" link in "Lookup definitions", only the transforms.conf stanza is moved, not the collections.conf stanza. The on-prem solution is to manually move collections.conf, but this cannot be done in Splunk Cloud.
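For reference, a KV store lookup depends on two stanzas in two different files, and both must exist in the destination app for the lookup to render. The stanza and field names below are illustrative, not taken from any specific app:

```
# collections.conf - defines the KV store collection itself
[my_collection]

# transforms.conf - defines the lookup on top of that collection
[my_kvstore_lookup]
external_type = kvstore
collection = my_collection
fields_list = _key, host, status
```

Moving only the transforms.conf stanza is why the lookup breaks - the collections.conf stanza it depends on is left behind in the original app.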
Thanks. Upon checking the logs, it seems MongoDB is not running on the heavy forwarder, and that would be required.

ta_netskopeappforsplunk_netskope_alerts_v2.log:
splunklib.binding.HTTPError: HTTP 503 Service Unavailable -- KV Store initialization failed.

splunkd.log:
ERROR KVStoreConfigurationProvider [15442 KVStoreConfigurationThread] - Failed to start mongod on first attempt reason=KVStore service will not start because kvstore process terminated
ERROR KVStoreConfigurationProvider [15442 KVStoreConfigurationThread] - Could not start mongo instance. Initialization failed.
ERROR KVStoreBulletinBoardManager [15442 KVStoreConfigurationThread] - Failed to start KV Store process. See mongod.log and splunkd.log for details.
INFO  KVStoreConfigurationProvider [15442 KVStoreConfigurationThread] - Mongod service shutting down
Hi, We have some Windows 2008 R2 machines where UF 7.2 is running, and in Splunk Cloud CMC we are getting the forwarder status as critical. My question is: can we install UF 9.4 or above on the Windows 2008 R2 machines? If not, what is the minimum supported UF version for Windows 2008 R2, and is there a possible workaround to mitigate the critical status notification in Splunk Cloud CMC? Thank you!
You should not need to enable the inputs separately on each indexer. That's what the CM-pushed apps are for.
@wm  Did you check whether the primary server is reachable from the Splunk server: telnet <ServerName> <port>? Given that the secondary database works, the issue is likely network-related (firewall, port, DNS) or authentication-related on the primary server. Ensure that no firewalls, network security groups, or proxies are blocking or interrupting the connection between the Splunk server and the HPC primary database.
@kiran_panchavat @livehybrid Unfortunately, there seems to be no solution for this. I tried and followed the exact steps mentioned in the Splunk doc below, but I encounter the same error again and again. Do you know how to mitigate this scenario, or do you have any further suggestions on why it is happening? I tried a few different VM setups where different Linux platforms were installed, and I also increased the capacity of the OS to rule out issues on that side. The issue seems to reside in the unbundling of the downloaded packages. I understand the installation is being corrupted, but the package is being downloaded from the Splunk site, right? It shows all files intact, but I am unable to host the forwarder in an instance. I am stuck here with no specific solution. FYI - I tried different Linux OS/versions, tried downgrading the SF versions as well, and increased the capacity of the OS.
@PickleRick Regarding "two separate apps - one with general HEC input settings, another with a token definition for your particular input needs" pushed from the CM in an app which is deployed to indexers: I created two apps under etc/master-apps, and in one I enabled http while in the other I set the token and other parameters. So once we push this to the indexers, HEC will be enabled on the indexers, right? Or do we need to edit each indexer's etc/apps/http_input/local/inputs.conf and set [http] disabled = 0 in order to enable it? Or will the already-pushed app do that work?
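To illustrate the two-app layout being discussed (the app names, token value, index, and sourcetype are hypothetical placeholders), the inputs.conf files pushed from the CM could look like this:

```
# etc/master-apps/hec_base/default/inputs.conf - enables the HEC input itself
[http]
disabled = 0

# etc/master-apps/hec_tokens/default/inputs.conf - defines a specific token
[http://my_hec_token]
token = 12345678-1234-1234-1234-123456789012
index = my_index
sourcetype = my_sourcetype
disabled = 0
```

Once the CM pushes this bundle and the indexers reload it, HEC should be enabled without any manual edits to each indexer's local inputs.conf - settings in the pushed apps take effect on their own.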
Hi @livehybrid , Thanks for the reply. We've already updated the build version and deployed it to the marketplace. Trying to understand why it's causing issues during an update, while working perfectly on a fresh install.
Hi, The problem seems to be the self-signed certificate that was issued by Splunk from the cloud instance. It is not compatible with ver 9.4. I was wondering if it was just me experiencing the issue or if someone else is experiencing it too. For now we are sticking with ver 9.3.1 on our HF until a fix is released by Splunk.
I/O Error: DB server closed connection. Hi all, I am getting the error above when connecting to the HPC primary database; the secondary database connection works well in Splunk DB Connect with the exact same settings. From memory, the HPC primary database connection used to work too. I am using MS SQL Server with the jTDS driver and Windows Authentication:

jdbc:jtds:sqlserver://<ServerName>:<port number>/master;useCursors=true;domain=<domain_name>;useNTLMv2=true

FYI, this worked well for the HPC secondary database with the same configuration except <ServerName>.
@livehybrid @kiran_panchavat PFA, this is so weird; I have seen a few cases like this in the community with no solution. I tried multiple Linux OSes and older versions of the forwarder, but it's still the same. Please help find a solution, as I have tried a lot of versions here: splunkforwarder-9.4.1-e3bdab203ac8-linux-amd64-manifest, and the older 9.4.0 forwarder, splunkforwarder-9.4.0-6b4ebe426ca6-linux-amd64-manifest
As I said before in another thread - this is something that should be best discussed with your local Splunk Partner. I can tell you that you can just enable HEC input on the HF and send all the data there so that the HF distributes it to the indexers but there are other possible issues with this setup. Since you probably have just one HF, you're gonna be creating a SPOF. It might also not be able to process all the load you're gonna push on it. That's why at the start I said that this problem only looks easy. There are several approaches to solving it which might look right from the technical point of view, meaning that each of them "should work" but some of them might be better from the business point of view given specific circumstances unique to your situation and environment.
@PickleRick we already have HF configured in our environment. So how to do it?