All Posts


Hi @Naoki  If you are using a standalone Search Head then yes, you can upload via the UI, ticking the upgrade checkbox - however you might hit issues because the size of the tarball exceeds the configured file upload limit. To overcome this, see the following snippet from https://splunk.my.site.com/customer/s/article/Python-for-Scientific-Computing

1) To upgrade via the UI, change max_upload_size in $SPLUNK_HOME/etc/system/local/web.conf:

[settings]
max_upload_size = 2048

2) For a clustered environment, increase the bundle size to the same setting as max_content_length on the SHs, as follows:

1. Navigate to /opt/splunk/etc/system/local/distsearch.conf on the server.
2. Append/update the following parameters:

[replicationSettings]
maxBundleSize = 2048 (or 3072)

3. Save the file and restart all SHs.

Alternatively, you can copy the app to the Splunk instance's app folder ($SPLUNK_HOME/etc/apps), overwriting the previous content. If in any doubt, please make a backup copy of the PSC app on your Splunk instance first.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
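For the manual-copy route mentioned above, a minimal sketch might look like the following. This is illustrative only: it assumes a Linux host, a splunk service user, and the app directory name Splunk_SA_Scientific_Python_linux_x86_64 - check the actual folder and tarball names from your Splunkbase download before running anything.

# Stop Splunk, back up the current app, extract the new package, restart
$SPLUNK_HOME/bin/splunk stop
cp -rp $SPLUNK_HOME/etc/apps/Splunk_SA_Scientific_Python_linux_x86_64 /tmp/psc_backup
tar -xzf python-for-scientific-computing-for-linux-64-bit_423.tgz -C $SPLUNK_HOME/etc/apps/
chown -R splunk:splunk $SPLUNK_HOME/etc/apps/Splunk_SA_Scientific_Python_linux_x86_64
$SPLUNK_HOME/bin/splunk start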
Hi @RAVISHANKAR  Whilst you are right that an 8.0.x UF can forward events/metrics to 9.4.x, it is important to note that 8.0.x UFs are no longer supported by Splunk. So technically, yes, it will work - but from a support standpoint you need to upgrade UFs to 9.1.x to still be supported by Splunk, although that is only until 28th June (17 days!) so I would recommend a minimum of 9.2.x. For more info on supported Splunk versions check out https://www.splunk.com/en_us/legal/splunk-software-support-policy.html?locale=en_us

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
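If it helps to see which UF versions are currently connecting, a search along these lines can inventory them from the indexers' internal logs. Treat it as a sketch: it assumes the standard metrics.log tcpin_connections events are available in _internal, where hostname, sourceIp and version are populated for each connecting forwarder.

index=_internal source=*metrics.log* group=tcpin_connections
| stats latest(version) AS forwarder_version, latest(sourceIp) AS source_ip BY hostname
| sort forwarder_version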
Hi @cdevoe57  If you want to use the lookup as a source of truth for the list of hosts, I would use the following. Just a note that I'm suggesting tstats here, which is *much* more performant than a regular index= search.

| tstats latest(_time) as _time WHERE index=servers sourcetype=logs by host
| eval last_seen_ago_in_seconds = now() - _time
| eval System_Name = host
| append [| inputlookup system_info.csv | eval last_seen_ago_in_seconds=9999]
| stats min(last_seen_ago_in_seconds) as last_seen_ago_in_seconds, values(Location) AS Location, values(Responsible) AS Responsible by System_Name
| eval MISSING = if(isnull(last_seen_ago_in_seconds) OR last_seen_ago_in_seconds>7200, "MISSING", "GOOD")
| where MISSING=="MISSING"
| sort -last_seen_ago_in_seconds

This works by appending the system_info.csv rows with a large last_seen_ago_in_seconds, which is replaced by a lower last_seen_ago_in_seconds value if the host has been found.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
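One gotcha with this pattern: the System_Name values in the lookup have to match the host values from tstats exactly, or every host will look missing. If casing might differ between the two, a hedged tweak like this normalizes both sides before the stats (assuming case is the only mismatch):

| tstats latest(_time) as _time WHERE index=servers sourcetype=logs by host
| eval System_Name = lower(host), last_seen_ago_in_seconds = now() - _time
| append [| inputlookup system_info.csv | eval System_Name=lower(System_Name), last_seen_ago_in_seconds=9999]
| stats min(last_seen_ago_in_seconds) as last_seen_ago_in_seconds by System_Name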
Hi Team, we plan to upgrade Splunk Enterprise from version 9.2.1 to the latest 9.4.2. Currently my Splunk UF version is 8.0.5. Will 8.0.5 be supported, or do I need to upgrade the UF version too? Compatibility between forwarders and Splunk Enterprise indexers - Splunk Documentation says UF 8.0.x is compatible with 9.4.x for (E,M) events and metrics. I need further clarification on whether I should upgrade the UF or whether it's OK to stay on an 8.0.x version. Thanks
Also, in future we would be decommissioning the indexer after I have sent the data to the SH, and then I will be sending data directly to the SH.
@SN1  If you must move indexed data from the indexer to the Search Head, you can copy the data files:

1. Stop Splunk on both the indexer and the Search Head.
2. Copy the index data directories from the indexer to the Search Head. Example: copy $SPLUNK_HOME/var/lib/splunk/<index_name> from the indexer to the same path on the Search Head.
3. Ensure file ownership, permissions, storage size, OS and Splunk versions are correct on the Search Head. Also make sure indexes.conf on the Search Head has configuration for the indexes you copied.
4. Start Splunk on the Search Head.

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos/Karma. Thanks!
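As an illustrative sketch of those steps (the sh-host name and the splunk service account are placeholders, and <index_name> is whatever index you are moving), the copy for a single index might look like:

# On both hosts: stop Splunk first
$SPLUNK_HOME/bin/splunk stop
# On the indexer: copy the index directory to the Search Head, preserving permissions
rsync -a $SPLUNK_HOME/var/lib/splunk/<index_name>/ sh-host:$SPLUNK_HOME/var/lib/splunk/<index_name>/
# On the Search Head: fix ownership, then start Splunk
chown -R splunk:splunk $SPLUNK_HOME/var/lib/splunk/<index_name>
$SPLUNK_HOME/bin/splunk start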
In the meantime, if you are wanting to move the existing data from your indexer to your SH, then stop Splunk on both servers and copy the full directory structure for each index (usually under $SPLUNK_DB, by default $SPLUNK_HOME/var/lib/splunk/<indexname>) from the old indexer to the new server. After copying, ensure Splunk points to the correct path for these indexes in indexes.conf on the new instance, then restart Splunk on the new instance for the data to be available. If there are no existing indexes with the same name on the new instance, you can simply copy the directories. Both source and destination should use the same OS and compatible Splunk versions, and don't copy buckets from newer Splunk versions to much older versions. If your SH is still set up to search your IDX then you should probably disconnect it at this point, as you may see duplicate data.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
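For the indexes.conf piece, a minimal stanza on the new instance might look like this. The index name myindex is a placeholder, and the paths shown are the Splunk defaults - match them to wherever you actually copied the data.

[myindex]
homePath   = $SPLUNK_DB/myindex/db
coldPath   = $SPLUNK_DB/myindex/colddb
thawedPath = $SPLUNK_DB/myindex/thaweddb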
Hi @SN1  I'm not 100% sure if I'm following what your requirements are here - which scenario is this?

1) You want to move existing stored data from your indexer to be stored on your SH, to turn it into an All-In-One?
2) Configure the indexer to forward new data as it arrives to the SH?
3) Move existing data *and* configure forwarding of new data to the SH?

Please let me know so we can provide a better response.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
OK, let me explain briefly: we are making our search head standalone. So now I want to send some important data, which is currently being stored and onboarded from the source on the indexer, to the Search Head. Is this clear?
@SN1  To clarify your scenario: do you want the Search Head to query the indexer for data as needed, or to have the data on the Search Head itself (for testing or some other reason) without using the indexer?
We have an index where the data is currently being stored and indexed on the indexer. Now I am making the Search Head standalone and I want to send the data from the indexer to the SH. How do I do it?
There could be a maximum of 3 versions per day.
We are currently using the Python for Scientific Computing app (v2.0.2) on an on-premises Linux instance, and planning to upgrade the app to the latest version, 4.2.3. When upgrading, should we just upload the app package through Splunk Web and tick the upgrade checkbox? Python for Scientific Computing (for Linux 64-bit) | Splunkbase
@a1bg503461  The error highlights that opt_ca_certs_path is not defined in your configuration. Are you using SSL/TLS with Elasticsearch?

If yes, make sure you set your .crt path in your config, e.g.:

opt_ca_certs_path = /path/to/your/ca.crt

If you are not using SSL/TLS, then try the below in your config:

use_ssl = 0
# opt_ca_certs_path =

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos/Karma. Thanks!
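Putting those together, a hedged sketch of the two possible configurations might look like the following. The stanza headers are placeholders - the real stanza name depends on which app or modular input you are using to reach Elasticsearch.

# With SSL/TLS, verifying against your CA
[your_es_input://example]
use_ssl = 1
opt_ca_certs_path = /path/to/your/ca.crt

# Without SSL/TLS
[your_es_input://example_plain]
use_ssl = 0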
@cdevoe57  As mentioned by @bowesmana, it's best not to use join, as it can sometimes cause fields from the lookup to be lost. But anyway, you can try the below if you still want to use join:

| inputlookup system_info.csv
| eval System_Name=System
| join type=left System_Name
    [ | search index=servers sourcetype=logs
      | stats latest(_time) as Time by System_Name
      | eval mytime=strftime(Time,"%Y-%m-%dT%H:%M:%S")
      | eval now_time = now()
      | eval last_seen_ago_in_seconds = now_time - Time ]
| stats values(*) as * by System_Name
| lookup system_info.csv System_Name OUTPUT Location Responsible
| eval MISSING = if(isnull(last_seen_ago_in_seconds) OR last_seen_ago_in_seconds>7200,"MISSING","GOOD")
| where MISSING=="MISSING"
| table System_Name Location Responsible MISSING

Or you can also try the below, without join:

index=servers sourcetype=logs
| stats latest(_time) as Time by System_Name
| eval last_seen_ago_in_seconds = now() - Time
| eval MISSING = if(isnull(last_seen_ago_in_seconds) OR last_seen_ago_in_seconds>7200, "MISSING", "GOOD")
| where MISSING=="MISSING"
| lookup system_info.csv System_Name OUTPUT Location Responsible
| table System_Name Location Responsible MISSING last_seen_ago_in_seconds
| sort -last_seen_ago_in_seconds

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos/Karma. Thanks!
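If the end goal is simply "lookup entries with no matching events", another pattern worth trying is a NOT subsearch, sketched below. The field names are assumptions carried over from the queries above (the lookup column renamed from System to System_Name, and events that carry a matching System_Name field) - adjust them to your actual data.

| inputlookup system_info.csv
| eval System_Name=System
| search NOT
    [ search index=servers sourcetype=logs
      | stats count by System_Name
      | fields System_Name ]
| table System_Name Location Responsible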
@caschmid  \d+ matches only digits, not any word. If "This is my" is always constant, you can try the below:

| rex field=_raw "This is my (?<string>\w+)"
| stats count by string

Regards,
Prewin
Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving a kudos/Karma. Thanks!
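To sanity-check the extraction without touching real data, you can run it against a throwaway event first; the sample text here is made up purely for illustration.

| makeresults
| eval _raw="This is my example"
| rex field=_raw "This is my (?<string>\w+)"
| stats count by string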
So let me start by saying I've been struggling with these lookup commands. My objective here is to use the lookup, as it contains all known servers, to find the servers that are not logging, including those that have yet to log. This modified query now gives me the other fields; however, the results are wrong. In the end I need to get the list of servers in the lookup that are not in the query results (index=servers sourcetype=logs).
Hello @heathramos ,  On the DNS server you can find possible ERROR logs around the issue by going to $SPLUNK_HOME/var/log/splunk and looking for the file named streamfwd.log. Please check the ERRORs and share them here so we can help you work out the possible causes.
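If that host's internal logs are also being indexed, you may be able to pull the same errors from Splunk itself instead of reading the file on disk. This is a sketch that assumes streamfwd.log is collected into _internal, which depends on your inputs:

index=_internal source=*streamfwd.log* ERROR
| table _time host _raw
| sort -_time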
Hello @Dallastek1 , The explanation given by @livehybrid is correct here. Based on the ERROR messages and screenshot, it appears that you are missing required permissions. Also make sure that you have granted the permissions as Application permissions and not Delegated ones, as Delegated permissions are not the correct type here.