All Posts

Hi @chenfan , the impact on the license is null, because you pay the license based on the logs that are indexed daily, so it will probably stay the same. About features: there are many additional features in the new Splunk version; you can read the links I shared to see the new features and the removed ones. Pay very close attention to the migration path and follow every step (even if it's very long!), because between versions 7 and 9 there were many structural changes (Python, MongoDB, HTML, etc.). Then you also have to upgrade all the apps, because some old app versions aren't compatible with the new Splunk version. Also remember that there's an order in upgrading: Cluster Manager, Search Heads, Indexers, other Splunk servers (e.g. Deployment Server or Monitoring Console), Heavy Forwarders, Universal Forwarders; and this order must be maintained for each upgrade level (7->8 all the steps, then 8->9 all the steps). Last hint: plan all the steps in a document to be sure that you aren't forgetting any. As I said, it will be a very long job, and it could be a good idea to engage a certified Splunk Architect in the design phase and eventually also in the execution phase. Ciao. Giuseppe
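The tier order above can be kept handy as a small shell helper while writing the upgrade plan (a sketch only; the tier names come from this post, not from any Splunk CLI):

```shell
# Print the tier-by-tier upgrade order; repeat the full sequence for each
# major hop (7.x -> 8.x, then 8.x -> 9.x).
upgrade_order() {
  printf '%s\n' \
    "Cluster Manager" \
    "Search Heads" \
    "Indexers" \
    "Other Splunk Servers (Deployment Server, Monitoring Console)" \
    "Heavy Forwarders" \
    "Universal Forwarders"
}
upgrade_order
```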
Hi @gcusello  I am very confused: if we upgrade Splunk Enterprise from version 7.x.x to version 9.x.x, what impact will it have on the license? And will it affect the use of any features?
Hi @AstinSebastian  I have recently had to wait 4 weeks for my Splunkbase submissions to be reviewed. This is typically due to manual_checks; when I ran AppInspect against your app, it looks like there are 2 manual checks to be done:

Security vulnerabilities - Check for insecure HTTP calls in Python.
MANUAL_CHECK: Possible insecure HTTP Connection. Match: requests.get Positional arguments, ["?"]; Keyword arguments, {"timeout": "?"} File: bin/wmi_exp.py Line Number: 38
MANUAL_CHECK: Possible insecure HTTP Connection. Match: requests.get Positional arguments, ["?"]; Keyword arguments, {"timeout": "?"} File: bin/wmi_exp051224_working.py Line Number: 23

This will get assigned to a Splunk engineer who manually vets the code and then will either pass or fail it; you will get an email notification once this has been completed. Did you know that you have also included "wmi_exp051224_working.py" in your app within the /bin directory? This has a hard-coded Windows path for a config_file variable passed to the open() function; this might also cause a manual check, and a potential failure, because hard-coded Windows paths are not compatible with Cloud. Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
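Before submitting, a quick local scan can surface these findings ahead of AppInspect. A rough approximation (the `bin/` path and `APP_DIR` variable are placeholders, and this only matches literal plain-http URLs, not everything AppInspect flags):

```shell
# Look for plain "http://" URLs in an app's Python files; any hit is a
# candidate for the insecure-HTTP manual check. Not a substitute for
# actually running AppInspect.
scan_insecure_http() {
  local dir="${1:-bin}"
  grep -rn --include='*.py' 'http://' "$dir" 2>/dev/null \
    || echo "no plain-http URLs found"
}
scan_insecure_http "${APP_DIR:-bin}"
```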
Hello, I am sorry. I tried many ways, but I was completely looking at examples that did not help me. This indeed solved the question. Thank you for your help. Harry
Hi @elizabethl_splu, Good day, is this implemented already? I have a requirement to hide the job inspect, full screen, and refresh options which come up upon mouse hovering on icons/single values. I was able to hide search and export using this doc: Apply view mode settings for dashboards, but I can't find any doc related to hiding the other options. Thanks in advance!
Hi @Sathish28 , there's probably a little error in your question: the latest version of ES is 8.x; there isn't any 9.x version (for now), so 9.0.3 is probably the Splunk Enterprise version. Then, did you check the resources on the physical machine? First, whether they are sufficient; and anyway, if they are different, you have to change some configuration in Splunk, e.g. the number of concurrent searches. Ciao. Giuseppe
Hi @larrydavid , the easiest approach is to create a lookup (eventually an automatic one!) containing the combinations of apps and hosts that define the environments, so you can use the lookup in your searches. Something like this:

environment  app   host
env1         app1  host1
env1         app1  host2
env1         app1  host3
env2         app2  host4
env2         app2  host5
env2         app2  host6
env3         app3  host7
env3         app3  host8
env3         app3  host9

One additional question: if each application uses some servers and there's a 1:n relation between apps and hosts, why don't you use only apps to define your environments? Also, remember that there's the IN() operator to use instead of OR: source=*app1.log host IN (host1,host2,host3,host4) - it's shorter! Ciao. Giuseppe
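If you want Splunk to add the environment field automatically, the lookup can be wired up roughly like this. All file, stanza, and sourcetype names below are illustrative assumptions, not from the original post:

```
# lookups/environments.csv
environment,app,host
env1,app1,host1
env1,app1,host2
env2,app2,host4

# transforms.conf
[environments_lookup]
filename = environments.csv

# props.conf - automatic lookup on your sourcetype
[your:sourcetype]
LOOKUP-environment = environments_lookup host OUTPUT environment
```

With the automatic lookup in place, the environment field is available directly at search time, e.g. source=*app1.log environment=env1.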
Hi @anooshac  If you want to run this on a schedule then you might want to look at putting this into a Bash script and running it as a cronjob. Once you have a working curl command, add it to a bash script, ensure it is executable (chmod +x), and then add it to your user's cron (crontab -e). To run hourly you would use something like 1 * * * *, which would run at 1 minute past each hour. This assumes you are running a Linux system. Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
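A minimal sketch of such a wrapper script, assuming the output directory and the commented curl line are placeholders to be replaced with your tested export command:

```shell
#!/bin/bash
# run_export.sh - hourly export wrapper. Replace the placeholder line below
# with your working curl command; OUT_DIR is an assumed location.
set -euo pipefail

OUT_DIR="${OUT_DIR:-/tmp/splunk_exports}"
mkdir -p "$OUT_DIR"
OUT_FILE="$OUT_DIR/export_$(date +%Y%m%d_%H%M).csv"

# e.g.: curl -sk -u user:pass "https://localhost:8089/services/search/jobs/export" \
#         --data-urlencode search='search index=main | head 100' \
#         -d output_mode=csv > "$OUT_FILE"
echo "placeholder export" > "$OUT_FILE"

echo "wrote $OUT_FILE"
```

Make it executable with chmod +x run_export.sh, then add a line like `1 * * * * /path/to/run_export.sh` via crontab -e to run it at one minute past every hour.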
Hi @KKuser  If you are running Splunk Cloud then you might find you have multiple SHs; this would mean the addresses are something like es-<stackName>.splunkcloud.com and itsi-<stackName>.splunkcloud.com - in this example they are part of the same deployment. However, there are other ways that Splunk deployments can be configured and connected, such as multiple SHs/SHCs as search peers on a single-site or multisite cluster if on-premise. These SHs can be independent of each other but ultimately connect to the same indexers. You can also set up federated search between different instances so they can search the same data. Either way, in these cases users are typically configured independently. It would be good to understand what you are trying to do, or what information you're trying to pull together, along with any other info you have (e.g. is this a Splunk Cloud or on-premise deployment)? Then I might be able to tailor the advice further. Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
@daniedoe  You're correct. The splunkdConnectionTimeout setting in web.conf primarily affects how Splunk Web (UI) interacts with splunkd. For direct REST API calls made to splunkd on port 8089, the timeout behavior can be different. If you need more detailed information, you can refer to these resources: Solved: How do I change the REST API execution timeout? - Splunk Community; Access endpoint descriptions - Splunk Documentation
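For direct calls to splunkd, the client controls its own timeout. A hedged sketch (host, port, credentials, and the search string below are all placeholders):

```shell
# Direct splunkd REST call with a client-side cap via curl's --max-time;
# this is independent of web.conf's splunkdConnectionTimeout, which only
# governs how Splunk Web talks to splunkd.
splunkd_export() {
  local timeout="${1:-300}"   # seconds allowed for the whole request
  curl -sk -u admin:changeme --max-time "$timeout" \
    "https://localhost:8089/services/search/jobs/export" \
    --data-urlencode search='search index=_internal | head 5' \
    -d output_mode=json
}
```

The function is only defined here, not invoked, since it requires a reachable splunkd instance.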
Hi @kiran_panchavat , adding a bit of information to the perfect answer of @kiran_panchavat: it's always a best practice to save all the customizations that you make in ES in a custom app, e.g. custom field extractions, custom correlation searches, dashboards or reports, or, as in your case, macros: don't leave anything custom in the Enterprise Security app (or the other modules). Ciao. Giuseppe
Hi @SN1  I would recommend running the following on your old SH to find out where the macro is easily:   /opt/splunk/bin/splunk btool macros list MacroName --debug   Replace MacroName with the name of your missing macro - this should output the configuration of the macro and include the path where the macro resides. If you still do not see the macro there then it could be a private knowledge object. Did you copy your users' custom data from /opt/splunk/etc/users as well? Did you copy all the apps from the old SH to the new SH? Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards Will
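If the macro turns out to be a private knowledge object, it lives under etc/users rather than in an app. A small helper to look for it (the default /opt/splunk path and the MacroName argument are assumptions):

```shell
# Search every user's private macros.conf for a given macro stanza and print
# the file(s) that define it. $SPLUNK_HOME defaults to /opt/splunk.
find_private_macro() {
  local name="$1" home="${SPLUNK_HOME:-/opt/splunk}"
  grep -rln "^\[$name\]" "$home"/etc/users/*/*/local/macros.conf 2>/dev/null || true
}
find_private_macro MacroName
```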
Hi @Cheng2Ready , if you need to exclude only the days following holidays, your approach is correct. If instead you need to exclude both the holidays and the days following them, you have to implement a mix of the two solutions with both checks. Let us know if we can help you more, or, please, accept one answer for the other people in the Community. Ciao and happy splunking Giuseppe P.S.: Karma points are appreciated by all the contributors
@AstinSebastian  All apps uploaded to Splunkbase are automatically added to the Cloud Vetting queue. However, due to a significant backlog, apps that are not associated with customer installation requests and require manual vetting may remain in a pending state for an extended period. On the other hand, if your app has no manual_check results and no errors or failures, it can be automatically marked as compatible with one or both Splunk Cloud architectures, effectively bypassing the manual vetting queue. Of course, if your app encounters any errors or failures, you will receive an automated failure notification.
@KKuser It appears that you might be operating two separate Splunk Cloud instances. Please have a look: https://community.splunk.com/t5/Deployment-Architecture/Search-Head-on-Splunk-Cloud/m-p/204981
How to find out whether both Splunk instances are connected or not?
Hi @livehybrid , Thanks for the reply. Is there any way that I can schedule this export? I have a tool which is scheduled to run every hour.
@SN1 When migrating from an old search head to a new one, it's essential to ensure that all configurations, including macros, are correctly transferred. However, if you're encountering issues such as missing macros after the migration, it indicates that some components may not have been properly moved. To address this, I recommend reaching out to Splunk Support for personalized assistance.
@SN1  Locate Macros in the Old Search Head From the Splunk UI: Navigate to Settings > Advanced Search > Search Macros