All Posts

I am looking to add text as well. I am trying to add a tooltip string but haven't had any luck.
Hi @MK3, sorry, but there's some confusion in your question: to forward data from Forwarders to Splunk Enterprise you have to follow the instructions at: https://docs.splunk.com/Documentation/SplunkCloud/latest/Forwarding https://docs.splunk.com/Documentation/Splunk/9.3.0/Data/Forwarddata To forward data you need outputs.conf, which can be in $SPLUNK_HOME/etc/system/local or in a dedicated app. To take in logs, you need inputs.conf, which lives in the same folder. props.conf and transforms.conf are in the same folder, but usually aren't relevant on Forwarders (if Universal). $SPLUNK_HOME is the folder where you installed Splunk; by default it's C:\Program Files\Splunk on Windows and /opt/splunk on Linux. You cannot send indexed data from a Heavy Forwarder, because it doesn't index data, but maybe you mean cooked data: you can send cooked (or uncooked) data to a third party using syslog. To send data to an external database you must use DB Connect on Search Heads, but that's a different thing. Ciao. Giuseppe
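For reference, a minimal outputs.conf sketch for pointing a forwarder at an indexer might look like the following; the group name and host:port are placeholders for illustration, not values from this thread:

# $SPLUNK_HOME/etc/system/local/outputs.conf (or in a dedicated app)
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = indexer.example.com:9997

The indexer must be configured to receive data on the same port; 9997 is the conventional default.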
Hello, as per https://docs.splunk.com/Documentation/Splunk/9.3.0/Forwarding/EnableforwardingonaSplunkEnterpriseinstance, where are files like outputs, props, and transforms stored? I am using Splunk Web (Enterprise). Also, where is my $SPLUNK_HOME? I am trying to set up heavy forwarding to send indexed data to a database on a schedule. Thanks.
Still the same error. Also, our application has multiple JARs.
Hi @Easwar.C, Can you confirm if the latest reply helped answer your post or not? If not, reply back and keep the conversation going. 
I'm using the Splunk TA for Linux to collect server logs. Some background: looking in the "_internal" log I am seeing a lot of these errors:

08-23-2024 15:52:39.910 +0200 WARN DateParserVerbose [6460 merging_0] - A possible timestamp match (Wed Aug 19 15:39:00 2015) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13275
08-23-2024 15:52:39.646 +0200 WARN DateParserVerbose [6460 merging_0] - A possible timestamp match (Fri Aug 7 09:08:00 2009) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13418
08-23-2024 15:52:32.378 +0200 WARN DateParserVerbose [6506 merging_1] - A possible timestamp match (Fri Aug 7 09:09:00 2009) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13338

This is slightly confusing and somewhat problematic, as the "lastlog" data is collected not through a filewatch but from scripted output. The "lastlog" file itself is not collected/read, and a stat check on the file confirms accurate dates. However, this is not the source of the problem. I cannot see anything in the output from the commands in the script (Splunk_TA_nix/bin/lastlog.sh) which would indicate the presence of a "year"/timestamp. The indexed log does not contain a "year", and the actual _time timestamp is correct. These "years" in "_internal" are also from a time when the server was not running/present, so they are not collected from any actual source "on the server".

And the questions:
- Why am I seeing these errors?
- Where are these problematic "timestamps" generated from?
- How do I fix the issue?

All the best
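For anyone hitting the same warnings: one hedged option, if _time should simply be the collection time for this scripted input, is to tell Splunk to ignore embedded dates for that sourcetype. This is a sketch, and the sourcetype name below is an assumption based on the source shown in the warnings:

# props.conf on the parsing tier (sourcetype name assumed)
[lastlog]
DATETIME_CONFIG = CURRENT

Alternatively, MAX_DAYS_AGO can be raised in the same stanza if the old dates are meaningful and should become _time.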
Hi @Jeffrey.Escamilla, It looks like the community has not yet jumped in to reply. Have you happened to find a solution or more information you can share? If you still need help with this question, you can reach out to your CSM or contact AppDynamics Support: AppDynamics is migrating our Support case handling system to Cisco Support Case Manager (SCM). Read on to learn how to manage your cases.  If you get an answer, it would be helpful if you could come back and share it here. 
Hello, I want to create a dataset for Machine Learning. I want the KPI name and Service Health Score as field names and their values as values for the last 14 days. How do I retrieve the kpi_value and health_score values? Are they stored somewhere in an ITSI index? I cannot find a kpi_value field in index=itsi_summary. #predictive analytics #machine learning #splunk it #Splunk Machine Learning Toolkit #Splunk ITSI. Also, if you have done Machine Learning / Predictive Analytics in your environment, please suggest an approach.
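Not a definitive answer, but a minimal SPL sketch of the kind of extraction often used here. It assumes the itsi_summary events carry the KPI value in alert_value, the KPI name in kpi, and that service health score events have a kpiid beginning with "SHKPI-"; verify these field names in your environment first:

index=itsi_summary earliest=-14d
| eval metric_name=if(like(kpiid, "SHKPI-%"), "ServiceHealthScore", kpi)
| bin _time span=1h
| stats avg(alert_value) AS kpi_value BY _time, metric_name

From there the output can be pivoted or written to a lookup with outputlookup as a training dataset.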
Hi @Sarath Kumar.Sarepaka, I was trying to find more information for you, but could not. Since the Community did not jump in, you can contact AppDynamics Support, or you can reach out to your AppD Rep/CSM. AppDynamics is migrating our Support case handling system to Cisco Support Case Manager (SCM). Read on to learn how to manage your cases.
I asked ChatGPT; the answer is below. So I don't think there is an easy way, like modifying a conf file, to resolve this issue.

As of the latest available information, there are no widely recognized third-party solutions or community-contributed add-ons specifically tailored for Splunk to collect logs from Azure China. Most existing add-ons, including the official *Splunk Add-on for Microsoft Cloud Services*, are designed for the global Azure environment and may require customization to work with Azure China.

### Options and Workarounds:

1. **Customization of Existing Add-ons**: You can manually modify the Splunk Add-on for Microsoft Cloud Services to point to the Azure China endpoints by editing the configuration files directly. This is the most common workaround but requires technical know-how to ensure compatibility and proper data collection.

2. **Custom Scripts**: If modifying existing add-ons is too complex or not feasible, you can create custom scripts using Azure SDKs (like the Python SDK) to pull data from Azure China and forward it to Splunk using the HTTP Event Collector (HEC); a hedged sketch follows this post.

3. **Using REST API**: Another approach is to use the Splunk Add-on for REST APIs to interact directly with Azure China's API endpoints. This method gives you the flexibility to collect any data available via the Azure China REST API.

4. **Community Forums and Contributions**: While specifically tailored add-ons for Azure China are not available, you may find discussions or shared configurations on the [Splunk Community Forums](https://community.splunk.com/) or other community-driven platforms like GitHub, where users may have shared their custom solutions.

### Keeping Up-to-Date:

It's recommended to regularly check Splunkbase and participate in community discussions to stay updated on any new add-ons or tools that might become available for Azure China. For more details, you can visit [Splunkbase](https://splunkbase.splunk.com/) and the [Splunk Community](https://community.splunk.com/).
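As a rough illustration of option 2, a minimal Python sketch for forwarding an event to HEC might look like the following. The endpoint URL, token, and sourcetype are placeholders, and the Azure-side collection is deliberately left out:

# Minimal sketch: send one event to a Splunk HEC endpoint.
# HEC_URL and HEC_TOKEN are placeholders, not real values.
import json
import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # hypothetical
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"  # replace with a real token

def send_event(event: dict, sourcetype: str = "azure:china:custom") -> None:
    """POST a single event payload to the HEC event endpoint."""
    payload = {"event": event, "sourcetype": sourcetype}
    resp = requests.post(
        HEC_URL,
        headers={"Authorization": f"Splunk {HEC_TOKEN}"},
        data=json.dumps(payload),
        timeout=10,
        verify=True,  # point this at a CA bundle if using internal certs
    )
    resp.raise_for_status()  # surface HTTP errors instead of failing silently

if __name__ == "__main__":
    send_event({"message": "hello from an Azure China collector"})

A real collector would loop over data pulled via the Azure SDK and batch events, but the HEC call itself stays this simple.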
https://community.splunk.com/t5/All-Apps-and-Add-ons/Is-the-Splunk-Add-on-for-Microsoft-Cloud-Services-or-Splunk-Add/m-p/646898 I have tried and failed. When asking for support, they replied that there is no official support for Azure China. So this issue has not been resolved yet.
What do we use for the Base URL when configuring the app's Add-on Settings? Should this be left at the default of slack.com/api?
Authentication datamodel
Hi @vid1, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Authentication datamodel. That macro is not there in my macros list.
Identifying the correct sourcetype removed only one part of the header; however, it still does not remove the priority and the other part of the header. I had already tried that. Do you have any other solutions? Thank you, Giulia
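In case it helps others with the same symptom: one common approach (a sketch, not a confirmed fix for this thread's data) is to strip the syslog priority at parse time with a SEDCMD in props.conf; the sourcetype name below is a placeholder:

# props.conf on the parsing tier (sourcetype is a placeholder)
[your:syslog:sourcetype]
SEDCMD-strip_pri = s/^<\d+>//

Note that SEDCMD only affects data indexed after the change; already-indexed events keep their header.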
Hi @vid1, usually in DataModels there's a macro to restrict the indexes used in the population activity. You can check this macro in the DataModel constraints. Either this macro isn't present in your environment, or you don't have the permissions to use it. Search for this macro: if present, check the permissions; if not present, create it or remove it from the DataModel constraints. Which DataModel are you speaking of? Ciao. Giuseppe
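As an illustration of the "create it" option, a minimal macros.conf sketch for the macro named in the error below (isilon_index) follows; the index name is an assumption and must be changed to wherever the relevant data actually lives:

# $SPLUNK_HOME/etc/apps/<app>/local/macros.conf
[isilon_index]
definition = index=isilon

The macro also has to be shared with (at least) the app that runs the datamodel, or globally, so the permissions problem doesn't recur.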
IMO, the best version of Python to use is the one that comes with Splunk.  That's either 3.7 or 3.9, depending on your Splunk version. Use the Splunk-provided interpreter with the command splunk cmd python
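For example, to run a standalone script with the bundled interpreter (my_script.py is a hypothetical name):

$SPLUNK_HOME/bin/splunk cmd python my_script.py

Invoking it this way runs the script in the same environment Splunk itself uses, including its bundled Python libraries.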
I am facing the error below while running a datamodel: The search job has failed due to err='Error in 'SearchParser': The search specifies a macro 'isilon_index' that cannot be found.'