All Posts



I'm getting thousands of log events that say: ERROR CMSlave [2549383 CMNotifyThread] - Cannot find bid=wineventlog~157~96ECF7C4-1951-4288-B90A-9133E5408F14. cleaning up usage data. It appears on all my indexers and references multiple, but not all, indexes. Any ideas on how to fix this error?
Hi @Real_captain

There isn't a built-in Splunk Web feature for this rotation. I would recommend using a browser extension such as "Tab Rotate" - these often have configurations like the amount of time and rate at which different tabs are rotated/reloaded. Alternatively, if you are using Classic dashboards, you could write some custom JavaScript, but I think you'll probably have better success, quicker, with an off-the-shelf browser extension.

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
Hi @siv,

Yes, you can add a field to an existing KV Store collection without directly editing collections.conf by using the Splunk REST API. See https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/kvstore/usetherestapitomanagekv/#:~:text=Define%20the%20collection%20schema%3A for more info, but I'll cover it briefly below.

You can use curl to add a new field named new_field_name of type string to a collection named my_collection within the my_app app context. You'll need to update the existing field definitions and include the new one in the payload.

First, get the current definition (optional but helpful):

curl -k -u admin:yourpassword \
  https://<serverName>:8089/servicesNS/nobody/my_app/storage/collections/config/my_collection

Then, POST the updated configuration, including all existing fields plus the new one:

curl -k -u admin:yourpassword \
  -X POST \
  https://<serverName>:8089/servicesNS/nobody/my_app/storage/collections/config/my_collection \
  -d 'field.existing_field1=string' \
  -d 'field.existing_field2=number' \
  -d 'field.new_field_name=string' # Add your new field here

This method requires appropriate permissions (specifically the capability to POST updates) to modify KV Store configurations via the REST API. Using the REST API effectively updates the configuration as if you had edited the collections.conf file, but does so remotely.

Documentation:
- KV Store REST API endpoints: https://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTkvstore
- Specifically, the collection endpoint: https://docs.splunk.com/Documentation/Splunk/latest/RESTREF/RESTkvstore#storage.2Fcollections.2Fconfig.2F.7Bcollection.7D

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
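The payload-merging step above (keep all existing field definitions, add the new one) can be sketched in Python. This is a minimal illustration only: the field names (existing_field1, new_field_name) are the same placeholders used in the curl example, not fields from a real collection, and the helper function is hypothetical.

```python
# Sketch: build the form body for the POST to
# /servicesNS/nobody/<app>/storage/collections/config/<collection>.
# Splunk expects keys of the form field.<name>=<type>.
from urllib.parse import urlencode

def build_schema_payload(existing_fields, new_field, new_type="string"):
    """Merge the new field into the existing definitions and
    return the url-encoded body for the REST POST."""
    fields = dict(existing_fields)   # keep all existing definitions
    fields[new_field] = new_type     # add the new one
    return urlencode({f"field.{name}": ftype for name, ftype in fields.items()})

payload = build_schema_payload(
    {"existing_field1": "string", "existing_field2": "number"},
    "new_field_name",
)
print(payload)
# field.existing_field1=string&field.existing_field2=number&field.new_field_name=string
```

You could then send that body with any HTTP client (the curl commands above do the same thing); the key point is that the POST must restate every existing field, not just the new one.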
Hi @siv, I don't think so. You could try to upload a file with the additional column, but I'm not sure. Ciao. Giuseppe
@kaushik3g @iamarkaprabha please help me on the similar issue- https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
@shivanshu1593 @jitbahan @Gowhar @dsainz please help me on the similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
@cpetterborg @anzianojackson @adri2915 @iamarkaprabha please help on this similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
@James_ACN @javo_dlg @deepdiver @tofa please help me on the similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
Is there an option to add a field to an existing KV Store without editing conf files? I don't own the server, so it's difficult to get access to it all the time.
@deepdiver @nkoppert_s please help me on the similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
@mikefg @RCavazana2023 @cybersecnutant @LeandroNTT please help me on the similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
@James_ACN @javo_dlg @deepdiver @tofa anyone please help on this similar issue - https://community.splunk.com/t5/Getting-Data-In/Akamai-add-on-logs-are-not-populating/m-p/743241#M118086
To resurrect an old thread: I was having this very same issue today and have a suspicion about why the option wasn't showing. I have two clustered environments, one Lab and one Production. The Production SHs show the option to install from file; the Lab does not. In the Lab, I was experimenting with installing apps in a one-off scenario and now the option is gone, but in Production, where I've only pushed apps by deployment, the option is still there.

tl;dr: Because I tried to install a one-off app directly on an SH in the cluster, it seems to have removed the option to install further apps per SH? Anyone seen similar?
Hi Team,

Is it possible to switch dashboards at a regular interval within the same app? I have around 15 dashboards in the same app and I want to switch to the next dashboard every 2 minutes.

In the attached screenshot, I have around 15 dashboards and the home screen is always the "ESES Hotline SUMMARY" dashboard. Is it possible to move automatically to the next dashboard, "ESES Hotline", after 2 minutes, then to the next dashboard, "EVIS Application", after another 2 minutes, and so on?
Hi Team,

Can you please let me know how to fetch events with a time greater than the time of the first event in the dashboard?

Example: I have 3 jobs executed every day at around the following times:
Job1: around 10 PM (day D)
Job2: around 3 AM (day D + 1)
Job3: around 6 AM (day D + 1)

I am fetching the latest run of Job1/Job2/Job3 to show in the dashboard and want the result in the below format.

If we are between 5 PM and 10 PM:
Job1: PLANNED
Job2: PLANNED
Job3: PLANNED

If we are at 11 PM:
Job1: Executed at 10:00
Job2: PLANNED
Job3: PLANNED

If we are at 4 AM:
Job1: Executed at 10:00
Job2: Executed at 03:00
Job3: PLANNED

If we are at 7 AM:
Job1: Executed at 10:00
Job2: Executed at 03:00
Job3: Executed at 06:00

If we are at 4 PM:
Job1: PLANNED
Job2: PLANNED
Job3: PLANNED

If we are at 5 PM:
Job1: PLANNED
Job2: PLANNED
Job3: PLANNED

We want to consider the start of the day at 5 PM and the end at 5 PM the next day, instead of using last 24 hours / today.
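The "day starts at 5 PM" logic described above can be sketched as follows. This is an illustrative Python model of the stated 5 PM to 5 PM window, not SPL; in a real dashboard you would express the same window with eval/relative_time on _time. The job_status helper is hypothetical, and the job names and times are taken from the post.

```python
# Model: a custom "day" runs from 17:00 to 17:00 the next calendar day.
# A job shows "Executed at HH:MM" only if its last run falls in the
# current window; otherwise it is PLANNED.
from datetime import datetime, timedelta
from typing import Optional

DAY_START_HOUR = 17  # the custom day starts at 5 PM

def window_start(now: datetime) -> datetime:
    """Start of the custom day containing `now`."""
    start = now.replace(hour=DAY_START_HOUR, minute=0, second=0, microsecond=0)
    if now < start:
        start -= timedelta(days=1)
    return start

def job_status(now: datetime, last_run: Optional[datetime]) -> str:
    """PLANNED unless the job's last run falls inside the current window."""
    if last_run is not None and window_start(now) <= last_run <= now:
        return f"Executed at {last_run:%H:%M}"
    return "PLANNED"

# Example: at 4 AM on day D+1, Job1 (10 PM on day D) and Job2 (3 AM) have run.
now = datetime(2024, 1, 2, 4, 0)
print(job_status(now, datetime(2024, 1, 1, 22, 0)))  # Executed at 22:00
print(job_status(now, datetime(2024, 1, 2, 3, 0)))   # Executed at 03:00
print(job_status(now, None))                         # PLANNED
```

The key design point is anchoring the window: if the current time is before 17:00, the window started at 17:00 yesterday; otherwise it started at 17:00 today.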
Hi @zafar, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Hi @Treize, because a summary index used in a main search has no limit on the number of results. Ciao. Giuseppe
Hi, I have the below table and have been trying to represent it in a heat map. I can see the percentage value in the blocks, but how can I get the severity values (very high, high, medium) represented in the blocks?
In this situation, it could mean one of two things. The first is that you're trying to use a cert chain and there is already a single cert in idpCert.pem; some IdPs, like Ping, require you to remove that idpCert.pem. However, the more likely case here is that you have multiple single certs attached to your IdP metadata.xml file. Some IdPs, such as ADFS and Azure (Entra), allow for primary and secondary IdP certs, which enable a seamless transition from expiring to new certs. However, Splunk does NOT accept two single certs in one metadata.xml file. Hence, the solution here is as follows:

1. On the IdP, replace the expiring cert with the new cert
2. Disable the secondary cert option
3. Download the new metadata.xml file
4. Upload the IdP metadata.xml file via Splunk UI > Save

Footnote: Splunk DOES accept cert chains, but they have to be manually uploaded and in the correct order, as per the KB below:
https://community.splunk.com/t5/Deployment-Architecture/Problem-with-SAML-cert-quot-ERROR-UiSAML-Verification-of-SAML/m-p/322376#M12073
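A quick way to confirm the dual-cert case described above is to count the X509Certificate elements in the IdP metadata.xml. A minimal sketch, assuming standard SAML metadata namespaces; the sample XML fragment and its "MIIB..." cert values are illustrative placeholders, not real ADFS/Entra output.

```python
# Count ds:X509Certificate elements in a SAML metadata document.
# More than one usually means primary + secondary IdP certs are both
# attached, which Splunk will reject.
import xml.etree.ElementTree as ET

DS_NS = "http://www.w3.org/2000/09/xmldsig#"

def count_idp_certs(metadata_xml: str) -> int:
    root = ET.fromstring(metadata_xml)
    # Find every ds:X509Certificate anywhere in the document
    return len(root.findall(f".//{{{DS_NS}}}X509Certificate"))

sample = f"""<EntityDescriptor xmlns="urn:oasis:names:tc:SAML:2.0:metadata"
    xmlns:ds="{DS_NS}" entityID="https://idp.example.com">
  <IDPSSODescriptor protocolSupportEnumeration="urn:oasis:names:tc:SAML:2.0:protocol">
    <KeyDescriptor use="signing"><ds:KeyInfo><ds:X509Data>
      <ds:X509Certificate>MIIB...primary</ds:X509Certificate>
    </ds:X509Data></ds:KeyInfo></KeyDescriptor>
    <KeyDescriptor use="signing"><ds:KeyInfo><ds:X509Data>
      <ds:X509Certificate>MIIB...secondary</ds:X509Certificate>
    </ds:X509Data></ds:KeyInfo></KeyDescriptor>
  </IDPSSODescriptor>
</EntityDescriptor>"""

n = count_idp_certs(sample)
print(n)  # 2 -> two single certs attached; disable the secondary on the IdP
```

If the count is 2 or more, follow the steps above: drop the secondary cert on the IdP side, then re-download and re-upload the metadata.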
Hi @msmadhu,

Splunk does not offer native clustering for Deployment Servers (DS) in the same way it does for indexers or search heads. High availability (HA) for the DS is typically achieved using external components (typically a load balancer). The common approach involves:
- Setting up two (or more) independent Deployment Server instances.
- Using a network load balancer (LB) in front of these instances.
- Configuring deployment clients (Universal Forwarders) to point to the virtual IP (VIP) or hostname of the load balancer.
- Keeping the configuration ($SPLUNK_HOME/etc/deployment-apps/) synchronized between the DS instances. This can be done manually, via scripting, or using shared storage (e.g., NFS), though shared storage requires careful implementation.

This architecture relies on external systems (load balancer, potentially shared storage or sync scripts) and requires careful management to ensure configurations are consistent across DS instances. You can find guidance on HA strategies in the Splunk Enterprise Admin Manual: Protect against loss of the deployment server

Did this answer help you? If so, please consider:
- Adding kudos to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
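The "keep the DS instances synchronized" step above can be verified with a simple checksum comparison of the two deployment-apps trees. A minimal sketch: the directory layout and app.conf contents in the demo are hypothetical stand-ins for two DS copies of $SPLUNK_HOME/etc/deployment-apps/, which in practice you would reach over NFS or after an rsync pull.

```python
# Compare two directory trees by per-file SHA-256 and report any paths
# that differ or exist on only one side (i.e., the DS copies are out of sync).
import hashlib
import tempfile
from pathlib import Path

def dir_checksums(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def diff_dirs(a: Path, b: Path) -> set:
    """Relative paths that differ between the two trees."""
    ca, cb = dir_checksums(a), dir_checksums(b)
    return {k for k in ca.keys() | cb.keys() if ca.get(k) != cb.get(k)}

# Demo with two temp dirs standing in for the two DS instances
ds1, ds2 = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
(ds1 / "app1").mkdir()
(ds2 / "app1").mkdir()
(ds1 / "app1" / "app.conf").write_text("[install]\nstate = enabled\n")
(ds2 / "app1" / "app.conf").write_text("[install]\nstate = disabled\n")
print(diff_dirs(ds1, ds2))  # {'app1/app.conf'} -> out of sync
```

An empty result set means the two DS instances are serving identical app content; any non-empty result is a sync (or rsync schedule) problem to chase down before clients fail over between them.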