All Posts


Hi, I think this answer should also work in Splunk Cloud: https://community.splunk.com/t5/Alerting/How-do-you-disable-enable-alerts-via-the-REST-API/m-p/441558 Just change the server URL to the correct one and make sure that the REST API is enabled on your stack. r. Ismo
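For reference, a minimal sketch of that REST call, assuming token authentication and management port 8089; the stack host name, owner, app, and saved search name are placeholders (URL-encode the search name if it contains spaces):
# disable the alert (saved search)
curl -k -H "Authorization: Bearer <TOKEN>" \
  https://<your-stack>.splunkcloud.com:8089/servicesNS/<owner>/<app>/saved/searches/<savedsearch_name> \
  -d disabled=1
# enable it again
curl -k -H "Authorization: Bearer <TOKEN>" \
  https://<your-stack>.splunkcloud.com:8089/servicesNS/<owner>/<app>/saved/searches/<savedsearch_name> \
  -d disabled=0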
Hi @Pranitkolhe ... The Splunk documentation has a nice page on this: https://docs.splunk.com/Documentation/Splunk/9.1.1/Security/ConfigureSplunkforwardingtousesignedcertificates
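For orientation, a rough sketch of the kind of settings that page walks through (the certificate paths, names, and output group below are placeholders, and details vary by version, so follow the linked documentation for your setup):
# forwarder: outputs.conf
[tcpout:my_indexers]
server = <indexer-host>:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/myClientCert.pem
sslPassword = <certificate key password>
sslVerifyServerCert = true

# indexer: inputs.conf
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/myServerCert.pem
sslPassword = <certificate key password>
requireClientCert = true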
Hi @Omar, I'm a Community Moderator in the Splunk Community. This question was posted 3 years ago, so it might not get the attention it needs for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post. Thank you!
Hi, if I have understood right, those two are the same; Entra is just the new name for the Azure AD service. Microsoft Entra ID is the new name for Azure AD. The names Azure Active Directory, Azure AD, and AAD are replaced with Microsoft Entra ID. Microsoft Entra is the name for the product family of identity and network access solutions, and Microsoft Entra ID is one of the products within that family. r. Ismo
Hi, there was a presentation about this at the last .conf. You can find it at https://conf.splunk.com/watch/conf-online.html?search.event=conf23&search=SEC1936B#/ r. Ismo
Hi, predict needs time series data to make a forecast, and it also needs enough data points to do so. Based on your need, you should/could select the algorithm and other needed parameters yourself, or just use predict with a field list, like:
index=_internal source=*/var/log/splunk/*.log
| timechart count by sourcetype
| fields splunkd splunkd_access
| predict splunkd splunkd_access
Could you share your current data (inside a "</>" block)? r. Ismo
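If you need more control over the forecast, a minimal sketch with explicit predict options (the algorithm, span, and horizons below are placeholders to illustrate the syntax, not a recommendation for your data):
index=_internal source=*/var/log/splunk/*.log
| timechart span=1h count by sourcetype
| fields _time splunkd splunkd_access
| predict splunkd splunkd_access algorithm=LLP5 future_timespan=24 holdback=12
``` LLP5 accounts for seasonality; future_timespan is how many spans to forecast
ahead, and holdback leaves the last N points out of the model so you can compare
predictions against actuals ```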
Hi, I have a list of events that contain a customer ID. I'm trying to detect when I have a sequence of events with incremental changes to the ID. Example:
- event A - ID0
- event B - ID1
- event C - ID2
- event D - ID3
I might have other events between these increments that could have unrelated IDs (i.e. event A ID0 - event H ID22, event B ID1). I've tried using | streamstats current=f last(CustomerID) as prev_CustomerID | eval increment = CustomerID - prev_CustomerID but without any luck. Do you guys know a way this could be achieved?
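For reference, the attempted approach written out end to end — a sketch only, assuming CustomerID is numeric and events are ordered by time; the index and sourcetype are placeholders, and this still flags any +1 change without grouping the increments into sequences:
index=<your_index> sourcetype=<your_sourcetype>
| sort 0 _time
| streamstats current=f last(CustomerID) as prev_CustomerID
| eval increment = CustomerID - prev_CustomerID
| where increment=1
``` keeps only events whose CustomerID is exactly one higher than the previous event's ```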
Sure. Attached are the valueCount and Pct, and also the number of events: 1,380,350 events.
Hi, I cannot see any issues in the Release Notes. Have you already opened a support ticket with Splunk? If not, could you do it so we can get this listed as a known issue? r. Ismo
Hello everyone, I'm using Splunk DB Connect to get data from a DB. I get three values from the DB, as follows:
- event ID
- JSON data
- creation date of the event
Here is the result. How can I remove the "rawjson=" prefix to be able to get this data in JSON format? Regards,
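One possible direction, sketched as a guess from the description (the field name json_data and the assumption that the JSON sits in _raw after "rawjson=" are mine, not confirmed):
| rex field=_raw "rawjson=(?<json_data>\{.*\})"
| spath input=json_data
``` pulls the text after rawjson= into json_data and parses it as JSON;
adjust the rex if the payload is not wrapped in curly braces ```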
@deodeshm the initial way to test connectivity with any app is to press the "Test Connectivity" button. Whilst this will not send an email, it will test that the path/capability is available for when it needs to send a full email. Additionally, you could run the `send_email` action manually to test that it sends a proper email. -- Hope this helped. If so, please mark it as a solution for future enquirers. Happy SOARing! --
I met the same error during the DB Connect setup. The solution that worked in my situation was to check $SPLUNK_HOME/etc/apps/splunk_app_db_connect/metadata/local.meta. There are definitions for identities and db_connections. Make sure they say "export = system" instead of "export = none". It should look like this:
[identities/xxxxxx]
access = read : [ * ], write : [ admin, db_connect_admin ]
export = system
owner = xxxxxx
......
[db_connections/xxxxxx]
access = read : [ * ], write : [ admin, db_connect_admin ]
export = system
owner = xxxxxx
......
Hi, if I recall right, this needs only a git client. Basically this means that you could use it with (almost?) any git server which supports basic git commands.
## What is required for this application to work with a remote git repository?
The following assumptions are made:
- git is accessible on the command line; this has been tested on Linux & Windows with git for Windows installed
- git is using an SSH-based URL and the remote git repository allows the machine running the SplunkVersionControl application to remotely access the repository without a username/password prompt (i.e. SSH keys are in use)
- git will work from the user running the Splunk process over SSH; note that on Windows this will be the system account by default, on Linux the splunk user
- the git repository is dedicated to this particular backup, as the root / top level of the git repo will be used to create backups
r. Ismo
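As a quick sanity check before configuring the app (a sketch only; the repository URL is a placeholder and the user name assumes a Linux install running as splunk):
# run on the Splunk host, on behalf of the user that runs the Splunk process
sudo -u splunk git ls-remote git@your-git-server:your-org/splunk-backup.git
# this should list the remote refs without prompting for a username or password;
# if it prompts, the SSH key setup for that user is not complete yet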
When data for one of the months is not available for one of the names, I am seeing an empty space between the bars. Is there any way we can avoid that space?
Hi, usually you must restart Splunk for it to hash those passwords. r. Ismo
Hi Community, the sslPassword in the Search Head's $SPLUNK_HOME/etc/system/local/web.conf is not being hashed. Passwords in other .conf files, like server.conf & authentication.conf in $SPLUNK_HOME/etc/system/local/, are hashed. I recently changed the password in web.conf. Does anyone have any idea?
You need to be precise in the data description. I assume that the six characters starting with 999 are bounded by an underscore (_), the beginning of the string, or the end of the string. Something like the following would do:
| rex field=field "^([^_]+_)*(?<six_char>999.{3})(_[^_]+)*$"
Here is an emulation you can play with and compare with real data.
| makeresults
| fields - _time
| eval field=mvappend("blah_999ars_blah_blah", "blah_blah_999cha_blah", "9996ch_blah_blah_blah", "blah_blah_blah_999har")
| mvexpand field
``` data emulation above ```
Hi Pradeep, thank you for providing the curl command. I noticed that you've combined two different authorization methods in the same command, "Password-Based" and "Bearer Token." To proceed, please choose either "UsernamePassword" or "Bearer Token" for this command. Here are the revised commands; please try the one that matches your chosen method and inform me of the results.
curl --user <username>@account-name:<password> "https://<controller page>/controller/rest/applications"
curl -H "Authorization: Bearer <ACCESS TOKEN>" "https://<controller page>/controller/rest/applications"
Hi, as you have had some I/O errors on your /opt/cold, there is a possibility that some buckets are corrupted and cannot be used anymore. You should find out from the _internal log what caused that issue. Just search there for the buckets that you see in the MC's "SF & RF not met" view and in the fix-up tasks. After you have identified the reasons, you can decide how to proceed: maybe just remove the primary bucket and use your replicas, or something else, but this is totally dependent on the reason you found in _internal. What are your SF & RF, and do you have a single-site or multisite cluster? Basically it shouldn't need a data rebalance unless your bucket count has become totally unbalanced between indexers. You could see that e.g. via REST calls. r. Ismo
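As one way to check that distribution, a minimal sketch using dbinspect rather than raw REST (the index filter is just an example):
| dbinspect index=*
| stats count as buckets by splunk_server, index
| sort - buckets
``` counts how many buckets each indexer holds per index; big differences
between indexers would point to an unbalanced distribution ```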
Hello, were you able to solve this? If yes, please let us know.