All Topics

I want to know which index the Microsoft Defender logs are stored in. I know some important fields that exist in Microsoft Defender, and now I want to check whether those fields are actually being stored or not.
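A common way to answer both halves of this question (a sketch — the sourcetype wildcard and field names below are assumptions, substitute the ones from your own Defender data):

```
| tstats count WHERE index=* sourcetype=*defender* BY index, sourcetype

index=<defender_index> sourcetype=<defender_sourcetype>
| fieldsummary
| search field IN ("DeviceName", "AlertId", "Category")
```

The first search lists which indexes and sourcetypes carry Defender-looking data; the second reports whether your fields of interest are present in that data and how often they are populated.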
Getting the following static errors in a Splunk SOAR PR review from the bot.

1. { "minimal_data_paths": { "description": "Checks to make sure each action includes the minimal required data paths", "message": "One or more actions are missing a required data path", "success": false, "verbose": [ "Minimal data paths: summary.total_objects_successful, action_result.status, action_result.message, summary.total_objects", " action one is missing one or more required data path", " action two is missing one or more required data path", " action three is missing one or more required data path" ] } }

I have provided all the data paths in the output array in the <App Name>.json file. Is there any other place where I have to provide the data paths?

2. { "repo_name_has_expected_app_id": { "description": "Validates that the app ID in the app repo's JSON file matches the recorded app ID for the app", "message": "Could not find an app id for <App Name>. Please add the app id for <App Name> to data/repo_name_to_appid.json", "success": false, "verbose": [ "Could not find an app id for <App Name>. Please add the app id for <App Name> to data/repo_name_to_appid.json" ] } }

How do we resolve this issue? Did I miss any file?
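For the second error, the bot's message points at a mapping file in the central repo (data/repo_name_to_appid.json), not at the app itself. A hedged sketch of what such an entry typically looks like — the repo name and appid here are placeholders; the real appid is the one declared in your app's <App Name>.json:

```
{
    "<app repo name>": "<appid value from your app's JSON>"
}
```

For the first error, the check's own description says each action must include the minimal data paths, so it is worth verifying that summary.total_objects_successful, action_result.status, action_result.message, and summary.total_objects appear in the output array of every one of the three flagged actions, not just once globally.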
Calculating metrics. I need to count the number of sensors that are created and monitored for each host; I have the index and sourcetype. I created about 7 different dashboards with multiple hosts on each dashboard, and I need a count of the number of sensors being monitored by each host.

index=idx_sensors sourcetype=sensorlog | stats count by host

The query above gives me all the hostnames being monitored, but the count returns the total number of events. I just need the number of sensors per host.
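If each sensor is identified by a field on the events (sensor_name below is an assumed field name — substitute your own), a distinct count per host is usually what is wanted here rather than an event count. A sketch:

```
index=idx_sensors sourcetype=sensorlog
| stats dc(sensor_name) AS sensor_count BY host
```

dc() counts each distinct sensor value once per host, no matter how many events that sensor logged.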
Hi everyone, I ran a query in Splunk using the dbxquery command and received the following error:

Error in 'script': Getinfo probe failed for external search command 'dbxquery'.

When I check Apps -> Manage Apps -> Splunk DB Connect, I see the version is 2.4.0. Please help me identify the cause and how to fix this error. Thank you!
[Register Here] This thread is for the Community Office Hours session on Security: Cisco Talos Integration on Wed, March 19, 2025 at 1pm PT / 4pm ET.

This is your opportunity to ask questions about Cisco Talos integration with Splunk Security. Our experts are ready to answer all your questions, such as:

- What can I ask in this AMA?
- How does Cisco Talos threat intelligence integrate with Splunk Security products?
- What kinds of intelligence are provided by the Cisco Talos integrations?
- How do I start using Cisco Talos intelligence in my Splunk Security products?
- Anything else you'd like to learn!

Please submit your questions at registration. You can also head to the #office-hours user Slack channel to ask questions (request access here).

Pre-submitted questions will be prioritized. After that, we will open the floor up to live Q&A with meeting participants.

Look forward to connecting!
[Register Here] This thread is for the Community Office Hours session on Security: Threat Investigation and Analysis on Wed, Feb 19, 2025 at 1pm PT / 4pm ET.

This is your opportunity to ask questions related to your specific Splunk threat investigation, analysis, and use cases, including:

- What are the best practices for enhancing threat investigation and analysis?
- How can I help my SOC save time on those missions?
- How do I maximize the use of threat intelligence and contextual insights when investigating active threats?
- Which Splunk tools should I leverage to help optimize and automate the process of analyzing and investigating threats?
- How can Splunk help me get through phishing investigations more efficiently?
- How does Splunk Attack Analyzer complement SOAR tools and capabilities? How is it integrated with other Splunk security solutions?
- Anything else you'd like to learn!

Please submit your questions at registration. You can also head to the #office-hours user Slack channel to ask questions (request access here).

Pre-submitted questions will be prioritized. After that, we will open the floor up to live Q&A with meeting participants.

Look forward to connecting!
Hi, can anyone please help me create a regex to extract 12 words (words with characters/letters only) from the beginning of the field? Sharing a few samples with the required output:

1) 00012243asdsfgh - No recommendations from System A. Message - ERROR: System A | No Matching Recommendations
Required output: No recommendations from System A. Message - ERROR: System A | No Matching Recommendations

2) 001b135c-5348-4arf-b3vbv344v - Validation Exception reason - Empty/Invalid Page_Placement Value ::: Input received - Channel1; ::: Other details - 001sss-445-4f45-b3ad-gsdfg34 - Incorrect page and placement found: Channel1;
Required output: Validation Exception reason - Empty/Invalid Page_Placement Value ::: Input received - Channel1;

3) 00assew-34df-34de-d34k-sf34546d :: Invalid requestTimestamp : 2025-01-21T21:36:21.224Z
Required output: Invalid requestTimestamp

4) 01hg34hgh44hghg4 - Exception while calling System A - null
Required output: Exception while calling System A - null
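One hedged approach, assuming the raw text lives in a field called message (an assumed name — substitute your own): strip the leading identifier token and its delimiter with rex and keep everything after it. The pattern below handles the "-" and "::" delimiters seen in the samples; sample 3, where the trailing timestamp must also be dropped, would need an additional rex or a tighter capture.

```
... | rex field=message "^\S+\s*(?:-|::)\s*(?<clean_msg>.+)"
```

`^\S+` consumes the leading ID (GUIDs contain no spaces, so internal hyphens are safe), and the named group clean_msg captures the remainder of the line.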
Hello, I have a question about the SH deployer and search heads. We have three search heads in a cluster; at some point the deployer connection got disconnected, and now I am trying to reconnect it. Let me know what needs to be done. Is it just that we need to match the password of all search heads with the deployer? Configurations I currently see:

On search heads (1/2/3): /opt/splunk/etc/system/local/server.conf
[shclustering]
conf_deploy_fetch_url = https://XXXXXX:8089
disabled = 0
mgmt_uri = https://XXXXXXX:8089
replication_factor = 2
shcluster_label = shcluster1
id = 1F81D83B
manual_detention = off

Deployer: /opt/splunk/etc/system/local/server.conf
[shclustering]
shcluster_label = shcluster1
pass4SymmKey = XXXXXXX

Thanks in advance for your help!
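For what it's worth, the usual requirement here (a sketch, not a verified fix for your environment) is that each cluster member carries the same pass4SymmKey under [shclustering] as the deployer, alongside the conf_deploy_fetch_url it already has. Something like this on each search head, followed by a restart:

```
# $SPLUNK_HOME/etc/system/local/server.conf on each search head member
[shclustering]
conf_deploy_fetch_url = https://<deployer-host>:8089
pass4SymmKey = <same plaintext secret as in the deployer's [shclustering] stanza>
```

Splunk encrypts pass4SymmKey on restart, so it generally has to be re-entered in plaintext on each instance rather than copied across in its encrypted form.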
We have a lookup that has all kinds of domain (DNS) information in it, with about 60 fields like create date, ASN, name server IP, and MX IP, many of which are usually populated. But there are several fields that have no data — 10 to 20 on any given search (I assume they are null) — and the empty fields are likely to vary on each search. In other words, some domains will have an MX record and some will not, but if they are in this lookup they will always have a create date. I am presenting this data on a domain-lookup dashboard using "| transpose", so you get a table with the field name and value. I would like to show only the fields that have returned data and filter out (not show) any field that is null. Is there a way to do this?
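One hedged way to do the filtering: with a single-row input (as a one-domain lookup result would be), transpose puts the values into a column named "row 1" by default, so nulls and empty strings can be dropped with a where clause afterwards. A sketch:

```
... | transpose
| where isnotnull('row 1') AND 'row 1' != ""
```

The single quotes are needed because "row 1" contains a space; if you rename the transposed columns, filter on your chosen name instead.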
Hello, if you have site-specific app configuration (for example, after configuring an app through the HF web GUI for a specific site), is it still recommended to use a deployment server? That requires syncing/copying the HF's app/local conf back to the deployment server's etc/deployment-apps/app/local to avoid any deletion when the deployment server reloads or updates the app from the DS. I guess using a DS is good for centralizing the same configurations across HFs?

https://docs.splunk.com/Documentation/Splunk/9.3.0/Updating/Createdeploymentapps
"The only way to allow an instance to continue managing its own copy of such an app is to disable the instance's deployment client functionality. If an instance is no longer a client of a deployment server, the deployment server will no longer manage its apps."

Thanks.
Stupid form editor adds extra CRs. I'm having trouble getting this search to work as desired. I've tried these two methods and can't get them to work:

eventtype="x" Name="x"
| fields Name, host
| dedup host
| stats count by host
| appendpipe [stats count | where count=0 | eval host="Specify your text here"]

and using the fillnull command. Here is my search:

index=idx1 host=host1 OR host=host2 source=*filename*.txt field1!=20250106 (field2="20005") OR (field2="20006") OR (field2="20007") OR (field2="666")
| stats count(field2) by field2, field3
| sort count(field2)

In this case the value field2="666" does not exist in the results. Here are the results I get:

field2   field3                   count(field2)
20005    This is field3 value 1   2
20006    This is field3 value 2   6
20007    This is field3 value 3   13

To summarize, I want to search for all the values of field2 and return the counts for each field2 value even if the value is not found in the search; count(field2) for field2=666 would then be 0, as follows:

field2   field3                   count(field2)
666      <empty string>           0
20005    This is field3 value 1   2
20006    This is field3 value 2   6
20007    This is field3 value 3   13

This is a simplified example. The actual use case is that I want to search one data set, return all the field2 values, and then search for those values in the first data set. The actual search I'm running looks like this:

index=idx1 host=host1 OR host=host2 source=*filename*.txt field1!=20250106
    [search index=idx1 host=host1 OR host=host2 source=*filename*.txt field1=20250106 | fields field2 | dedup field2 | return 1000 field2]
| stats count(field2) by field2, field3
| sort count(field2)

I want to find all the field2 values when field1=20250106 and then find the counts of those values in the field1!=20250106 events (even when some field2 values have count=0 in the results).
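One hedged way to force the zero-count rows: run the stats as usual, then append one zero-count placeholder row per expected field2 value and sum the two sets together. The hard-coded value list in the append below stands in for whatever your field1=20250106 subsearch returns; it is a sketch, not a drop-in replacement for the full search.

```
index=idx1 host=host1 OR host=host2 source=*filename*.txt field1!=20250106 field2 IN (20005, 20006, 20007, 666)
| stats count AS count BY field2, field3
| append
    [| makeresults
     | eval field2=split("20005,20006,20007,666", ",")
     | mvexpand field2
     | eval count=0]
| stats sum(count) AS count first(field3) AS field3 BY field2
| sort count
```

Values present in the data keep their real counts (the appended 0 adds nothing to the sum); values that exist only in the appended rows come out as count=0 with an empty field3.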
I need help with the Splunk query below:

index=XXX_XXX_XXX
| eval job_status=if( 'MSGTXT' = "*ABEND*","ko","ok")
| where job_status="ko"

If I change it to job_status="ok" it works, but not for the condition above. I'd appreciate any suggestions on this.

Regards
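The likely culprit: inside eval, = is a literal string comparison, so "*ABEND*" is compared as a literal string containing asterisks rather than as a wildcard. Use like() (with % wildcards) or match() (regex) for a substring test. A sketch:

```
index=XXX_XXX_XXX
| eval job_status=if(like('MSGTXT', "%ABEND%"), "ko", "ok")
| where job_status="ko"
```

With the literal comparison, no MSGTXT value ever equals "*ABEND*", so every event gets job_status="ok" — which is why filtering on "ok" appears to work while "ko" returns nothing.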
Receiving a "Search auto-canceled" error while executing a one-month episode review. Please let us know if there is any quick solution.
Hello Splunkers, after I upgraded to version 9.4 the KV store does not start. I generated a new certificate by renaming server.pem and restarting Splunk, and now I see the following errors in mongod.log:

[conn937] SSL peer certificate validation failed: self signed certificate in certificate chain
NETWORK [conn937] Error receiving request from client: SSLHandshakeFailed: SSL peer certificate validation failed: self signed certificate in certificate chain. Ending connection from 127.0.0.1:38268 (connection id: 937)

Does anyone have any idea what could be missing? I appreciate your input in this regard.

Thank you,
Moh
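If the new server.pem is self-signed (which the mongod message suggests), one avenue to explore — a sketch, not a confirmed fix for 9.4 — is the certificate-validation settings in server.conf: either point sslRootCAPath at the CA chain that actually signed the KV store certificate, or, where reduced security is acceptable (e.g. a lab instance), relax server-certificate verification:

```
# $SPLUNK_HOME/etc/system/local/server.conf -- assumes the failure is
# purely the self-signed chain, not a missing or unreadable server.pem
[sslConfig]
sslRootCAPath = $SPLUNK_HOME/etc/auth/cacert.pem
# or, only where reduced security is acceptable:
# sslVerifyServerCert = false
```

Re-checking splunkd.log and mongod.log after a restart should confirm whether the handshake error changes.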
Access to the Splunk Observability Kubernetes "Classic Navigator" UI will no longer be available starting January 30, 2025.

A new version of the Splunk Observability Kubernetes Navigator was introduced last year with this announcement, essentially replacing the "Classic Navigator". Since then we have released multiple iterations of the Navigators to improve the overall user experience. Although Classic Navigator was deprecated per the documentation last year, it was still accessible until now.

What that means for you, the customer:
- There are no major changes to the experience for users other than not being able to access the "Classic Navigator" user interface
- There is no action required of admins or users of the product
- Classic Navigator URLs will get redirected to the new Navigators automatically

Thank you for being a customer of Splunk Observability Kubernetes Navigator. For any questions or clarification, please reach out via the Support Portal.
I have a lookup table with a bunch of IP addresses (ipaddress.csv) and a blank column called hostname. I would like to search in Splunk to find what hostname each IP address has. I can find the hostnames in index=fs sourcetype=inventory. I'm just having a hard time with the query logic of using the lookup table IPs to output a table in Splunk with their corresponding hostnames. Ideally I want to output the results in a table in Splunk and add the hostnames back to the lookup table in the hostname column. Any help would be greatly appreciated! Thank you
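One hedged pattern for this (assuming the CSV column is named ip and the inventory events carry fields named ip and hostname — adjust to your actual field names): use the lookup as a subsearch filter, then write the enriched rows back with outputlookup.

```
index=fs sourcetype=inventory
    [| inputlookup ipaddress.csv | fields ip ]
| stats latest(hostname) AS hostname BY ip
| outputlookup ipaddress.csv
```

The subsearch turns the CSV's IPs into an (ip=... OR ip=...) filter, stats keeps one hostname per IP, and outputlookup overwrites the CSV with the ip/hostname pairs. Note the overwrite drops IPs that never appear in the inventory data, so check the row count before relying on it.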
Hello, I'm looking for some guidance on "Auto Retry" for HTTP / Browser Tests.

- I have a scheduled test running every 30 mins (9:00, 9:30, ...)
- When the test fails at 9:00, I see the result updated as "Failed (Auto Retry)"
- The next run occurs only at 9:30 a.m.

Is this expected? (Does Auto Retry kick in only at the scheduled run time?)
At the moment, our tiny indexer has very little disk space, and _introspection consumes roughly GB of storage a day. Is there a way to minimize the space consumed by the index besides making the retention very short?
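Retention aside, the size levers for an individual index live in indexes.conf. A sketch for capping _introspection specifically (the numbers are placeholders, not recommendations):

```
# indexes.conf on the indexer -- hypothetical caps for _introspection
[_introspection]
maxTotalDataSizeMB = 1024          # hard cap: oldest buckets roll off past 1 GB
frozenTimePeriodInSecs = 604800    # and/or freeze anything older than 7 days
```

Whichever limit is hit first wins. Reducing what gets collected in the first place (the introspection inputs on the instance) is the other lever, but that trades away the platform-health data the index exists for.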
Is it possible to connect Azure storage to a Splunk cloud instance? Our client wants to store data from their Splunk cloud instance in azure to eliminate their Splunk cloud storage overage 
I am trying to install Splunk_TA_nix on my UFs. I am in an air-gapped area, so I can't copy the errors and paste them here. I followed the steps below:

cd $SPLUNK_HOME/etc/apps/
tar xzvf $TMP/Splunk_TA_nix-4.7.0-156739.tgz
mkdir $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local
cp $SPLUNK_HOME/etc/apps/Splunk_TA_nix/default/inputs.conf $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/.
vi $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/inputs.conf
chown -R splunkfwd:splunkfwd $SPLUNK_HOME/etc/apps/Splunk_TA_nix

and then restarted Splunk. I was able to get it working on 2 machines, but on the next couple of machines I am seeing:

-0500 ERROR Configwatcher [32904 SplunkConfigChangeWatcherThread] - File =/opt/splunkforwarder/var/run/splunk/confsnapshot/baseline_default/apps/splunk_TA_nix/default/app.conf not available in baseline directory
-0500 ERROR Configwatcher [32904 SplunkConfigChangeWatcherThread] - Unable to log the changes for path=/opt/splunkforwarder/etc/apps/Splunk_TA_nix/default/app.conf

There are similar errors for other file names as well, like ._tags.conf and eventtypes.conf. It seems like a permission issue, but I have compared the permissions on the add-on folder, and all files/dirs seem to be just like the other UFs where the same add-on is working.

Any help would be appreciated.