All Topics


Hello people, I am trying to install the Microsoft Teams TA, but I have some problems with the webhook URL. Where can I obtain that specific URL?
Is there an API that I could use to trigger a saved search that can collect data from an index into a summary index? 
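One possible direction (a sketch, with hypothetical index and field names): Splunk's REST API exposes saved searches under /services/saved/searches, and POSTing to saved/searches/{name}/dispatch runs one on demand. The saved search itself can write to a summary index by ending in the collect command, e.g.:

```
index=my_index earliest=-1h
| stats count BY host, sourcetype
| collect index=my_summary
```

The summary index name here is an assumption; it must already exist and be writable by the search's owner.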
I have the column below, where I'm pinging the URL in the column, but for a nicer view I only want to display the PC name, e.g. "03131bipc142w". Is this possible?
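A possible sketch, assuming the column is a field named url and the PC name is the first hostname label before any dot, slash, or port:

```
| rex field=url "^(?:https?://)?(?<pc_name>[^./:]+)"
| table pc_name
```

If the URLs have a different shape (e.g. a path prefix before the host), the pattern would need adjusting.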
Hi all, I have values like the below in my Splunk dashboard:

Email account     Anonymizer
sab@gmail.com     No
tr@gmail.com      Yes
rt@mail.com       No
sab@gmail.com     Yes
sab@gmail.com     Yes
sab@gmail.com     Yes

The email account column is displayed as a mail address list with an IP address and Anonymizer as Yes or No. We need to pull the unique email accounts, as displayed above, where Anonymizer = Yes in the past 24 hours. I need the SPL query for this.
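A starting sketch, assuming the data is in an index and the field names match the dashboard columns (both are assumptions here):

```
index=my_index earliest=-24h Anonymizer="Yes"
| dedup "Email account"
| table "Email account", Anonymizer
```

dedup keeps the first event per email account; if you only need the distinct addresses, `| stats count BY "Email account"` works equally well.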
Hi, I'm trying to use the Splunk Add-on for VMware with the DCN OVA. The installation was done according to the docs, but when trying to configure the DCN for data collection, I get the error below:

"You are attempting to configure the data collection features of this system, but this has not been enabled. to enable this system to be used as a data collection scheduler, add the "splunk_vmware_admin" role to the admin account."

The admin account has the correct permissions and roles assigned, but I'm still getting the error. NOTE: when I used the "OVA Metrics" image, the configuration process finished as expected, but I would like to use the one without metrics. Any ideas?
Does anyone know if it is possible to automatically hide the navigation menus on the new Splunk Dashboard Studio dashboards? On classic dashboards you could just use hideChrome="true". Or is it even possible to open a page in full-screen mode by default? We just want to display some dashboards on kiosks, but the menus look untidy.
Hello, I have this method:

function requestQuote(policyPeriod : PolicyPeriod, nextStep : String) {
  requestQuote(policyPeriod, nextStep, ValidationLevel.TC_QUOTABLE, RatingStyle.TC_DEFAULT)
}

I am trying to call a method on the policyPeriod parameter for data, and it is not working for me. In code it would look like this: policyPeriod.Submission.DisplayName, which would return a string. Submission looks like this; it is a method that returns an object:

public entity.Submission getSubmission() {
  return ((com.guidewire.pc.domain.policy.period.PolicyPeriodPublicMethods)__getDelegateManager().getImplementation("com.guidewire.pc.domain.policy.period.PolicyPeriodPublicMethods")).getSubmission();
}

I have tried invoking the object and using getter chains with getSubmission, getSubmission(), Submission, and Submission() combinations. I get output similar to this:

[CANNOT EVALUATE: Could not find specified method = [Submission()], CANNOT EVALUATE: Could not find specified method = [Submission()]]

Here is my last attempt. Any help would be appreciated. Thanks.
I added an iplocation lookup to my CIM data model. I found some odd handling when I validate the result by running | from datamodel: an intermediate search filter is applied in the resulting SPL:

search src_lon=* src_lat=* src_City=* src_Region=* src_Country=* dest_lon=* dest_lat=* dest_City=* dest_Region=* dest_Country=*

I have no idea why this is added; any of my data without a location mapped will be dropped. To reduce the impact, it seems I need to add an EVAL to check whether lon, lat, City, and Country were not produced by the iplocation lookup. E.g., | from datamodel expands to the following SPL:

(index=* OR index=_*) (((index=MY_INDEX)) tag=ids tag=attack) DIRECTIVES(READ_SUMMARY(datamodel="Intrusion_Detection.IDS_Attacks" summariesonly="false" allow_old_summaries="true"))
| eval dvc=if(isnull(dvc) OR dvc="","unknown",dvc),
  ids_type=if(isnull(ids_type) OR ids_type="","unknown",ids_type),
  category=if(isnull(category) OR category="","unknown",category),
  signature=if(isnull(signature) OR signature="","unknown",signature),
  severity=if(isnull(severity) OR severity="","unknown",severity),
  src=if(isnull(src) OR src="" OR src="N/A","unknown",src),
  dest=if(isnull(dest) OR dest="" OR dest="N/A","unknown",dest),
  user=if(isnull(user) OR user="","unknown",user),
  vendor_product=case(isnotnull(vendor_product),vendor_product,
    isnotnull(vendor) AND vendor!="unknown" AND isnotnull(product) AND product!="unknown",vendor." ".product,
    isnotnull(vendor) AND vendor!="unknown" AND (isnull(product) OR product="unknown"),vendor." unknown",
    (isnull(vendor) OR vendor="unknown") AND isnotnull(product) AND product!="unknown","unknown ".product,
    isnotnull(sourcetype),sourcetype,
    1=1,"unknown")
| iplocation src prefix="src_"
| iplocation dest prefix="dest_"
| eval src_Country=if(isnull(src_Country) OR src_Country="","unknown",src_Country),
  dest_Country=if(isnull(dest_Country) OR dest_Country="","unknown",dest_Country)
| search src_lon=* src_lat=* src_City=* src_Region=* src_Country=* dest_lon=* dest_lat=* dest_City=* dest_Region=* dest_Country=* sourcetype="MY_SOURCETYPE"
| eval is_Application_IDS_Attacks=if(searchmatch("(ids_type=\"application\")"),1,0),
  is_not_Application_IDS_Attacks=1-is_Application_IDS_Attacks,
  is_Host_IDS_Attacks=if(searchmatch("(ids_type=\"host\")"),1,0),
  is_not_Host_IDS_Attacks=1-is_Host_IDS_Attacks,
  is_Network_IDS_Attacks=if(searchmatch("(ids_type=\"network\")"),1,0),
  is_not_Network_IDS_Attacks=1-is_Network_IDS_Attacks
| fields "_time" "host" "source" "sourcetype" "action" "dest_bunit" "dest_category" "dest_port" "dest_priority" "dvc_bunit" "dvc_category" "dvc_priority" "file_hash" "file_name" "file_path" "src_bunit" "src_category" "src_priority" "transport" "tag" "user_bunit" "user_category" "user_priority" "soc_site" "vendor_action" "CVE" "dvc" "ids_type" "category" "signature" "severity" "src" "dest" "user" "vendor_product" "src_Country" "dest_Country" "is_Application_IDS_Attacks" "is_not_Application_IDS_Attacks" "is_Host_IDS_Attacks" "is_not_Host_IDS_Attacks" "is_Network_IDS_Attacks" "is_not_Network_IDS_Attacks"
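One possible workaround sketch (an assumption, not a confirmed fix): fill the generated location fields with defaults right after the lookup, so the field=* filters no longer drop events where iplocation returned nothing:

```
| iplocation src prefix="src_"
| eval src_lon=coalesce(src_lon,"0"),
  src_lat=coalesce(src_lat,"0"),
  src_City=coalesce(src_City,"unknown"),
  src_Region=coalesce(src_Region,"unknown"),
  src_Country=coalesce(src_Country,"unknown")
```

The same pattern would apply to the dest_* fields; note this makes the "0"/"unknown" placeholders visible downstream.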
Hello, I have created an add-on to collect data using Python code. Testing gives me the required data, but once I try to create an input in the add-on I get the error below:

Unexpected error "<class 'splunktaucclib.rest_handler.error.RestError'>" from python handler: "REST Error [400]: Bad Request -- HTTP 400 Bad Request -- {"messages":[{"type":"ERROR","text":"'NoneType' object has no attribute 'startswith'"}]}". See splunkd.log/python.log for more details.

Can anyone help with this?
I have set maxTotalDataSizeMB for the main index to 20 GB. But when I run a search on the main index on this specific indexer, it shows me more than 20 GB of data (I ran the search over the last 10 days). Can someone explain the theory behind this? My understanding is that it should only show 20 GB of data, and any older events would have been rolled to frozen, which is not searchable. But that's not what is happening. Is there something I am missing?
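One likely explanation (worth verifying for your setup): maxTotalDataSizeMB caps the on-disk size of the index's buckets, and raw event data is compressed on disk, so 20 GB of bucket storage can hold considerably more than 20 GB of searchable raw events. A sketch to compare the two views:

```
| dbinspect index=main
| stats sum(sizeOnDiskMB) AS total_disk_mb BY state
```

This reports the actual disk footprint per bucket state (hot/warm/cold), which is what the size cap applies to.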
Hi Team, I am using Splunk Enterprise. I want to map Splunk Enterprise logs to the SSE app for MITRE ATT&CK tactics and techniques. Is there any way to map these? @splunk @Anonymous

Thanks & Regards,
Vatsal Shah
I would like to create a dashboard with a dropdown input that dynamically affects the field application_methodName. The problem is that some method names contain accents, and they are not recognized when searching through this input. Here is the input query:

index=my_index timeseriesId=" appmethod.useractions" | dedup application_methodId | table application_methodName | sort application_methodName

And the query I'm trying to match with the dropdown input:

index=my_index application_methodName="$userAction_token$" timeseriesId="appmethod.useractions" | stats sum(value)

I could also use the application_methodId field as the dropdown token (which would avoid the problem), but then it wouldn't be user-friendly anymore. Any idea how to make the query recognize accents? Or a way to use the id as the token while still displaying the name in the dropdown?
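For the second option, classic Simple XML dropdowns can display one field while emitting another via fieldForLabel/fieldForValue. A sketch, assuming the same index and field names as above:

```
<input type="dropdown" token="userAction_token">
  <label>User action</label>
  <fieldForLabel>application_methodName</fieldForLabel>
  <fieldForValue>application_methodId</fieldForValue>
  <search>
    <query>index=my_index timeseriesId="appmethod.useractions" | dedup application_methodId | table application_methodId, application_methodName | sort application_methodName</query>
  </search>
</input>
```

The driving search must return both fields; the consuming query would then filter on application_methodId="$userAction_token$" instead of the name.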
Hi,

Query: index=main sourcetype="activedirectory"

I performed a search which showed only the last 14 days of data. Is there a way to get data older than 14 days? The user wants the last 1 year of data.

Regards,
Rahul
I was wondering if anyone has already come up with SPL to extract identities for the Enterprise Security identity and asset lookups. Could you please post the SPL if you have?
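A rough sketch of one approach, with hypothetical index, sourcetype, and attribute names (your directory data will differ): build the identity rows from indexed Active Directory events and write them to a lookup in the column layout the ES identity lookup expects (identity, first, last, email, ...):

```
index=my_ad_index sourcetype=ActiveDirectory
| stats latest(mail) AS email, latest(givenName) AS first, latest(sn) AS last BY sAMAccountName
| rename sAMAccountName AS identity
| table identity, first, last, email
| outputlookup my_identities.csv
```

The full ES identity format has more optional columns (nick, phone, bunit, category, etc.); this only sketches the core ones.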
Hi, I use the below curl command to disable an alert, which works fine:

curl -k -u admin:password https://<host>:<mgmt_port>/servicesNS/<user_context>/<app_context>/saved/searches/<search>/disable -X POST

But when I try to hide the username and password with the shell script below, I get an unauthorized exception:

SCRIPT_DIR=/tmp/
USER=$(cat $SCRIPT_DIR/.nonprodusr.txt)
PWD=$(cat $SCRIPT_DIR/.nonprod.txt)
curl -k -u $USER:$PWD https://<host>:<mgmt_port>/servicesNS/<user_context>/<app_context>/saved/searches/<search>/disable -X POST
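Two common pitfalls to check (both assumptions about this script, not confirmed causes): PWD is a shell-managed variable that tracks the current directory, so it is a risky name for a password; and while $(cat file) strips trailing newlines, it does not strip carriage returns (from Windows-edited files) or trailing spaces, which would corrupt the credential. A sketch that avoids both:

```
SCRIPT_DIR=/tmp
SPLUNK_USER=$(tr -d '[:space:]' < "$SCRIPT_DIR/.nonprodusr.txt")
SPLUNK_PASS=$(tr -d '[:space:]' < "$SCRIPT_DIR/.nonprod.txt")
curl -k -u "$SPLUNK_USER:$SPLUNK_PASS" "https://<host>:<mgmt_port>/servicesNS/<user_context>/<app_context>/saved/searches/<search>/disable" -X POST
```

Quoting the -u argument also protects against any remaining special characters in the password.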
Hi, I'm inserting an appendpipe into my SPL so that a stats table is still produced even when there are no results. However, I am seeing differences in the field values when they are not null. Can anyone explain why this occurs and how to fix it?
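For reference, the usual shape of this pattern (field names hypothetical). One thing to keep in mind: appendpipe's subpipeline runs over the existing results, and values created with eval are strings, while values produced by stats are numbers, which can make otherwise identical-looking fields compare or sort differently:

```
index=my_index
| stats count BY host
| appendpipe [ stats count | where count==0 | eval host="No results" ]
```

The subsearch only emits a row when the outer stats produced zero rows, so the placeholder appears exactly when the table would otherwise be empty.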
Hello guys, I hope you are doing well.

I need a regex that will let me extract a fixed, static pattern from a field called HEAD in a Splunk search. The HEAD field can start with any kind of words/numbers/strings, but it will always contain the pattern "***\|Hotel=YY-4857UU45547|" at some point: three asterisks, followed by "\|Hotel=", then a combination of letters and numbers, always ending with "|". There may be other words or numbers after that last "|". What I am trying to achieve is extracting only the part of the pattern we know to be consistent. To show you an example, this is one of the real values of that field:

| makeresults | eval HEAD=" 487542 For Flight Toronto AV TAX VIP client UBER_LIFT_ 78547 ***\|Hotel=YY-4857UU45547| aws not equip Need end seat 1U"

I would like a regex that extracts YY-4857UU45547 into a new field named RESERV_CODE. I have tried all day and all night. I will be so thankful to anyone who can help me out. Thank you so much!

Love, Cindy
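A sketch that anchors on the "Hotel=" marker and captures everything up to the closing pipe (this assumes "Hotel=" appears only inside the target pattern; if it can occur elsewhere in HEAD, the pattern would need the leading asterisks/backslash added, with SPL's extra layer of backslash escaping):

```
| makeresults
| eval HEAD=" 487542 For Flight Toronto AV TAX VIP client UBER_LIFT_ 78547 ***\|Hotel=YY-4857UU45547| aws not equip Need end seat 1U"
| rex field=HEAD "Hotel=(?<RESERV_CODE>[^|]+)"
| table RESERV_CODE
```

[^|]+ stops the capture at the first "|", which matches the stated "always ends with a pipe" shape.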
Hi, I'm running 5.04 of the add-on on a heavy forwarder with Splunk 8.1.3. Randomly, an input just stops ingesting. There is nothing in the logs, even with DEBUG on; logging in this app is poor (rant). Has anyone run into similar issues? Any tips or suggestions? Nothing is showing up in the splunkd DEBUG logs either.

Thanks,
Chris
Hello, I have a lookup called topsites with the below:

Name    Ip address
test1   10.10.10.10
test2   10.10.10.11
Test3   10.10.10.12
Test4   10.10.10.11
Test5   10.10.10.11

I am trying to update Test3's IP address with the below:

| inputlookup topsites.csv
| append [ | eval Name=Test3 | eval "Ip address"="9.9.9.9" ] [ | stats count by Name, "Ip address" ]
| outputlookup createinapp=true topsites.csv

However, it just adds another entry instead of replacing the IP address value. What am I missing?

Thanks
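append adds new rows rather than modifying existing ones. A sketch of an in-place update instead: rewrite the field conditionally while passing the lookup through, then write it back (this assumes the Name values match exactly, including case):

```
| inputlookup topsites.csv
| eval "Ip address"=if(Name=="Test3", "9.9.9.9", 'Ip address')
| outputlookup createinapp=true topsites.csv
```

Note the quoting: double quotes name the field on the left of eval, single quotes reference the existing value of a field whose name contains a space.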
Hello, I just want to know if there is a way to generate a report based on errors or exceptions happening in the live environment.

1) Is there any query I can run in analytics?
2) I need to know what is causing the errors and exceptions, in a minimal way.
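A minimal starting sketch, assuming a hypothetical application index and that errors are identifiable by the literal strings "ERROR" or "Exception" in the raw events:

```
index=my_app_index ("ERROR" OR "Exception") earliest=-24h
| stats count BY source, sourcetype
| sort - count
```

Grouping by source/sourcetype shows where the errors concentrate; swapping in a field like exception_class (if one is extracted) would point at the cause more directly. Saving this as a scheduled report turns it into a recurring summary.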