All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Experts, I want to trigger an alert based on the scenario below: 1) Get license utilization in GB for yesterday and the day before yesterday. 2) Show the difference in GB and, if the difference increased by 40GB, trigger an alert. Something like the table below; I want to trigger the alert only for line 2, that is, for the database index:
index_name yesterday day_before_yesterday diff
application 20GB 10GB 10GB
database 30GB 70GB 40GB
security 40GB 20GB 20GB
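A rough sketch of one possible approach, assuming the standard license_usage.log fields in _internal (idx for the index name, b for bytes, type=Usage); the 40GB threshold comes from the question, and abs() is used so the comparison fires on a 40GB change in either direction, which may need adjusting:

index=_internal source=*license_usage.log type=Usage earliest=-2d@d latest=@d
| eval day=if(_time >= relative_time(now(), "-1d@d"), "yesterday", "day_before_yesterday")
| stats sum(b) as bytes by idx day
| eval GB=round(bytes/1024/1024/1024, 2)
| chart sum(GB) over idx by day
| eval diff=abs(yesterday - day_before_yesterday)
| where diff >= 40

Saved as an alert with a trigger condition of "number of results is greater than 0", this would fire only for indexes whose day-over-day change is at least 40GB.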
Hello, our test environment uses the production LM and we have never had any compatibility issue when upgrading the test nodes first: 6.2.3 > 6.5.2, 6.5.2 > 7.1.4, 7.1.4 > 7.3.4. We plan to upgrade 7.3.4 to 8.2.2; are there any possible issues? In fact, https://docs.splunk.com/Documentation/Splunk/8.2.2/Admin/Configurealicensemaster reads more like a best practice than a requirement. Thanks
Hi, I am trying to install the ITSI Module for Kafka Smart Monitoring app and its related Kafka Streaming Platform Logging Management TA, but I received notification from my support case that the app/add-on is incompatible for jQuery reasons. The TA came back with unspecified incompatibilities. Guilhem Marchand is the author of the work. Does anyone know if the TA/app combo is still being maintained? Thanks in advance, Alex
Of the servers LM, CM, SHC, or Deployment Server, which needs to be put into maintenance mode before upgrading to 8.2.2.1? Thanks a million for your help.
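For reference, a minimal sketch of the maintenance-mode CLI on the cluster master (the indexer cluster is the piece that normally gets maintenance mode before an upgrade); this is an illustration, not a full upgrade procedure, and as far as I know the other roles have no equivalent command:

splunk enable maintenance-mode
splunk show maintenance-mode
splunk disable maintenance-mode

All three are run on the CM itself.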
If I am trying to execute the following code block and my total record count is greater than 50K, it limits me to 50K. Is there a modified example of this, using an offset or pagination technique, that will allow me to iterate through the full result set beyond the 50K?

oneshotSearchArgs.put("earliest_time", yesterday+"T05:00:00.000" );
oneshotSearchArgs.put("latest_time", today+"T05:00:00.000" );
oneshotSearchArgs.put("count", 0);
String oneshotSearchQuery = "search " + searchString;
// The search results are returned directly
InputStream results_oneshot = service.oneshotSearch(oneshotSearchQuery, oneshotSearchArgs);
// Get the search results and use the built-in XML parser to display them
ResultsReaderXml resultsReader = new ResultsReaderXml(results_oneshot);
HashMap<String, String> event;
int eventsMatched = 0;
String log = null;
ArrayList<String> list = new ArrayList<>();
while ((event = resultsReader.getNextEvent()) != null) {
}
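A sketch of one possible pagination approach, assuming the standard Splunk Java SDK (com.splunk) classes; oneshot output is capped by the server-side maxresultrows limit (50,000 by default), so this switches to a normal blocking job and pages through it with offset and count. The page size and the _raw handling inside the loop are illustrative, and service, searchString, yesterday, and today are the same variables used above:

// Run the search as a normal blocking job instead of a oneshot search
JobArgs jobArgs = new JobArgs();
jobArgs.setExecutionMode(JobArgs.ExecutionMode.BLOCKING);
jobArgs.setEarliestTime(yesterday + "T05:00:00.000");
jobArgs.setLatestTime(today + "T05:00:00.000");
Job job = service.getJobs().create("search " + searchString, jobArgs);

int pageSize = 10000;              // results fetched per request
int total = job.getResultCount();  // total results held by the finished job
ArrayList<String> list = new ArrayList<>();

for (int offset = 0; offset < total; offset += pageSize) {
    // Ask the job for one page of results, then parse it with the XML reader
    JobResultsArgs resultsArgs = new JobResultsArgs();
    resultsArgs.setOffset(offset);
    resultsArgs.setCount(pageSize);
    InputStream page = job.getResults(resultsArgs);
    ResultsReaderXml resultsReader = new ResultsReaderXml(page);
    HashMap<String, String> event;
    while ((event = resultsReader.getNextEvent()) != null) {
        list.add(event.get("_raw")); // placeholder: collect whatever fields you need
    }
    resultsReader.close();
}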
hello, I need to display 0 in a single value panel if there are no results. I tried the two searches below but neither works; how can I do this please?

| stats avg(Response)
| eval Response=if(Response="0","0",Response)

| stats avg(Response)
| eval Response=if(Response="","0",Response)
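One common pattern, sketched here assuming the panel displays the Response field, is to append a zero row only when the stats output is empty:

| stats avg(Response) as Response
| appendpipe [ stats count | where count=0 | eval Response=0 | fields Response ]

If the base search returns results, the inner stats count is non-zero and the where clause drops the appended row; if it returns nothing, a single row with Response=0 is added, so the panel shows 0 instead of "No results found".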
I have my Splunk JSON in the format below:

{
  delete_me: True
  vendor: Dbruzy
  name: Rahul
  date: [ 10-jan-2022, 30-dec-2022 ]
  count_target: [ 1700, 300 ]
  site: India
  type: Sales
}

I am looking for a query to get output like this:

Vendor Name Date Count_Target Site Type
Dbruzy Rahul 10-jan-2022 1700 India Sales
Dbruzy Rahul 30-dec-2022 300 India Sales

But I am getting this instead:

Vendor Name Date Count_Target Site Type
Dbruzy Rahul 10-jan-2022 30-dec-2022 1700 300 India Sales
Dbruzy Rahul 10-jan-2022 30-dec-2022 1700 300 India Sales

Query I am using:

my index
| rename count_target{} as target
| rename Date{} as voltage
| spath input=voltage path=voltage output=someOtherField
| spath input=someOtherField
| foreach voltage* [ eval voltage=mvappend(voltage, '<<FIELD>>') ]
| spath input=target path=target output=someOtherField1
| spath input=someOtherField1
| foreach target* [ eval target=mvappend(target, '<<FIELD>>') ]
| mvexpand target
| mvexpand voltage
| stats values(voltage) as Date values(target) as Count_Target by Vendor, Name, Site, Type

Can you please help?
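A sketch of the usual pattern for keeping parallel arrays paired: mvzip the two multivalue fields into one, expand once, then split each pair back out. The field names ('date{}', 'count_target{}', vendor, and so on) follow the JSON above and may differ from what spath actually extracts in your events:

my index
| eval pair=mvzip('date{}', 'count_target{}', "|")
| mvexpand pair
| eval Date=mvindex(split(pair,"|"),0), Count_Target=mvindex(split(pair,"|"),1)
| table vendor name Date Count_Target site type

Expanding a single zipped field avoids the row multiplication that two separate mvexpand calls produce.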
Hi All, I have installed a machine agent on the server and the extension on Redis, gave it the application name, tier name, and node name, and it was working fine. Now I plan to install the Node.js agent for the Node.js application and have configured the controller information the same as what the controller provides, but it is not connecting. As per the documentation, the application name, node name, and tier name should be removed from the machine agent's controller-info.xml file. I restarted the application but it did not connect. I then stopped the machine agent, renamed its folder, and restarted the application, and it is still the same. Has anyone faced the same issue? Kindly help, thanks.
Newbie here...! I have a list of IPs in a CSV, and I need to exclude a few IPs (IP1, IP2, IP3, etc.) from the results of a query. Here's how my search looks.

Example search:
base search
| join type=left ip [| inputlookup iplist.csv | fields ip]
| fields

Any help would be appreciated. Thanks
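Two common ways to do the exclusion, sketched here assuming the event field and the lookup column are both called ip. A subsearch turns the lookup rows into a NOT filter:

base search NOT [ | inputlookup iplist.csv | fields ip ]

Or, lookup-based instead of join:

base search
| lookup iplist.csv ip OUTPUT ip as excluded_ip
| where isnull(excluded_ip)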
Hi, I am trying to change/control several multiselect dropdowns from one master multiselect dropdown's values/checks, so I am trying to use more than one value in the set-token tag. It is not working: if I give two values in the set token they get merged into a single selection, but a single value works fine. I tried multiple ways (double quotes, single quotes, etc.) but could not find a solution. Please see the example below and help me find a solution.

<set token="form.Filter1"> "new", "rejected" </set>

The output merges the values into one selection (screenshots omitted); the expectation is that each value is selected separately.

Example in the multiselect:

<input type="dropdown" token="MasterFilter_Token">
  <label>MasterFilter</label>
  ...
  <change>
    <condition>
      <set token="form.Filter1"> "new", "rejected", "closed" </set>
      ...
    </condition>
  </change>
  ...

Thanks in advance!!!
Hello Team! I have a problem I need to solve but couldn't find a way to do it. I have some servers with the Universal Forwarder installed, and Windows services are being monitored through it. Sometimes some of these services become unavailable and need to be restarted. I would like to know whether, as soon as Splunk identifies that one of these services is down, it can run a script on that local server to restart the service. That is, I need to know if there is any way to run a script that lives on the universal forwarder, triggered from the Splunk server. Thanks in advance!
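For the detection half, a sketch of an alert search over the Windows TA's service monitoring data; the index, sourcetype, and field names (WinHostMon, DisplayName, State) depend on how your inputs are configured and are assumptions here:

index=windows sourcetype=WinHostMon source=Service
| stats latest(State) as State by host DisplayName
| where State="Stopped" AND DisplayName="Your Service Name"

The restart itself would still need something on the server side, since alert scripts and alert actions run on the search head rather than on the forwarder.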
Has anyone tried to use the Splunk Attack Range with a preconfigured VPC in AWS? What I'm trying to do is have the attack range hosts use a Splunk server on our local enterprise network. The VPC that is already set up in AWS is configured with a Direct Connect back to our local network. Any tips or suggestions would be helpful. Thanks, Jon
Hi, I'm having trouble with a regex field extraction. I'm looking to extract the numeric ID after the "x-client-id" key: .........pp_code":["{IVR-US}. CPC"],"x-client-id":["1234567890"],"x-requested-with":["DA_ONLINE_IV............ This is how that field appears in the event string. This (client-id) is the only field I need in the entire string. The quotes are throwing off all of my normal regex formats. Any help would be super appreciated.
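A sketch of a rex extraction that matches the quotes and brackets literally; x_client_id is just an example output field name, and this assumes the value always appears in the quoted-array form shown above:

... | rex "\"x-client-id\":\[\"(?<x_client_id>\d+)\"\]"

The double quotes inside the regex are escaped with backslashes because the whole expression sits inside the rex command's own double quotes.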
Hello, we are trying to diagnose a parsing error from AWS Firehose to Splunk using HEC. The endpoint is configured properly but we are getting "no data" parsing errors. To try to debug this, I have switched DEBUG on for httpeventcollector on the heavy forwarder receiving the data; however, the introspection log is still only showing INFO. Am I setting debug in the wrong place, or has anyone else overcome this?
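In case it helps, a sketch of where HEC errors usually surface and how a splunkd log category can be raised to DEBUG on the heavy forwarder; HttpInputDataHandler is the usual channel for HEC data handling, but verify the category name against your instance's server logging page:

splunk set log-level HttpInputDataHandler -level DEBUG

index=_internal sourcetype=splunkd component=HttpInputDataHandler

The DEBUG output lands in splunkd.log (searchable in _internal), not in the introspection log.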
Hi all, I'm interested in bringing Snowflake query history into Splunk, and there are posts on how to do it with DB Connect; however, it seems like the app is only available for Splunk Enterprise, not Splunk Cloud. Is that correct? If so, is there any way to bring Snowflake data into Splunk Cloud? Thanks!
Hi - I have a few dashboards that use expressions like

eval var=ifnull(x,"true","false")

...which assigns "true" or "false" to var depending on x being NULL. Those dashboards still work, but I notice that ifnull() does not show up in any of the current documentation, and it seems the current way to get the same result would be

eval var=if(isnull(x),"true","false")

Did I miss some kind of deprecation of that syntax ages ago (must have been before 6.3.0), and it just happens to still be parsed?
Hi guys, does anyone have any advice on what would be a good search to run against local performance data? I am trying to create a dashboard that shows the performance of my local machine, and I am not sure what I could search for to put in the dashboard. If anyone has any advice on what I could search for, please let me know. Thank you
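As a starting point, a sketch over Splunk's own introspection data, which a full Splunk Enterprise instance collects about its host; the component and data.* field names come from the Hostwide resource-usage events, and the span is arbitrary:

index=_introspection sourcetype=splunk_resource_usage component=Hostwide
| timechart span=5m avg(data.cpu_system_pct) as cpu_system_pct avg(data.cpu_user_pct) as cpu_user_pct avg(data.mem_used) as mem_used

This gives CPU and memory panels without installing anything extra; OS-level add-ons (e.g. the *nix or Windows TAs) would be the next step for richer data.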
I am trying to remove duplicates in my result using the dedup command, but I am still seeing 2 entries in my result. Kindly help me remove the duplicate.
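Hard to say without seeing the search, but dedup only drops rows whose values in the deduplicated field match exactly; a sketch that normalizes case and whitespace first (your_field is a placeholder for the field being deduplicated):

... | eval dedup_key=lower(trim(your_field))
| dedup dedup_key
| fields - dedup_key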
Been trying to get the AWS app working and the EC2 dashboards are not populating. I have traced it down to what looks like every search being plain wrong. As an example:

`aws-description-sourcetype` $accountId$ $region$ source="*:$resource$"
| eventstats latest(_time) as latest_time
| eval latest_time=relative_time(latest_time,"-55m")
| where _time > latest_time
| dedup id sortby -start_time

The problem is at `dedup id sortby -start_time`: there is no "id" field in the data; there is, however, "InstanceId". It is a similar situation for every dashboard that is not populating, which leads me to believe there is a job somewhere that is not running, or I am missing something very fundamental. Any help would be greatly appreciated. Thanks!
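As a stopgap while tracking down the real cause, a sketch of adapting the panel search to the field that is actually present; this assumes InstanceId is the right identifier for these EC2 description events and that start_time exists as in the original search:

`aws-description-sourcetype` $accountId$ $region$ source="*:$resource$"
| eval id=coalesce(id, InstanceId)
| eventstats latest(_time) as latest_time
| eval latest_time=relative_time(latest_time,"-55m")
| where _time > latest_time
| dedup id sortby -start_time

If id is missing across the board, it is also worth checking whether the description input (rather than CloudWatch or CloudTrail) is actually running and writing to the expected sourcetype.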
Hi. I'm using the TA for Windows and everything is mostly working OK. But in some events I'm receiving values like:

ReadOperation %%8100

If I understand correctly, that's _not_ what the evt_resolve_ad_obj option should affect, right? That option only affects resolving (or not) SIDs to usernames/groups, and this is something completely different, right? What is it then? And can I force my UF to forward the same contents that I see in Event Log Viewer? In this case it's:

Read Operation: Enumerate Credentials

I understand that it's something the Event Log Viewer renders on its own, because in the detail view of the event it does indeed show %%8100 as ReadOperation, so it's apparently the program's interpretation of this data that says "Enumerate Credentials". So I suppose there would have to be some lookups to "humanize" the events, right?
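On the "humanizing" lookups: those %%NNNN values are message-parameter references that Event Viewer resolves from the event provider's message tables, and the forwarder sends the raw values as they appear in the event data. A sketch of the lookup approach, assuming you build the CSV yourself; the lookup name, columns, and sample row are hypothetical:

winevent_param_strings.csv:
code,message
%%8100,Enumerate Credentials

your Windows event search
| lookup winevent_param_strings.csv code AS ReadOperation OUTPUT message AS ReadOperation_resolved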