All Topics

I'm using Splunk ITSI and viewing its Episode Review. When an episode is opened, the episode list is compressed into the left pane and the opened episode is displayed on the right. When this happens, a count is added on the left side of the episode list pane showing the number of notable events in each episode, unless that number exceeds 99, in which case it shows "100+". When no episode is open, the count is only displayed if that field is among those selected for display, but again it shows "100+" once the value exceeds 99. If the count field is selected for display AND an episode is open, then the compressed episode list in the left pane shows the count on its left side as before, the selected count field is still displayed as well, and that selected count field NOW displays the actual values even when they exceed 99. How can I get the actual values above 99 to display when no episode is open?
Is there a way to get the actual link to the alert when using the ServiceNow Incident Integration add-on, as you would get with the normal Send email action? I'm thinking it's a Custom fields setting, but I'm not sure. https://docs.splunk.com/Documentation/AddOns/released/ServiceNow/Usecustomsearchcommands See screenshots.
I need to add a file to a lookup list / table. Can you please share how this is done?
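One possible approach, assuming the file is a CSV: upload it under Settings > Lookups > Lookup table files, then verify it, and if needed append its rows to an existing lookup, with a search along these lines (my_file.csv and existing_lookup.csv are placeholder names):
| inputlookup my_file.csv
| outputlookup append=true existing_lookup.csv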
We would like to be alerted when an alert has been changed. We use:
| rest /servicesNS/-/-/saved/searches
This call brings back the owner but not the ID of whoever made the most recent modification. Is there any way to get the modifier's ID?
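A partial sketch of what might work: the REST endpoint exposes an 'updated' timestamp, and the internal UI access log records which user issued POSTs against the saved/searches endpoints, so the two can be correlated (the fields extracted from splunkd_ui_access, such as user and uri, are what I see by default and may differ in your environment):
| rest /servicesNS/-/-/saved/searches
| table title, eai:acl.owner, updated
index=_internal sourcetype=splunkd_ui_access method=POST "saved/searches"
| table _time, user, uri, status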
We're trying to gather a list of servers, both Linux and Windows, that are missing specific software packages. It's easy enough to get the list of servers that have the software installed:
search software IN ("CrowdStrike")
I was hoping I could negate the search on the software package, like
search NOT software IN ("CrowdStrike")
but that still displays hosts with CrowdStrike installed; it just omits the particular event showing that CrowdStrike is indeed installed. I thought of making an eval
| eval cs_win_installed=if(match(software, "CrowdStrike"),1,0)
and then searching for 0 or 1 depending on what I care about, but can I do that with all the software I'm searching on? Running that eval for multiple pieces of software
| eval cs_lin_is_installed=if(match(software, "falcon-sensor"),1,0)
| eval cs_win_is_installed=if(match(software, "CrowdStrike Windows Sensor"),1,0)
| eval q_is_installed=if(match(software, "Qualys*"),1,0)
| eval f_is_installed=if(match(software, "SecureConnector*"),1,0)
only returns the event showing that one piece of software is on the machine. Am I overthinking this? How should I go about displaying hosts with missing software? Thanks much.
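One possible approach, assuming each event carries a host field and a software field, is to roll the events up per host first and then test the combined list (the field and package names are taken from the question; this is a sketch, not a tested search):
<base search>
| stats values(software) as software by host
| eval all_software=mvjoin(software, ";")
| eval cs_installed=if(match(all_software, "CrowdStrike|falcon-sensor"), 1, 0)
| eval q_installed=if(match(all_software, "Qualys"), 1, 0)
| eval f_installed=if(match(all_software, "SecureConnector"), 1, 0)
| where cs_installed=0 OR q_installed=0 OR f_installed=0
Because the flags are evaluated after stats, each host is judged on everything reported for it rather than on a single event.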
I want to get a predicted value from the data statistics. Is it possible to output a predicted value for each pattern (No.1, No.2, No.3) from data like the following?
------
no,time,pattern1,pattern2,pattern3,pattern4,pattern5,pattern6
1,2021/8/1,3,17,20,25,26,29
2,2021/8/2,11,12,21,30,28,11
...
------
Are there any good methods for that?
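A minimal sketch, assuming the data above sits in a lookup named patterns.csv and a daily forecast is wanted (the predict command needs a _time-based series, so the time column is converted first; extend the avg() list for the remaining patterns):
| inputlookup patterns.csv
| eval _time=strptime(time, "%Y/%m/%d")
| timechart span=1d avg(pattern1) as pattern1 avg(pattern2) as pattern2 avg(pattern3) as pattern3
| predict pattern1 pattern2 pattern3 future_timespan=7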
New to Splunk and experimenting with a couple of functionalities, especially data aggregation. With the sample file app_usage.csv, I was trying to get a percentile of Webmail using
| inputlookup app_usage.csv | stats perc(Webmail, 10.0)
but it returns the error
Percentile must be a floating point number that is >= 0 and < 100.
Not sure what to do. I also tried to cast Webmail to float, which failed:
| inputlookup app_usage.csv | eval Webmail=cast(Webmail, 'float')
with the error
Error in 'eval' command: The 'cast' function is unsupported or undefined.
cast should be available in the eval command, right? Based on the documentation.
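A sketch of what might work, assuming Webmail is a numeric column in app_usage.csv: at least in the versions I've used, the percentile is written into the stats function name (perc10, perc95, and so on) rather than passed as a second argument, and tonumber() is the eval function for converting strings to numbers (cast() does not exist in eval):
| inputlookup app_usage.csv
| eval Webmail=tonumber(Webmail)
| stats perc10(Webmail) as webmail_p10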
Hi Splunkers, I have a query where I want to filter out all the legitimate processes by process path, where I've identified that the path is legit. Basically this query is customised from ESCU, and all the elements are set up to match the existing ESCU query exactly. What I expect is that the results displayed will exclude anything in the whitelist lookups that the query calls. Fields: process, process_path
| tstats `security_content_summariesonly` count values(Processes.dest) as dest values(Processes.user) as user min(_time) as firstTime max(_time) as lastTime from datamodel=Endpoint.Processes by Processes.process_name, Processes.parent_process_path
| rename Processes.process_name as process, Processes.parent_process_path as process_path
| rex field=user "(?<user_domain>.*)\\\\(?<user_name>.*)"
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| search [| tstats count from datamodel=Endpoint.Processes by Processes.process_name, Processes.parent_process_path
| rare Processes.process_name limit=30
| rename Processes.process_name as process, Processes.parent_process_path as process_path
| lookup update=true lookup_rare_process_allow_list_default2 process, process_path OUTPUTNEW allow_list
| where allow_list="false"
| lookup update=true lookup_rare_process_allow_list_local2 process, process_path OUTPUT allow_list
| where allow_list="false"
| table process process_path ]
| `detect_rare_executables_filter`
As you can see in the query above, the second tstats (the subsearch) contains two lookups: the first lookup definition (lookup_rare_process_allow_list_default2) whitelists well-known existing processes (e.g. Splunk processes), and the second lookup definition (lookup_rare_process_allow_list_local2) is the full list of whitelisted processes.
The query above runs fine if I change both lookup lines to the following:
| lookup update=true lookup_rare_process_allow_list_default2 process OUTPUTNEW allow_list
| where allow_list="false"
| lookup update=true lookup_rare_process_allow_list_local2 process OUTPUT allow_list
| where allow_list="false"
But what I want is to match not only on field=process but also on field=process_path. I've read the lookup documentation and other community posts, and it seems there should be no issue. The first query shows no error when run; the results are just empty, and I think some string is not being passed, so nothing is displayed. I'd really be glad if someone could help me with this. Thanks!
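One way to narrow this down, keeping all the names from the query above, might be to run the subsearch on its own and look at what allow_list actually comes back as when matching on both fields, since an unmatched row returns null rather than "false":
| tstats count from datamodel=Endpoint.Processes by Processes.process_name, Processes.parent_process_path
| rename Processes.process_name as process, Processes.parent_process_path as process_path
| lookup update=true lookup_rare_process_allow_list_default2 process, process_path OUTPUTNEW allow_list
| eval allow_list=coalesce(allow_list, "no_match")
| stats count by allow_list
If almost everything comes back as no_match, the lookup's process_path values probably don't match the parent_process_path strings exactly (case or backslash differences are a common cause).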
How can I split a field into many other fields, not using a delimiter but using position ranges instead? For example: bignumber = 16563764. I need to split it into: account id = positions [0 to 3] of field "bignumber"; company code = positions [4 to 6] of field "bignumber"; operation code = position [7] of field "bignumber". Thanks!!
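A minimal sketch using substr(), which is 1-indexed in Splunk eval, so positions 0-3, 4-6 and 7 become start positions 1, 5 and 8 with lengths 4, 3 and 1:
| eval account_id=substr(bignumber, 1, 4)
| eval company_code=substr(bignumber, 5, 3)
| eval operation_code=substr(bignumber, 8, 1)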
Hi All, hope you guys are doing fine. I have a few doubts relating to field comparison. Please find the sample data below (Field1 / Field2):
TRAP_BGP / BGP BACKWARD TRANSITION
TRAP_BFD / CISCO BFD SESS DOWN
Interface GigabitEthernet0/0 / BGP BACKWARD TRANSITION
TRAP_LINK / LINK UP/DOWN TRAPS RECEIVED IN THE LAST 5 MINUTES EXCEEDS THRESHOLD
I need to check whether the value of Field1 is contained in Field2 (a partial match). From the example above, for TRAP_BGP and BGP BACKWARD TRANSITION, the word BGP is common to both; if a word is common, the result should be "YES". This is sample data; we have lots of data in this format (the data is dynamic, not static). Can someone please help with the SPL query? I have tried match and LIKE, but they don't seem to work.
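A possible sketch, assuming the words in Field1 are separated by underscores and that mvmap is available (Splunk 8.0 or later); the field names come from the sample above:
| eval tokens=split(Field1, "_")
| eval hits=mvmap(tokens, if(like(upper(Field2), "%" . upper(tokens) . "%"), tokens, null()))
| eval result=if(isnotnull(hits), "YES", "NO")
Note that a substring test like this will also treat TRAP as matching TRAPS, which happens to give the right answer for the samples shown but may need tightening for other data.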
I am unable to send alerts via email (Outlook). Can anyone help me with that? I performed all the steps, like entering the SMTP host and so on, but I really don't know what the issue is. The alerts are triggering, but the email action is not working. Here are my email settings:
Mail host: smtp.office365.com:578
Enable TLS
Username: vinod@mail.com
Password
Allowed domains: left empty
I configured an alert and it works fine; it shows up in triggered alerts. Any help would be appreciated. Thank you.
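Two things that might help narrow it down: Office 365 SMTP submission normally uses port 587 with TLS, so the 578 above is worth double-checking, and errors from the email action usually land in the internal Python log, which a sketch like this should surface (the source path is from memory and may vary by version):
index=_internal source=*python.log* sendemail (ERROR OR WARNING)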
I have a CSV file that I am using for a lookup, which has multiple values in a particular field. I am trying to do a lookup that matches any one of the field's values. Example lookup table file:
room,color
livingroom,purple|green|yellow
(the pipe symbol delimits the different values in the color field)
Then my search:
<base search> | lookup paint_colors room OUTPUTNEW color | search color=purple | fields room,color | stats list by room
My desired result would be to see livingroom in the results. Is it possible to search for any one value in a field with multiple values? Thanks in advance!
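One possible sketch, keeping the names from the question: split the looked-up value on the pipe after the lookup so that color becomes a real multivalue field; the subsequent search then matches if any of the values is purple:
<base search>
| lookup paint_colors room OUTPUTNEW color
| eval color=split(color, "|")
| search color=purple
| stats list(color) by room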
Hi Splunkers, I have some HFs configured to send data over SSL to one indexer. As I am about to configure a second indexer, I was wondering if it is possible to load-balance data from the HFs to:
IDX1 over SSL
IDX2 without SSL
and have outputs.conf configured such as:
[tcpout]
defaultGroup = default-autolb-group
[tcpout:default-autolb-group]
server = idx1:1234,idx2:5678
where 1234 is the SSL port and 5678 the standard one, without SSL.
On the indexer side, we would have in inputs.conf:
IDX1
[splunktcp-ssl://1234]
connection_host = dns
IDX2
[splunktcp://5678]
connection_host = dns
Do you think this could work? Thanks!
How can I get a quotation for Splunk Enterprise for an enterprise in the D.R.C?
We aren't supposed to see the same results from both sites. For a given event, we should only see it coming from one site (whichever holds the searchable copy). It almost appears that Splunk is returning a result from each site. What might be the issue here, and how can we resolve it?
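A quick way to confirm whether identical events really are coming back from more than one peer might be a sketch like this over a narrow time range (splunk_server is the internal field naming the indexer that returned each event):
<your search>
| stats count dc(splunk_server) as peers values(splunk_server) as servers by _time, _raw
| where count > 1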
Hello, I need some help on where to look in order to diagnose the issue I am facing. I am using v8.0.9 in a multisite search head cluster and indexer cluster. After more than 30 days of normal operation, the search heads are no longer parsing Bluecoat logs. When I try the same search from the cluster master, the parsing is done properly, but not from any of the search heads. No change has been made to the cluster, yet the parsing suddenly stopped working. Any ideas on where to focus my troubleshooting?
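Since the behaviour differs between the cluster master and the search heads, one place to start might be comparing the search-time extractions each box actually has for the Bluecoat sourcetype. A REST sketch like the following, run with splunk_server=local on a search head and then on the cluster master, should show whether the app carrying the Bluecoat props is missing or out of date on the search heads (the endpoint and field names here are from memory, and the *bluecoat* filter assumes the sourcetype contains that string):
| rest /services/data/props/extractions splunk_server=local
| search stanza=*bluecoat*
| table stanza, attribute, value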
I am trying to make an app (using Python) in which a user selects key field details that have to be saved into a settings file (JSON or .conf), but currently when it writes, it only saves to the search head the user is currently on; nothing is replicated across them all.
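One replication-friendly sketch, assuming the settings can live in a KV store collection instead of a flat file (KV store collections are replicated across a search head cluster): define a collection plus a lookup definition for it (my_app_settings below is a placeholder name), and the app, or a search like this, can then write to it:
| makeresults
| eval setting_name="api_endpoint", setting_value="https://example.internal"
| table setting_name, setting_value
| outputlookup my_app_settings
Writing through the lookup (or via the storage/collections/data REST endpoint) keeps all search heads in sync, whereas a file written directly to disk stays local to one member.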
Hi Team, I need to add my company email address to my Splunk profile. I want to update it with my professional email ID so that the new certifications I take through my company ID can be reflected in my own profile along with my existing certifications.
Hi community, I created an add-on with Add-on Builder 3.0.1. In order to maintain this add-on, Splunk asked me to export the project and import it into the new version, Add-on Builder 4.0. I tried to export the project from the old Add-on Builder, but I cannot see my add-on there; I can only see it under the "Other apps and add-ons" section. Any idea what the problem is, and how I can open it from the Add-on Builder page to avoid recreating the add-on from scratch with the new version of Add-on Builder? Thanks
Hi fellas! I just wanted to ask if it would be possible for a Splunk UF to monitor logs that are not accessible to its underlying user. For example, I am running my Splunk UF instance as the splunk user and I am trying to capture data from files under the directory /var/logs/appservicename/*.log, which is owned by the root user. Given that I have the correct configuration in inputs.conf and outputs.conf, will the data be transmitted to my indexer instance?
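A sketch of a check that might confirm whether the forwarder can actually read the files: permission problems normally show up in the forwarder's own splunkd.log, which is forwarded to the _internal index (the host name below is a placeholder):
index=_internal sourcetype=splunkd host=<your_uf_host> ("Permission denied" OR "Insufficient permissions")
If events like these appear, the usual options are granting the splunk user read access (for example via group membership or file ACLs) or running the forwarder as root.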