All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, how can we extract a list of open episodes in Splunk ITSI? Thanks!
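A possible starting point, with loud caveats: ITSI writes grouped notable events to the itsi_grouped_alerts index, but the field names and the status convention below (integer status, with 5 meaning closed) are assumptions that should be checked against your ITSI version before use.

```
index=itsi_grouped_alerts
| stats latest(status) as status latest(itsi_group_title) as title by itsi_group_id
| where status != 5
```

Depending on the ITSI release, episode status may instead live in the KV store rather than on the events themselves.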
Hello, I have a question about pipeline parallelization. From the docs and other sources I gather that it is safe to enable pipeline parallelization if I have plenty of free resources in my Splunk deployment, particularly CPU cores; in other words, if the CPU on the indexers or heavy forwarders is "underutilized". But my question is: what does "underutilized" mean in numbers, especially in a distributed environment? Example: let's imagine I have an indexer cluster of 8 nodes with 16 CPU cores each. In the Monitoring Console (historical charts) I see an average CPU load of 40%, a median CPU load of 40%, and a maximum CPU load between 70 and 100%. My feeling is that it is not safe to enable parallelization in this environment, correct? But when is it safe: when the maximum load is under 50%? Under 25%? What factors should I take into account, and what numbers are "safe"? Could you please share your experience or point me to an available guide? Thank you very much in advance. Best regards, Lukas Mecir
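For reference, the setting itself is tiny; the sizing judgment is the hard part. A sketch of enabling two pipeline sets on an indexer (the value 2 is illustrative, not a recommendation). Each additional pipeline set consumes roughly the same CPU, memory, and disk I/O as the first, which is why the headroom question matters so much.

```
# $SPLUNK_HOME/etc/system/local/server.conf
[general]
parallelIngestionPipelines = 2
```

Restart splunkd after changing it, and watch the Monitoring Console afterwards to confirm the extra pipeline set is not pushing peak CPU into saturation.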
In one of our dashboards we have a table with a custom action. When the user clicks on a field, we check whether it is the delete field and, if so, get the name of the item we want to delete and store it in a JavaScript variable. We also have a search that needs to use this variable, something like the following, where someVariable is updated in a function:

var someVariable = "";
var validateChannelCanBeDeletedSearch = new SearchManager({
  id: "validate something",
  autostart: false,
  search: `| inputlookup some | search some_field="${someVariable}"`
});

Later we trigger the search manually. The problem is that the updated value of someVariable is not used in the query. How can we make it use the updated value?
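A plausible explanation, illustrated below in plain JavaScript: the template literal in the SearchManager options is evaluated once, at construction time, so later assignments to someVariable never reach the query string. Building the query at trigger time avoids this (the buildQuery helper name is made up for the sketch); in SplunkJS you would then push the fresh string with manager.settings.set("search", buildQuery()) before calling startSearch().

```javascript
var someVariable = "";

// Evaluated immediately: captures the current (empty) value of someVariable.
var staleQuery = `| inputlookup some | search some_field="${someVariable}"`;

// Evaluated on each call: picks up whatever someVariable holds right now.
function buildQuery() {
  return `| inputlookup some | search some_field="${someVariable}"`;
}

someVariable = "channel42";

console.log(staleQuery);    // still contains some_field=""
console.log(buildQuery());  // contains some_field="channel42"
```

The same pitfall applies to any string built with concatenation at setup time; only a function (or a token binding) is re-evaluated when the search is actually dispatched.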
Hey there! I'm trying to write some code that interacts with the Splunk REST API. I'm using Splunk Free edition, version 8.2.3.3. Unfortunately I cannot get any response from port 8089:   ``` $ curl https://localhost:8089/services/search/jobs/ curl: (28) Operation timed out after 300523 milliseconds with 0 out of 0 bytes received ```   The URI does not matter; I get no reaction whatsoever. Is this a known limitation, or do I need to configure something? Thanks a lot for any suggestions!
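A timeout (rather than "connection refused" or an authentication error) usually points at the network path, not the license. A couple of checks worth running on the Splunk host itself; the commands are a sketch, adjust for your OS:

```
# Is splunkd actually listening on the management port?
sudo ss -tlnp | grep 8089

# Does the call work locally? (-k skips verification of the self-signed cert)
curl -k https://localhost:8089/services/server/info
```

If the local call succeeds but remote calls time out, a firewall in between is the usual culprit; also check mgmtHostPort in web.conf, which can bind the management port to 127.0.0.1 only. Note that https (not http) is required by default on 8089.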
Hi, I have a query requirement: we need to submit a monthly node-downtime duration report, showing for each month how much time each node was down and how much time it was up. Please help me with the query. Here is a sample log (100 is up, 200 is down):

08/29/2022 10:05:00 +0000,host="0.0.1.1:NodeUp",alert_value="100"
08/29/2022 10:05:00 +0000,host="0.1.1.1:NodeUp",alert_value="100"
08/29/2022 10:00:00 +0000,host="0.0.1.1:NodeDown",alert_value="200"
08/23/2022 10:10:00 +0000,host="0.0.1.1:NodeUp",alert_value="100"
08/23/2022 09:55:00 +0000,host="0.0.1.1:NodeDown",alert_value="200"

Example: if a node was down for 30 minutes in total during a month, across different dates, we still need to display the hostname along with the downtime (i.e. 30 min) and the remaining uptime duration in one row. Note: our saved search runs every 5 minutes and emits log data like the above, so the timestamps are at 5-minute intervals.
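The core of the calculation is pairing each down event with the next up event for the same host and summing the gaps. A small Python sketch of that pairing logic, using the sample events above (timezone offset dropped for brevity); in SPL the same pairing is typically done with streamstats or transaction over the host field:

```python
from datetime import datetime

# (timestamp, host, alert_value) taken from the sample log; 100 = up, 200 = down
events = [
    ("08/29/2022 10:05:00", "0.0.1.1", "100"),
    ("08/29/2022 10:00:00", "0.0.1.1", "200"),
    ("08/23/2022 10:10:00", "0.0.1.1", "100"),
    ("08/23/2022 09:55:00", "0.0.1.1", "200"),
]

def downtime_minutes(events):
    """Pair each down event with the next up event per host, sum the gaps."""
    per_host = {}
    # Sort oldest-first so a down event is followed by its recovering up event.
    for ts, host, value in sorted(
        events, key=lambda e: datetime.strptime(e[0], "%m/%d/%Y %H:%M:%S")
    ):
        t = datetime.strptime(ts, "%m/%d/%Y %H:%M:%S")
        state = per_host.setdefault(host, {"down_since": None, "minutes": 0.0})
        if value == "200" and state["down_since"] is None:
            state["down_since"] = t
        elif value == "100" and state["down_since"] is not None:
            state["minutes"] += (t - state["down_since"]).total_seconds() / 60
            state["down_since"] = None
    return {h: s["minutes"] for h, s in per_host.items()}

print(downtime_minutes(events))  # {'0.0.1.1': 20.0}
```

Uptime for the monthly report is then the month's total minutes minus the computed downtime per host.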
Hello community, I have a problem with a search that does not return a result. For the purposes of a dashboard, I need one of my searches to display 0 when it returns no results. I have already managed this for some fairly complex searches, but for a quite simple search I cannot get it to work. Note that when I do have a result, it is displayed correctly; the search runs fine. I attempted to use "| eval ACKED = if(isnull(ACKED) OR len(ACKED)==0, "0", ACKED)" but the search doesn't seem to apply it. I found several topics on similar subjects (using fillnull, for example) but without success. I think it's not complicated, but I can't put my finger on the problem. Do you have any idea? Best regards, Rajaion
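A common pattern for this, sketched below (ACKED is the only field name taken from the question): an eval can never fix a zero-result search, because eval runs once per event and there are no events to run it on. appendpipe, by contrast, can append a row exactly when the pipeline is empty at that point:

```
... your base search ...
| appendpipe [ stats count as ACKED | where ACKED == 0 ]
```

The subsearch's stats count returns a single row with 0 when there are no results, and the where clause discards that row whenever real results exist.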
hello, I am seeing strange behavior with an eval command. If I do this, it works well:

| eval site=case(site=="0", "AA", site=="BR", "BB", site=="PER", "CC", 1==1, site)
| eval s=lower(s)
| search site="$site$"

but if I put | search site="$site$" immediately after the eval, the search command is not recognized as a Splunk command!

| eval site=case(site=="0", "AA", site=="BR", "BB", site=="PER", "CC", 1==1, site)
| search site="$site$"

What is wrong, please?
Hi team, I have an NFR license and want to install ITSI. While trying to install the app, the process routed me to Splunkbase, and my Splunk account's authorization was denied. What should I do? Can someone please help?
hello, in a first dashboard I have a dropdown list:

<input type="dropdown" token="site" searchWhenChanged="true">
  <label>Espace</label>
  <fieldForLabel>site</fieldForLabel>
  <fieldForValue>site</fieldForValue>
  <search>

When I choose a site value, the dashboard is updated with the selected site. Now I want to drill down to another dashboard from the selected site, like this:

<link target="_blank">/app/spl_pu/test?form.$site$=$click.value$</link>

In the second dashboard I try to use the token like this, but it doesn't work:

| search site="$site$"

Could you help, please?
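A likely fix, assuming the second dashboard's input token is also named site: in a drilldown link, form.<name> must be the literal token name of the target dashboard's input, and the value you pass is your current token. $click.value$ only exists for table and chart drilldowns, not for an input-driven link, so the link would instead look like:

```
<link target="_blank">/app/spl_pu/test?form.site=$site$</link>
```

With form.site populated, the second dashboard's own $site$ token resolves and | search site="$site$" works as expected.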
Hi, we have a requirement to install the Splunk Add-on for Microsoft SQL Server. We are using Splunk Cloud with the classic experience. Where do we need to install this add-on? Is it sufficient to install it on the search head, or does it also have to be installed on the heavy forwarder? Please clarify. The docs suggest installing on the search head only, per the table below (instance type, supported, required, comments):

- Search heads: supported yes, required yes. Install this add-on on all search heads where Microsoft SQL Server knowledge management is required.
- Indexers: supported yes, required no. Not required, because this add-on does not include any index-time operations.
- Heavy forwarders: supported yes, required no. To collect dynamic management view data, trace logs, and audit logs, you must use Splunk DB Connect on a search head or heavy forwarder. The remaining data types support using a universal or light forwarder installed directly on the machines running MS SQL Server.
- Universal forwarders: supported yes, required no. To collect dynamic management view data, trace logs, and audit logs, you must use Splunk DB Connect on a search head or heavy forwarder. The remaining data types support file monitoring using a universal or light forwarder installed directly on the machines running MS SQL Server.
I came across a bug in this app: https://splunkbase.splunk.com/app/6553/ and thought I'd share. The log types logs and users work fine, but apps and groups are configured to fetch "enrichment data", and this fails if you need to use a proxy. After a bit of troubleshooting I found that on line 243 of okta_utils.py there is no proxy in the request call. I updated it as follows and it works:

Before: r = requests.request("GET", url, headers=headers)
After: r = requests.request("GET", url, headers=headers, proxies=proxies, timeout=reqTimeout)

I also had to add these lines to grab those settings, just before the if statement:

# Get proxy settings
proxies = get_proxy_settings(self.session_key, self.logger)
# Set the request timeout to 90 seconds
reqTimeout = float(90)
Hello, I have data like the below and need to frame a query that calculates the number of desyncs for each rate-parity-group. For example:

"rate-parity-group":{"CN":{"avail":11,"price":11}}}
"rate-parity-group":{"CK":{"avail":18,"price":0},"CL":{"avail":36,"price":0},"CM":{"avail":18,"price":0}}},
"rate-parity-group":{"CL":{"avail":18,"price":0},"CM":{"avail":36,"price":0}}}

Expected outcome:

rate-parity-group   total-desync
CL                  54 (36+18)
CM                  54
CK                  18

Since the rate-parity-group keys (CK, CM, CL, ...) are dynamic, I am facing a problem. Could someone help me get the desync count per rate-parity-group? Sample data was attached as a screenshot. Thanks in advance!
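The shape of the calculation, sketched in Python with the sample records made into valid JSON. One assumption is baked in: "desync" is read as the avail count of entries whose price is 0, since that is the only reading consistent with the expected output (it excludes CN, whose price is 11). In SPL the same result is typically reached with spath to pull out the dynamic keys, followed by a stats sum.

```python
import json
from collections import Counter

events = [
    '{"rate-parity-group":{"CN":{"avail":11,"price":11}}}',
    '{"rate-parity-group":{"CK":{"avail":18,"price":0},"CL":{"avail":36,"price":0},"CM":{"avail":18,"price":0}}}',
    '{"rate-parity-group":{"CL":{"avail":18,"price":0},"CM":{"avail":36,"price":0}}}',
]

totals = Counter()
for raw in events:
    # The group keys (CK, CL, CM, ...) are dynamic, so iterate over whatever
    # keys each event happens to contain.
    for group, stats in json.loads(raw)["rate-parity-group"].items():
        if stats["price"] == 0:  # assumed desync condition (see note above)
            totals[group] += stats["avail"]

print(dict(totals))  # {'CK': 18, 'CL': 54, 'CM': 54}
```

Matching the expected outcome: CL 54 (36+18), CM 54 (18+36), CK 18.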
<input type="multiselect" token="product_token" searchWhenChanged="true">
  <label>Product types</label>
  <choice value="*">All</choice>
  <default>*</default>
  <prefix>(</prefix>
  <suffix>)</suffix>
  <initialValue>*</initialValue>
  <valuePrefix>DB_Product="*</valuePrefix>
  <valueSuffix>*"</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>DB_Product</fieldForLabel>
  <fieldForValue>DB_Product</fieldForValue>
  <search base="base_search_Products">
    <query>| dedup DB_Product | table DB_Product</query>
  </search>
</input>

This is my multiselect input, through which the user selects product types, e.g. All, or A, B, C, D, etc. I need to count how many product types the user has selected; I need this information for further processing.
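One way to get the count inside a search, as a sketch: it assumes the token arrives with the ( ) prefix/suffix and the " OR " delimiter configured above, so stripping the parentheses and splitting on the delimiter yields one multivalue entry per selection.

```
| eval raw=trim("$product_token$", "()")
| eval selected_count=mvcount(split(raw, " OR "))
```

Note that selecting "All" produces a single value (the * wildcard), so selected_count is 1 in that case; if "All" must be treated specially, test raw against the wildcard form first.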
Is there a way to retrieve the time range that a saved search uses? I have tried this endpoint: curl -k -u admin:pass https://localhost:8089/services/saved/searches/search_name/history but it does not seem to return the time range. Thank you.
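The time range is exposed on the saved search entry itself rather than on /history: the saved search's content includes dispatch.earliest_time and dispatch.latest_time. A sketch, reusing the placeholder credentials from the question:

```
curl -k -u admin:pass https://localhost:8089/services/saved/searches/search_name \
  | grep -E "dispatch\.(earliest|latest)_time"
```

For an already-dispatched job, the search job endpoint (/services/search/jobs/<sid>) carries earliestTime and latestTime for that specific run.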
We have configured DB Connect to pull data from a MySQL database into an index at an hourly frequency. Data is being pulled; however, the count of Splunk events is much higher than the count of rows in the corresponding table. This is because the SQL table is real-time in nature and its entries are constantly updated, whereas Splunk keeps storing entries at each hourly execution. As a result, Splunk also holds historical events that are no longer present in the SQL table. We need to address this, as we plan to build analytics reports on this data, so it has to be accurate and up to date in Splunk as well.
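Two common approaches, both sketched with assumed field names: switch the DB Connect input from batch mode to rising-column mode, keyed on an always-increasing column (an auto-increment id or a last-updated timestamp), so each run ingests only new or changed rows; or keep batch mode and deduplicate at search time, keeping only the latest event per primary key:

```
index=your_index sourcetype=your_sourcetype
| dedup id sortby -_time
```

Note that neither approach makes deleted rows disappear from Splunk, since indexed events are immutable; if deletions matter, compare against a periodic full snapshot (e.g. a lookup refreshed from the table).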
How can we configure a custom domain and an SSL certificate purchased from GoDaddy in Splunk? We need to securely access Splunk Enterprise from outside our network using the purchased domain. Please help!
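For Splunk Web itself, the usual shape is a web.conf stanza pointing at the certificate chain and private key; the paths and filenames below are placeholders, and the DNS record mapping the custom domain to your server is configured at the registrar, not in Splunk.

```
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
enableSplunkWebSSL = true
serverCert = /opt/splunk/etc/auth/mycerts/godaddy_chain.pem
privKeyPath = /opt/splunk/etc/auth/mycerts/godaddy.key
```

The PEM file should contain the server certificate followed by GoDaddy's intermediate chain. Exposing Splunk Web directly to the internet is worth a second thought; a reverse proxy or VPN in front is a common hardening step.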
Hi, I would like to know how to run searches using different time ranges from a dropdown. For example, a dropdown input labelled "Yesterday" should set two different time ranges, so that selecting that single input runs two different searches, each with its own time range. I have tried defining four tokens under the same label, but it doesn't work, i.e.:

<choice value="yesterday">Yesterday</choice>
<condition label="Yesterday">
  <set token="custom_earliest">-8d@d+7h</set>
  <set token="custom_latest">@d+7h</set>
  <set token="breakdown_earliest">-1d@d+7h</set>
  <set token="breakdown_latest">@d+7h</set>
</condition>

Thanks
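Setting several tokens from one choice does work in Simple XML, but the <condition> elements must sit inside a <change> block on the input. A sketch reusing the question's own token names (the input's token name is assumed):

```
<input type="dropdown" token="range" searchWhenChanged="true">
  <label>Time range</label>
  <choice value="yesterday">Yesterday</choice>
  <change>
    <condition label="Yesterday">
      <set token="custom_earliest">-8d@d+7h</set>
      <set token="custom_latest">@d+7h</set>
      <set token="breakdown_earliest">-1d@d+7h</set>
      <set token="breakdown_latest">@d+7h</set>
    </condition>
  </change>
</input>
```

Each search then references its own pair, e.g. <earliest>$custom_earliest$</earliest> / <latest>$custom_latest$</latest> in one panel and the breakdown pair in the other.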
I am looking for details on whether it is possible to customize Splunk logs, for example to mask the data, redact a field, or display only the required fields in the logs.
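If "customize" means masking at index time, the built-in mechanism is a SEDCMD in props.conf on the parsing tier; the sourcetype name and pattern below are examples only. Field-level display restrictions are instead handled at search time (fields/table in the search, or role-based controls).

```
# props.conf (on the indexer or heavy forwarder that parses the data)
[my_sourcetype]
# Mask all but the last 4 digits of a 16-digit card-like number
SEDCMD-mask_ccn = s/\d{12}(\d{4})/XXXXXXXXXXXX\1/g
```

Keep in mind index-time masking is irreversible: the original value is never stored.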
Hello, I have a chart with dynamic field names displayed as a table, and I would like to change the order of the columns:

Name    Season 1   Season 2   Season 3
Name1   10000      11111      22222
Name2   9999       9997       9998
Name3   7777       5555       6666

How can I change the order of the columns? The number of seasons is flexible, and the table should always start with the latest one: Name, Season 3, Season 2, Season 1.
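One commonly used trick for reordering dynamic columns, offered as a sketch (it assumes the Name values are unique and that the season labels sort correctly as strings): transpose so the columns become rows, sort those rows descending, then transpose back.

```
... your chart search ...
| transpose 0 header_field=Name column_name=Season
| sort - Season
| transpose 0 header_field=Season column_name=Name
```

The 0 tells transpose not to cap the number of rows. After the round trip the columns come out as Name, Season 3, Season 2, Season 1, regardless of how many seasons exist.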
Hi. I work with ServiceNow, a ticketing platform. I want to get only the currently "new" incidents and display them in a dashboard, but when I use "| search status=New" I also get results that have already turned "resolved". Is there a way to display only the currently new incidents?
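ServiceNow data in Splunk is update-shaped: each state change arrives as a new event, so filtering on status alone also matches older "New" events for tickets that have since moved on. Keep only the latest event per incident first, then filter. A sketch; the index, sourcetype, and the number field (the incident identifier in the common ServiceNow add-on layout) should be verified against your data:

```
index=your_snow_index sourcetype=snow:incident
| dedup number sortby -_time
| search status=New
```

An equivalent form uses | stats latest(status) as status by number followed by the same where/search filter.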