All Topics


Hello,

<row>
  <panel>
    <title>Manage alerts</title>
    <input type="checkbox" token="detail_input_token">
      <label></label>
      <choice value="hide">Hide detail</choice>
      <initialValue>hide</initialValue>
    </input>
    <html>debug detail_input_token:$detail_input_token$</html>
    <html rejects="$detail_input_token$">
      <body>
        <p>
          <A HREF="/manager/myapp/data/lookup-table-files?ns=myapp&amp;pwnr=-&amp;search=mySearch&amp;count=100" target="_blank">
            Manage <B>lookup mySearch</B> (csv files).
          </A>
        </p>
      </body>
    </html>
  </panel>
</row>

This works in Splunk 6.4.2. We migrated to Splunk 8.2 and it no longer works. I have made many changes without success. Does somebody have an idea? Thanks in advance.
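Not a definitive fix, but a minimal sketch of how that panel could be restructured for recent Simple XML, assuming the problem is stricter handling of raw HTML: newer versions are pickier about what sits inside an <html> element, and the <body> wrapper plus the uppercase <A HREF>/<B> tags are the first things to drop. The token logic (checkbox, initialValue, rejects) is kept exactly as in the original.

    <row>
      <panel>
        <title>Manage alerts</title>
        <input type="checkbox" token="detail_input_token">
          <label></label>
          <choice value="hide">Hide detail</choice>
          <initialValue>hide</initialValue>
        </input>
        <html>debug detail_input_token: $detail_input_token$</html>
        <!-- HTML content placed directly inside <html>, no <body> wrapper -->
        <html rejects="$detail_input_token$">
          <p>
            <a href="/manager/myapp/data/lookup-table-files?ns=myapp&amp;pwnr=-&amp;search=mySearch&amp;count=100" target="_blank">
              Manage <b>lookup mySearch</b> (csv files).
            </a>
          </p>
        </html>
      </panel>
    </row>

If the panel still stays hidden after that, the browser console and the dashboard source validation messages usually narrow down which element 8.2 is rejecting.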
We have an issue wherein every time we attempt to create a search macro, create a lookup definition, create a new lookup, update a lookup file name, or clone any of these knowledge objects, Splunk responds with "Your entry was not saved. The following error was reported: server abort. splunk". Can you let us know the cause of this issue in our Splunk instance? We are currently unable to create any new search macros in this environment. We were advised that there is a workaround of updating the .conf files in the backend; however, our clients don't have access to the backend, and every time they want to update something the request comes directly to us. Does anyone know how to resolve this issue? We need the UI to function properly, as this is delaying delivery.
Hi Team, when I search the switch logs for the last 7 days, I get the error "Search auto-canceled and DAG execution error". I am able to get the last 15 or 60 minutes of logs. Could you please suggest how I can resolve this issue? I am using Splunk Enterprise version 8.1.3. Thanks, Sridevi M
So before the update (was v6.4.1) we would edit the incident in Incident Review, add a comment or change some status, click 'Save changes', and the pop-up window would disappear and the incident list would refresh. All good and dandy. But after the update (v6.6.0), when we click 'Save changes' we have to wait about 4 seconds for the Close button to become clickable, and then we have to click it to dismiss the window. It is a bit of a productivity killer. I would like to find a way to remove the need to click the Close button. Can anybody point me in the right direction on where I can find some documentation on this?
Hello, I use the search below to calculate the average of the field "diff":

index=toto
| eval diff=strptime('Fin',"%d/%m/%Y %H:%M:%S")-strptime('Debut',"%d/%m/%Y %H:%M:%S")
| eval diff=round(diff, 2)
| stats avg(diff) as diff

I am a little surprised because I get the same results if I add a | search to the query to restrict the type of machine:

index=toto
| eval diff=strptime('Fin',"%d/%m/%Y %H:%M:%S")-strptime('Debut',"%d/%m/%Y %H:%M:%S")
| eval diff=round(diff, 2)
| search PPOSTE = *
| stats avg(diff) as diff

or

index=toto
| eval diff=strptime('Fin',"%d/%m/%Y %H:%M:%S")-strptime('Debut',"%d/%m/%Y %H:%M:%S")
| eval diff=round(diff, 2)
| VPOSTE = *
| stats avg(diff) as diff

Is this the correct way to do it, please?
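For what it's worth, a filter of the form PPOSTE=* only keeps events in which the field exists at all, so if every event carries PPOSTE the average will not change; and the second variant (| VPOSTE = * with no search or where command) is not valid SPL on its own. A minimal sketch of filtering on an actual machine type before averaging, where PPOSTE="PC*" is a made-up value to replace with a real one, and the rounding is moved after stats so the average itself is rounded:

    index=toto PPOSTE="PC*"
    | eval diff=strptime('Fin',"%d/%m/%Y %H:%M:%S") - strptime('Debut',"%d/%m/%Y %H:%M:%S")
    | stats avg(diff) AS diff
    | eval diff=round(diff, 2)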
Hi Experts, I'm having some difficulty extracting the correct information from a file that was added to Splunk. I have tried to read and understand as much as I could, but I am still struggling to extract the fields correctly. Here is a snippet of my file:

call_type: "I" alert_id: "8626530 " data_center: "XYZ2 " memname: "QWERTPX " order_id: "1OOUZ" severity: "R" status: "Not_Noticed " send_time: "20210928070008" last_user: " " last_time: " " message: "ASDFGH STARTUP OF REGION QWERTPX" run_as: "USER01 " sub_application: "QWERT " application: "HOUSEKEEPING " job_name: "JOBASDF " host_id: " " alert_type: "R" closed_from_em: " " ticket_number: " " run_counter: " " notes: " "
call_type: "I" alert_id: "8626531 " data_center: "XYZ2 " memname: "QWERTZD " order_id: "1OOVH" severity: "R" status: "Not_Noticed " send_time: "20210928070009" last_user: " " last_time: " " message: "ASDFGH STARTUP OF REGION QWERTZD" run_as: "USER01 " sub_application: "QWERT " application: "HOUSEKEEPING " job_name: "JOBASDF " host_id: " " alert_type: "R" closed_from_em: " " ticket_number: " " run_counter: " " notes: " "
call_type: "I" alert_id: "8626533 " data_center: "XYZ2 " memname: "QWERTZU " order_id: "1OOVV" severity: "R" status: "Not_Noticed " send_time: "20210928070009" last_user: " " last_time: " " message: "ASDFGH STARTUP OF REGION QWERTZU" run_as: "USER01 " sub_application: "QWERT " application: "HOUSEKEEPING " job_name: "JOBASDF " host_id: " " alert_type: "R" closed_from_em: " " ticket_number: " " run_counter: " " notes: " "
call_type: "I" alert_id: "8626532 " data_center: "XYZ2 " memname: "QWERTZE " order_id: "1OOVJ" severity: "R" status: "Not_Noticed " send_time: "20210928070009" last_user: " " last_time: " " message: "ASDFGH STARTUP OF REGION QWERTZE" run_as: "USER01 " sub_application: "QWERT " application: "HOUSEKEEPING " job_name: "JOBASDF " host_id: " " alert_type: "R" closed_from_em: " " ticket_number: " " run_counter: " " notes: " "

What I need is to have these 21 fields extracted properly. I tried delimiter-based extraction, but it doesn't work with ":". I believe I will have to write a regular expression (this is where I got stuck, as I have no clue how). Basically, I need the fields below extracted from the file so I can build dashboards, reports, alerts, etc.:

Field_1 - call_type: "I"
Field_2 - alert_id: "0000007 "
Field_3 - data_center: "XYZ2 "
Field_4 - memname: "ABCABC01 "
Field_5 - order_id: "1OO59"
Field_6 - severity: "R"
Field_7 - status: "Not_Noticed "
Field_8 - send_time: "20210923210008"
Field_9 - last_user: " "
Field_10 - last_time: " "
Field_11 - message: "MSG SHUTDOWN OF REGION ABCDEF"
Field_12 - run_as: "USER01 "
Field_13 - sub_application: "QWERT "
Field_14 - application: "HOUSEKEEPING "
Field_15 - job_name: "JOBASDF "
Field_16 - host_id: " "
Field_17 - alert_type: "R"
Field_18 - closed_from_em: " "
Field_19 - ticket_number: " "
Field_20 - run_counter: " "
Field_21 - notes: " "

I really appreciate any help to achieve this. Thank you!!
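A sketch of one way to pull those key/value pairs out at search time, assuming the sourcetype name below is replaced with the real one; since every field appears as name: "value", a single transform with MV_ADD can populate all 21 fields without writing one regex per field:

    # props.conf
    [your_sourcetype]
    REPORT-quoted_kv = extract_quoted_kv

    # transforms.conf
    [extract_quoted_kv]
    REGEX = (\w+):\s*"([^"]*)"
    FORMAT = $1::$2
    MV_ADD = true

For a quick test before touching .conf files, the same idea can be tried inline with | rex max_match=0, and the trailing spaces inside the quotes can be removed afterwards with trim() in an eval. If each record should also land as its own event, the sourcetype will additionally need line breaking on "call_type:" in props.conf.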
Hi Team, when I run the eval command below, I often get an error message. I wrote this command to find out the number of Samsung devices used in a month.

eval Next= if(match(cs_user_agent, "SM-G980F"),"Samsung Galaxy S20-5G",if(match(cs_user_agent, "SM-G975W"),"Samsung Galaxy S10+",if(match(cs_user_agent, "SM-G935F"),"Samsung Galaxy S7 edge ",if(match(cs_user_agent, "SM-T350"),"Samsung Galaxy Tab",if(match(cs_user_agent, "SM-G950"),"Samsung Galaxy S8",if(match(cs_user_agent, "SM-G998"),"Samsung Galaxy S21 Ultra-5G",if(match(cs_user_agent, "SM-J120Z"),"Samsung Galaxy J1",if(match(cs_user_agent, "SM-A217F"),"Samsung Galaxy A21s",if(match(cs_user_agent, "SM-G988"),"Samsung Galaxy S20 Ultra 5G",if(match(cs_user_agent, "SM-A105G"),"Samsung Galaxy A10",if(match(cs_user_agent, "SM-A525"),"Samsung Galaxy A52",if(match(cs_user_agent, "SM-G991"),"Samsung Galaxy S21 5G",if(match(cs_user_agent, "SM-A225F"),"Samsung Galaxy A22",if(match(cs_user_agent, "SM-A725"),"Samsung Galaxy A72",if(match(cs_user_agent, "SM-G781"),"Samsung Galaxy S20 FE 5G",if(match(cs_user_agent, "SM-F900U"),"Samsung Galaxy Fold",if(match(cs_user_agent, "SM-A326"),"Samsung Galaxy A32 5G",if(match(cs_user_agent, "SM-F700"),"Samsung Galaxy Z Flip3 5G",if(match(cs_user_agent, "SM-A226"),"Samsung Galaxy A22 5G",if(match(cs_user_agent, "SM-N986"),"Samsung Galaxy Note20 Ultra 5G",if(match(cs_user_agent, "SM-A526"),"Samsung Galaxy A52 5G",if(match(cs_user_agent, "SM-A515"),"Samsung Galaxy A51",if(match(cs_user_agent, "SM-A217"),"Samsung Galaxy A21s",if(match(cs_user_agent, "SM-M326"),"Samsung Galaxy M32 5G",if(match(cs_user_agent, "SM-T7"),"Samsung Galaxy Tab S7 FE",if(match(cs_user_agent, "SM-T50"),"Samsung Galaxy Tab A7 10.4",if(match(cs_user_agent, "SM-T50"),"Samsung Galaxy Tab A7 10.4",if(match(cs_user_agent, "SM-T50"),"Samsung Galaxy J7 Prime",if(match(cs_user_agent, "SM-M515"),"Samsung Galaxy M51",if(match(cs_user_agent, "SM-A505"),"Samsung Galaxy A50",if(match(cs_user_agent, "SM-T22"),"Samsung Galaxy Tab A7 Lite",if(match(cs_user_agent, "SM-G930"),"Samsung Galaxy S7",if(match(cs_user_agent, "SM-N960"),"Samsung Galaxy Note9",if(match(cs_user_agent, "SM-J700"),"Samsung Galaxy J7",if(match(cs_user_agent, "SM-G970"),"Samsung Galaxy S10e",if(match(cs_user_agent, "SM-M127"),"Samsung Galaxy M12",if(match(cs_user_agent, "SM-N970"),"Samsung Galaxy Note10",if(match(cs_user_agent, "SM-A115"),"Samsung Galaxy A11",if(match(cs_user_agent, "SM-T87"),"Samsung Galaxy Tab S7",if(match(cs_user_agent, "SM-A315"),"Samsung Galaxy A31",if(match(cs_user_agent, "SM-M315F"),"Samsung Galaxy M31",if(match(cs_user_agent, "SM-A205"),"Samsung Galaxy A20",if(match(cs_user_agent, "SM-J500"),"Samsung Galaxy J5",if(match(cs_user_agent, "SM-T97"),"Samsung Galaxy Tab S7+","other"))))))))))))))))))))))))))))))))))))))))))))

Note: could someone please help me find the best way to get the expected outcome from the user agent, or help me avoid the error?
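If the error is about the expression being too long or too deeply nested (a common problem with very long if() chains), one way around it that stays in SPL is case(), which takes all the condition/value pairs as a flat list instead of 44 nested levels. A short sketch using a few of the models from the question; the remaining pairs are added the same way, and true(), "other" is the catch-all:

    | eval Next=case(
        match(cs_user_agent, "SM-G980F"), "Samsung Galaxy S20-5G",
        match(cs_user_agent, "SM-G975W"), "Samsung Galaxy S10+",
        match(cs_user_agent, "SM-G935F"), "Samsung Galaxy S7 edge",
        match(cs_user_agent, "SM-T350"),  "Samsung Galaxy Tab",
        true(), "other")

A cleaner long-term option is to move the model-to-name mapping into a CSV lookup (model prefix in one column, device name in the other), so adding a new device becomes a lookup edit rather than a search edit.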
How can I add a description to all input logs (metric, syslog, SNMP, etc.)?

I tried adding "_meta = description::test_description" to inputs.conf on the UF. In that case the description is added to all of that forwarder's logs, but it does not cover the HF case.

So I thought: what if it could be applied on the heavy forwarder? I retried adding the "_meta" setting in the HF's inputs.conf, but it does not work.

How can I do this?
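For data that is parsed on the heavy forwarder, a commonly used alternative to _meta is an indexed-field transform, since _meta in inputs.conf only applies to inputs that the HF itself reads, not to data it merely receives and forwards. A minimal sketch, assuming the goal is a static description::test_description field on everything the HF parses (the [default] scope is deliberately broad and can be narrowed to specific sourcetypes):

    # props.conf on the heavy forwarder
    [default]
    TRANSFORMS-add_description = add_test_description

    # transforms.conf on the heavy forwarder
    [add_test_description]
    REGEX = .
    FORMAT = description::test_description
    WRITE_META = true

It also helps to declare the field in fields.conf ([description] with INDEXED = true) on the search head so it behaves consistently at search time.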
I need to monitor a user's or a group's activities, or the amount of bandwidth they are using on an index assigned to them. Thanks a ton.
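Per-user network bandwidth is not something Splunk tracks directly, but two proxies that are often used are indexed volume per index (from the license usage log) and search activity per user (from the audit index). A sketch of the first, with "your_index" as a placeholder:

    index=_internal source=*license_usage.log* type=Usage idx="your_index"
    | timechart span=1d sum(b) AS bytes
    | eval GB=round(bytes/1024/1024/1024, 2)

For who is actually searching that index, index=_audit action=search is the usual starting point.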
I have created a calculated field which parses _time from a date stamp in the data. However, it does not set _time correctly. If I set the calculated field to something other than _time, it works fine. So I was just wondering if there is any documentation anywhere that talks about being able to override _time with a calculated field. NB: I can't set the event _time at ingestion to the correct date from the data, because I am ingesting a complete data set every day where historical results may change, so I'm just using a 24h search and then changing _time.
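Not documentation, but for the workaround described in the last sentence (a 24h search that then rewrites _time from the data), an explicit eval in the search pipeline does change _time for every command that follows it. A minimal sketch, where your_index, report_date and its format are placeholders for the real index and date-stamp field:

    index=your_index earliest=-24h
    | eval _time=strptime(report_date, "%Y-%m-%d")
    | timechart span=1d count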
Does Splunk support enabling WORM on SmartStore S3 buckets?
I created an input type (data input type) to collect data from an external REST API using the Splunk Add-on Builder app. How do I delete it?
I am looking for a web link to all of the past Splunk and ES .conf events, with their lectures and content posted. Thanks a million in advance.
I have a csv file containing the SAM accounts of 1200 AD groups and I need to find out the proper search query to find the last date of their modification or change.
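A possible starting point, assuming group changes are audited into a Windows Security event index: event codes 4735, 4737, and 4755 cover changes to security-enabled local, global, and universal groups, and the CSV can drive the filter as a subsearch. The index, sourcetype, lookup file name, and the sam_account / Group_Name field names below are assumptions to adjust to your environment:

    index=wineventlog sourcetype="WinEventLog:Security" (EventCode=4735 OR EventCode=4737 OR EventCode=4755)
        [| inputlookup ad_groups.csv | rename sam_account AS Group_Name | fields Group_Name]
    | stats latest(_time) AS last_modified BY Group_Name
    | eval last_modified=strftime(last_modified, "%Y-%m-%d %H:%M:%S")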
I am sure I am missing something easy, but for some reason, when I compare these two values (they are in string format from my data) the comparison isn't correct. I don't seem to have this issue when I use this eval statement with other versions I have, but this particular comparison throws it out of whack:

| eval Status=case("21.3.0.44"<"5.5.5.0","Declining","21.3.0.44"="5.5.5.0","Mainstream","21.3.0.44">"5.5.5.0","Emerging")

The result always shows "21.3.0.44" as less than. Please advise if I am missing some caveat I am not aware of. P.S. I tried converting these to numbers, but due to all the decimals in version numbers the result isn't a valid number. I suppose I could replace the decimals somehow, but thought I would ask first before going down that route. Thanks in advance!
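The comparison behaves that way because it is lexicographic: "21.3.0.44" < "5.5.5.0" simply because the character "2" sorts before "5". One way around it with plain string functions is to split both versions into their numeric parts and compare part by part; a sketch assuming four numeric components per version:

    | eval cur="21.3.0.44", base="5.5.5.0"
    | eval c=split(cur,"."), b=split(base,".")
    | eval cmp=case(
        tonumber(mvindex(c,0))!=tonumber(mvindex(b,0)), tonumber(mvindex(c,0))-tonumber(mvindex(b,0)),
        tonumber(mvindex(c,1))!=tonumber(mvindex(b,1)), tonumber(mvindex(c,1))-tonumber(mvindex(b,1)),
        tonumber(mvindex(c,2))!=tonumber(mvindex(b,2)), tonumber(mvindex(c,2))-tonumber(mvindex(b,2)),
        true(), tonumber(mvindex(c,3))-tonumber(mvindex(b,3)))
    | eval Status=case(cmp<0,"Declining", cmp==0,"Mainstream", cmp>0,"Emerging")

An alternative with the same effect is to zero-pad each part (e.g. 00021.00003.00000.00044) and keep comparing strings.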
Our ITSI is showing some "Detected Anomaly" events for the KPI "Index Usage". Where and how can I find the notable events for those detected anomalies? I didn't find them in index=itsi_tracked_alerts. Thanks
Hello there, sorry for the bad translation. Some time ago I installed an app called "Splunk Secure Gateway"; during installation and configuration it was necessary to create a new role with the name "securegateway". Today I am trying to add a new user and I get the error message "In handler 'users': Could not get info for role that does not exist: securegateway". The funny thing is that I am only assigning the role "user" to the user I am trying to create.
Hi all, we are looking at possibilities for monitoring DB and JVM on Splunk Cloud, as we have the following issues:

DB Connect app – data security compliance issue, ruled out
JMX app – this app supports Java version 8 and above, but we need to enable monitoring for Java versions 6 and 7

If you have any documentation or anything on other ways of monitoring, please share it. Thanks in advance.
Hi, I hope you are doing well. I need your help: I want the Splunk Universal Forwarder to pick up only alert data from the logs. How can I tell the forwarder to take only the alert data? Let's say I have 5000 log files, of which only 1000 are alert logs; I want the forwarder to send only those 1000 files. Please help me with this issue. Thanks in advance.
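If the alert files can be recognized by their names or paths, the universal forwarder can filter at the input with a whitelist, which is a regex matched against the full file path. A sketch with a made-up path and pattern to adjust:

    # inputs.conf on the universal forwarder
    [monitor:///var/log/myapp]
    whitelist = (?i)alert
    index = main
    disabled = 0

If the alert entries can only be recognized by their content rather than the file name, the UF cannot do that filtering itself; the usual approach is props/transforms routing to the nullQueue on a heavy forwarder or the indexers.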
Hello All, we have a mixed environment where some UFs point to our on-prem heavy forwarders while others point to Splunk Cloud indexers. I would like to update all UFs to point to Splunk Cloud, but I have some questions. Notes: (1) we also have an on-prem deployment server, and (2) as a test I installed a UF on my Mac and it is forwarding logs to Splunk Cloud.

* What's the best way to update the old UF config to the new one? In other words, can someone point me to resources that explain how best to use the deployment server to do this?
* Will I lose the transformations applied to logs that currently go through the HF?

Thanks in advance
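On the first question, a common pattern is to ship the outputs as a small deployment app so the cutover is just a serverclass change; the app and host names below are placeholders, and in practice the credentials package downloaded from your Splunk Cloud stack (which bundles its own outputs.conf and certificates) is what actually gets deployed:

    # deployment-apps/org_cloud_outputs/local/outputs.conf
    # (server names are placeholders; TLS settings come from the
    #  Splunk Cloud credentials package)
    [tcpout]
    defaultGroup = splunkcloud

    [tcpout:splunkcloud]
    server = inputs1.yourstack.splunkcloud.com:9997, inputs2.yourstack.splunkcloud.com:9997

On the second question: index-time props/transforms currently running on the heavy forwarders will stop applying to any UF that starts sending directly to the cloud indexers, so those transformations need to be recreated on the Splunk Cloud side (for example via a private app) or the HFs kept in the path for that data.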