All Topics


My current search is:

| from datamodel:Remote_Access_Authentication.local | append [| inputlookup Domain | rename name as company_domain] | dest_nt_domain

How do I get the search to only list items in my table where dest_nt_domain matches company_domain (i.e. | search dest_nt_domain=company_domain)? Is there another command other than append that I can use with inputlookup? I do not need to add the lookup rows to the list; I'm just trying to get the data in to compare against the datamodel.
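A minimal sketch of one way to do this, using the lookup as a subsearch filter instead of appending it (the lookup name Domain and its field name are taken from the question); only datamodel rows whose dest_nt_domain appears in the lookup are kept:

| from datamodel:Remote_Access_Authentication.local
| search [| inputlookup Domain | rename name as dest_nt_domain | fields dest_nt_domain]
| table dest_nt_domain

If the lookup is large, an alternative that avoids subsearch limits is | lookup Domain name AS dest_nt_domain OUTPUT name AS in_lookup | where isnotnull(in_lookup), assuming Domain is also defined as a lookup and not just a lookup file.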
Hi,

I have a JSON object of the following type:

{
  "time": "14040404.550055",
  "Food_24ww": {
    "Grains": {
      "status": "OK",
      "report": {
        "2014": { "type": "rice", "prod": "50", "rate": "30" },
        "2015": { "type": "pulses", "prod": "50", "rate": "30" }
      }
    },
    "Beverages": {
      "status": "Good",
      "2014": { "type": "pepsi", "prod": "50", "rate": "60" },
      "2015": { "type": "coke", "prod": "55", "rate": "30" }
    }
  }
}

I want to extract all the key values inside the "report" key for "Grains" and "Beverages". Meaning, for Grains I want 2014 (and the key values inside it) and 2015 (and the key values inside it), and similarly for Beverages.

The challenge is that none of the JSON keys above "report" are constant: the first key "Food_24ww" and the next-level keys "Grains" and "Beverages" are not constant.

Thanks
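A minimal sketch of one way to walk the variable keys with the JSON eval functions (Splunk 8.1+). The only assumptions beyond the sample are that "time" is the lone constant top-level key, that the year keys are four digits, and that a category either wraps its years in "report" (Grains) or holds them directly (Beverages), which the coalesce() covers:

| eval top_key=json_array_to_mv(json_keys(_raw))
| eval top_key=mvfilter(top_key!="time")
| mvexpand top_key
| eval category=json_array_to_mv(json_keys(json_extract(_raw, top_key)))
| mvexpand category
| eval body=coalesce(json_extract(_raw, top_key . "." . category . ".report"), json_extract(_raw, top_key . "." . category))
| eval year=json_array_to_mv(json_keys(body))
| eval year=mvfilter(match(year, "^\d{4}$"))
| mvexpand year
| eval type=json_extract(body, year . ".type"), prod=json_extract(body, year . ".prod"), rate=json_extract(body, year . ".rate")
| table top_key category year type prod rate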
Hello. I am a Splunk newbie. I have a question about the replication factor in search head clustering. Looking at the docs, it says that search artifacts are only replicated for scheduled saved searches. https://docs.splunk.com/Documentation/Splunk/9.1.2/DistSearch/ChooseSHCreplicationfactor

I'm curious about the reason for, and advantage of, replicating search artifacts only in this case. And then, in the case of a real-time search, is it correct that search artifacts are not replicated and only remain on the local server? In that case, in a clustering environment, member 2 should not be able to see the search results of member 1. But I can view them by using the loadjob command on member 2. So wouldn't it be possible to view real-time search artifacts as well? Thank you
Hello Team,

We have deployed the machine agent as a sidecar (a separate container within the pod) for Apache in OSE. It's working for most of the pods, but for one pod we are getting the error below.

code-external-site-ui-sit-50-gm9np==> [system-thread-0] 23 Jan 2024 08:22:14,654 DEBUG RegistrationTask - Encountered error during registration.
com.appdynamics.voltron.rest.client.NonRestException: Method: SimMachinesAgentService#registerMachine(SimMachineMinimalDto) - Result: 401 Unauthorized - content:
  at com.appdynamics.voltron.rest.client.VoltronErrorDecoder.decode(VoltronErrorDecoder.java:62) ~[rest-client-1.1.0.245.jar:?]
  at feign.SynchronousMethodHandler.executeAndDecode(SynchronousMethodHandler.java:156) ~[feign-core-10.7.4.jar:?]
  at feign.SynchronousMethodHandler.invoke(SynchronousMethodHandler.java:80) ~[feign-core-10.7.4.jar:?]
  at feign.ReflectiveFeign$FeignInvocationHandler.invoke(ReflectiveFeign.java:100) ~[feign-core-10.7.4.jar:?]
  at com.sun.proxy.$Proxy114.registerMachine(Unknown Source) ~[?:?]
  at com.appdynamics.agent.sim.registration.RegistrationTask.run(RegistrationTask.java:147) [machineagent.jar:Machine Agent v23.9.1.3731 GA compatible with 4.4.1.0 Build Date 2023-09-20 05:14:38]
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
  at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
  at java.lang.Thread.run(Thread.java:834) [?:?]
code-external-site-ui-sit-50-gm9np==> [system-thread-0] 23 Jan 2024 08:22:17,189 DEBUG GlobalTagsConfigsDecider - Global tags enabled: false
code-external-site-ui-sit-50-gm9np==> [system-thread-0] 23 Jan 2024 08:22:17,189 DEBUG RegistrationTask - Running registration task
code-external-site-ui-sit-50-gm9np==> [system-thread-0] 23 Jan 2024 08:22:17,256 WARN RegistrationTask - Encountered error during registration. Will retry in 60 seconds.
code-external-site-ui-sit-50-gm9np==> [system-thread-0] 23 Jan 2024 08:22:17,256 DEBUG RegistrationTask - Encountered error during registration.

We have cross-verified and everything looks good from the configuration end.

Kindly help us with your suggestions.
Hi Team,

We have opted for 250 GB of licensing on a daily basis. If license usage reaches more than 70% (i.e. 175 GB) I need to get an alert; similarly, if it reaches 80% or more (i.e. 200 GB) I need to get another alert. And finally, if it crosses 90% (i.e. 225 GB) I need to get another alert.

Can you help me with the search query?
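A minimal sketch of one approach, using the license usage data in _internal (run it where the license manager's _internal logs are searchable; the 250 GB pool size is hard-coded as in the question):

index=_internal source=*license_usage.log type=Usage earliest=@d
| stats sum(b) as bytes_used
| eval gb_used=round(bytes_used/1024/1024/1024, 2)
| eval pct_used=round(gb_used/250*100, 1)
| eval severity=case(pct_used>=90, "critical", pct_used>=80, "high", pct_used>=70, "warning")
| where isnotnull(severity)
| table gb_used pct_used severity

Scheduled as a single alert (e.g. hourly) this triggers whenever any threshold is crossed; alternatively, clone it into three alerts, each with its own where pct_used>=<threshold> clause.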
Hi, I have database1 and database2. I have query1 to get the data from database1, query2 to get data from database2, and query3 to get the unique values from database2 which don't exist in database1. Now my requirement is to combine the common values in both databases (using query1 & query2) together with the unique values from query2 (database2) which don't exist in database1. Please provide me the Splunk query.
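A minimal sketch of one way to do this with DB Connect, assuming hypothetical connection names (db1_connection, db2_connection) and a shared key column key_field; it keeps rows that are either common to both databases or unique to database2:

| dbxquery connection="db1_connection" query="SELECT key_field FROM table1"
| eval source_db="database1"
| append
    [| dbxquery connection="db2_connection" query="SELECT key_field FROM table2"
     | eval source_db="database2"]
| stats values(source_db) as found_in by key_field
| eval category=case(mvcount(found_in)=2, "common", found_in="database2", "unique_to_database2", true(), "unique_to_database1")
| where category!="unique_to_database1"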
Dear Team,

Is it possible to connect an instance to a Splunk license server through a proxy? I found this, but I don't know if it applies to this context: https://docs.splunk.com/Documentation/Splunk/9.1.2/Admin/ConfigureSplunkforproxy

Regards,
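For reference, a minimal sketch of the splunkd proxy settings described on that doc page, configured on the license peer; whether the peer-to-license-manager traffic honors these settings should be confirmed against that documentation, and the proxy host and port below are placeholders:

# server.conf on the license peer
[proxyConfig]
http_proxy = http://proxy.example.com:8080
https_proxy = http://proxy.example.com:8080
no_proxy = localhost, 127.0.0.1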
I have installed Splunk and added Windows systems to Splunk through the universal forwarder, but I have a problem with the default system names: these names confuse me when I check their status. I want to assign an alias or rename the hostname so that I can identify a system by its name in search. For example, I want to change the hostname "WIN-KLV1NNUJO8P" to "mydashboard". Please help me, I can't find an answer for this problem and the solutions that I found on the internet are not working.
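A minimal sketch of one way to do this on the forwarder itself: override the host field in inputs.conf (the path below assumes a default Windows install). This only affects events indexed after the forwarder is restarted, so older events keep the old host name.

# C:\Program Files\SplunkUniversalForwarder\etc\system\local\inputs.conf
[default]
host = mydashboard

If you would rather not touch the forwarder, a host-override in props/transforms on the indexer, or a search-time lookup mapping host to a friendly alias, are common alternatives.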
Can someone help me with a query for getting logs in descending order based on the API execution time which is printed in the logs?
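A minimal sketch, assuming the execution time appears in the raw event in a form like "execution time: 1234" (the regex, field name, index and sourcetype are placeholders to adapt):

index=your_index sourcetype=your_sourcetype
| rex "(?i)execution time[:=]\s*(?<api_exec_time_ms>\d+)"
| eval api_exec_time_ms=tonumber(api_exec_time_ms)
| sort 0 - api_exec_time_ms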
Hi everyone, I would like to ask whether I can create a field alias for _indextime and _time and then set this alias as a default field for all sourcetypes?
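One possible direction, as a sketch rather than a confirmed answer: instead of a field alias, calculated fields (EVAL-) in props.conf can expose copies of these values at search time. The stanza name below is a placeholder, you would need to repeat or scope it to cover every sourcetype you care about, and whether _indextime is visible to calculated fields in your version is worth verifying.

# props.conf (search time)
[your_sourcetype]
EVAL-event_time = strftime(_time, "%Y-%m-%d %H:%M:%S")
EVAL-index_time = strftime(_indextime, "%Y-%m-%d %H:%M:%S")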
Hello. I am a security researcher analysing the CVE-2023-46214 vulnerability. I think this vulnerability involves the use of exsl:document, so I want to block packets containing exsl:document, but is exsl:document used in real life? Is this a feature that is officially supported by Splunk?
Hello, I have a Windows machine with a UF installed on it. How can I configure my universal forwarder to ingest Windows performance monitoring logs into Splunk? The Windows source server that Splunk should be getting performance data from is located in a different location. Any help would be greatly appreciated. Thank you!
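A minimal sketch of a perfmon input on the forwarder (the object, counters, interval and index below are examples to adapt; the forwarder also needs outputs.conf pointing at your indexers, which it presumably already has):

# inputs.conf on the universal forwarder
[perfmon://CPU_Load]
object = Processor
counters = % Processor Time; % User Time
instances = _Total
interval = 60
index = perfmon

Restart the forwarder after adding the stanza; similar stanzas cover Memory, LogicalDisk, Network Interface, and so on.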
Hello All, I have created an alert with the following query. The issue I'm having is that I'm not receiving an email alert even when the condition is met and events are returned.

| dbxquery query="SELECT eventTriggeredDate, APPLICATION_NAME, APPLICATION_NAMEENV, APPLICATION_GROUP, eventChain, eventType, eventMessage, eventMod, eventRule, eventSeverity FROM Admin.console.v_ES_RelevantEvents55 WHERE eventTriggeredDays <= 7 AND (APPLICATION_NAME='ABC_PRD' OR APPLICATION_NAME='XYZ-PRD') AND APPLICATION_NAMEENV='PRD'" connection="TESTING_DEV"
| lookup users_email.csv "Application Name" as APPLICATION_NAME OUTPUT "Admin email" as Admin_email "QA email" as QA_email "Developers email" as Developers_email
| lookup policy_details.csv policy_name as eventRule OUTPUT policy_description
| eval users_mail = Admin_email.",".Developers_email.",".QA_email
| stats count as Total_Events values(eventChain) as "Event Policy/Rule" values(eventType) as "Event Type" values(eventMod) as "Event Mod/Policy" values(eventRule) as "Event Rule" values(users_mail) as users_mail values(eventMessage) as eventMessage values(policy_description) as policy_description by APPLICATION_NAME, eventSeverity
| eval eventMessage=mvindex(eventMessage, 0, 20)
| where Total_Events > 10
| table APPLICATION_NAME, Total_Events, eventSeverity, "Event Type", "Event Rule", users_mail, eventMessage, policy_description
| rename APPLICATION_NAME as application_name, Total_Events as number_of_events, eventSeverity as event_severity, "Event Type" as event_type, "Event Rule" as event_rule, eventMessage as event_message

I have given the email list as $result.users_mail$, i.e. the values from the field users_mail. I see the alert being triggered but I don't receive an email. Also, is there a way we can add external links to the Splunk alerts?
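One thing to note (a sketch, not a diagnosis): $result.*$ tokens in the email action resolve from the first result row only, and a multivalue users_mail may not expand into a valid address list. One way to make the token usable is to collapse all recipients into a single comma-separated field that is present on the first row, appended after the query above, and then use $result.all_users_mail$ in the To field:

| eventstats values(users_mail) as all_users_mail
| eval all_users_mail=mvjoin(mvdedup(split(mvjoin(all_users_mail, ","), ",")), ",")

If no email arrives at all, the sendemail errors that often show up in index=_internal (e.g. SMTP failures) are worth checking as well.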
I am trying to filter my search results so that only a particular subset of the results is shown. For example, suppose the below is the intermediate search result:

MESSAGE: Records::0
MESSAGE: Records::1
MESSAGE: Records::0
MESSAGE: Records::4

The final search results should contain only the events where the record count is greater than 0. Is there any query which can help with this?
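A minimal sketch, assuming MESSAGE is already an extracted field shaped like the samples above (otherwise point rex at _raw):

... your base search ...
| rex field=MESSAGE "Records::(?<record_count>\d+)"
| where tonumber(record_count) > 0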
Hi,

I want to get rid of columns which have a single unique value. There could be multiple columns showing this behavior.

Test    Value1   Value2   Value3   Value4
Test1   2        b        a        7
Test2   1        c        a        7

I want to get rid of columns "Value3" and "Value4" since they have only one unique value across all rows.

@gcusello @ITWhisperer @scelikok @PickleRick
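A minimal sketch of one generic way to do this, assuming Test is the row-identifier column as in the example: unpivot the table, count distinct values per column, drop the constant columns, and pivot back.

... your search producing the table ...
| untable Test column value
| eventstats dc(value) as distinct_values by column
| where distinct_values > 1
| fields - distinct_values
| xyseries Test column value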
#mission_control, #splunk_cloud

Hi,

In my org, Mission Control events are primarily investigated by the SOC as soon as they pop up; if further investigation is needed, the incident is escalated to the Enterprise Security (ES) team, who is responsible for performing a deeper/detailed investigation and updating it back in Mission Control.

USE CASE: The enterprise security manager wants a dashboard which will inform him, when the investigation is being performed by his team (ES), how much average time his team members take to resolve an incident, averaged over a month.

For the ES team I have a lookup file, or I can also type their names (only 7-8 people). I need a query which will evaluate, when assigne=(tom,tim,xyz), the difference between update_time and create_time, averaged out over a month.

Fields we have:

| mcincidents add_response_stats=true
| eval create_time=strptime(create_time, "%m/%d/%Y %I:%M:%S %p")
| eval update_time=strptime(update_time, "%m/%d/%Y %I:%M:%S %p")
| table assigne, create_time, update_time, description, disposition, id, incident_type, name, sensitivity, source_type, summary
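A minimal sketch building on those fields (the assignee names and the assigne/create_time/update_time field names are taken from the question, and the timestamp format assumes the values parse as shown above):

| mcincidents add_response_stats=true
| search assigne IN ("tom", "tim", "xyz")
| eval create_epoch=strptime(create_time, "%m/%d/%Y %I:%M:%S %p")
| eval update_epoch=strptime(update_time, "%m/%d/%Y %I:%M:%S %p")
| eval resolve_hours=round((update_epoch - create_epoch)/3600, 2)
| eval month=strftime(create_epoch, "%Y-%m")
| stats avg(resolve_hours) as avg_resolve_hours count as incidents by month, assigne

If you prefer to maintain the team list in the lookup file, the hard-coded IN(...) list can be replaced with a subsearch over that lookup (e.g. | search [| inputlookup your_es_team_lookup | fields assigne]).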
My company flagged Redis as vulnerable because requirepass is not enabled. How do I enable it and give the password to the clients that connect to Redis?
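A minimal sketch (standalone Redis, no ACLs; adjust for your deployment): set requirepass in redis.conf and restart, or set it at runtime, then have clients authenticate.

# redis.conf
requirepass YourStrongPasswordHere

# or at runtime (not persisted unless the config is also rewritten):
# redis-cli CONFIG SET requirepass "YourStrongPasswordHere"

# clients then authenticate, e.g.:
# redis-cli -a "YourStrongPasswordHere" ping

Application clients typically take the same password via a password option in their connection settings.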
How do I display the top 10 and replace the rest with "Others"? I tried using top limit=5 with useother, but the numbers didn't match and it showed other fields like count, percent and _tc.

This is just an example; I have a lot of fields and rows in the real data. Thank you for your help.

| addcoltotals labelfield=Name
| top limit=5 useother=t Name Score   ==> numbers didn't match

Before:

    Expense Name     Score
1   Rent             2000
2   Car              1000
3   Insurance        700
4   Food             500
5   Education        400
6   Utility          200
7   Entertainment    100
8   Gym              70
9   Charity          50
10  Total            5020

After (desired):

    Expense Name     Score
1   Rent             2000
2   Car              1000
3   Insurance        700
4   Food             500
5   Education        400
6   Others           420
7   Total            5020
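A minimal sketch of one way to get the desired table, assuming the field names from the example ("Expense Name", Score) and that Score is already aggregated per expense, so top, which recounts events and adds count/percent columns, isn't the right tool here:

| where 'Expense Name'!="Total"
| sort 0 - Score
| streamstats count as rank
| eval "Expense Name"=if(rank<=5, 'Expense Name', "Others")
| stats sum(Score) as Score by "Expense Name"
| eval is_other=if('Expense Name'="Others", 1, 0)
| sort 0 is_other, - Score
| fields - is_other
| addcoltotals labelfield="Expense Name" label="Total"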
Hi,

I have two sets of JSON data. I want to find the keys which are unique in one dataset and also the keys which are missing in the same, in comparison with the other dataset.

My first data set looks as below:

{
  "iphone": { "price": "50", "review": "Good" },
  "desktop": { "price": "80", "review": "OK" },
  "laptop": { "price": "90", "review": "OK" }
}

My second data set looks as below:

{
  "tv": { "price": "50", "review": "Good" },
  "desktop": { "price": "60", "review": "OK" }
}

Therefore, for the first data set (w.r.t. the second data set), the unique values will be iphone and laptop, and the missing value will be tv.

How can I find out this difference and show it in a table with columns like "uniq_value" and "missing_value"?

I could only write the query up to this point, and this is only half of what I want:

index=product_db
| eval p_name=json_array_to_mv(json_keys(_raw))
| eval p_name=mvfilter(NOT match(p_name, "uploadedBy") AND NOT match(p_name, "time"))
| mvexpand p_name
| table p_name

Thanks
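A minimal sketch extending that query, assuming the two datasets can be told apart by some field, here called dataset_id with values dataset1/dataset2 (sourcetype or source would work the same way):

index=product_db
| eval p_name=json_array_to_mv(json_keys(_raw))
| eval p_name=mvfilter(NOT match(p_name, "uploadedBy") AND NOT match(p_name, "time"))
| mvexpand p_name
| stats values(dataset_id) as found_in by p_name
| eval uniq_value=if(mvcount(found_in)=1 AND found_in="dataset1", p_name, null())
| eval missing_value=if(mvcount(found_in)=1 AND found_in="dataset2", p_name, null())
| stats values(uniq_value) as uniq_value, values(missing_value) as missing_value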
Would you kindly assist us in hiding the credit card number and expiration date in the following field?

some ab n required YES
Accommodation [Bucharest] 5 Nights – Novotel Bucharest
HDFC Master card number 1234 4567 0009 2321
Expiry Date of HDFC card 01/26
Any other relevant info

Thanks and Regards,
Murali.
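A minimal sketch of index-time masking with SEDCMD in props.conf (the sourcetype name is a placeholder; apply it on the indexer or heavy forwarder that parses this data, and note it only affects newly indexed events):

# props.conf
[your_sourcetype]
SEDCMD-mask_card_number = s/\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}/XXXX XXXX XXXX XXXX/g
SEDCMD-mask_card_expiry = s/(Expiry Date of HDFC card\s+)\d{2}\/\d{2}/\1XX\/XX/g

If the data is already indexed and you only need to hide it at display time, a search-time eval with replace() over the field is the alternative.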