All Topics


Does anyone have experience writing a query that can be used to alert on disabled AD accounts being re-enabled? I've learned that Windows EventCode 4722 can be used to find accounts being enabled, but I'm unsure of how to correlate that with whether or not the account was in a disabled state beforehand.  
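A possible starting point, offered as an untested sketch rather than a confirmed answer: pair each enable event (4722) with the most recent disable event (4725, "a user account was disabled") for the same account. The index name and the user field below are assumptions to adapt to your environment.

    index=wineventlog (EventCode=4722 OR EventCode=4725) ``` index and user field names are assumptions ```
    | eval action=if(EventCode=4725, "disabled", "enabled")
    | stats latest(eval(if(action="disabled", _time, null()))) AS last_disabled latest(eval(if(action="enabled", _time, null()))) AS last_enabled BY user
    | where isnotnull(last_disabled) AND last_enabled > last_disabled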
I have the following JSON event:

    { "tags": [ {"key":"Name","value":"Damian"}, {"key":"Age","value":34}, {"key":"Country","value":"Argentina"}, {"key":"City","value":"Buenos Aires"} ] }

I need to extract the corresponding fields in my event, with the key as the field name and the value as its value:
Name="Damian" Age="34" Country="Argentina" City="Buenos Aires"
This is what I tried:

    | spath path=tags{}.key output=a_keys
    | spath path=tags{}.value output=a_values
    | eval {a_keys} = a_value

But the result is a single multivalue field: Name Age Country City = [ "Damian", "34", "Argentina", "Buenos Aires" ]. How can I create the correct fields?
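One commonly suggested pattern (untested sketch; it relies on _raw being unique per event so the expanded rows can be grouped back together):

    | spath path=tags{} output=tag
    | mvexpand tag
    | spath input=tag
    | eval {key} = value
    | fields - tag key value
    | stats values(*) AS * BY _raw ``` _raw is only used here to regroup rows belonging to the same event ```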
With macOS Ventura (13) coming in a few months, is there a plan to provide a client that at least supports macOS Monterey (12)?
I have raw data where I need to convert the time in the event to a particular time zone. For example, if the event contains "emea", I need to convert the time to CST. These are the three time zone conditions:
when emea => CEST/CST time
when apac => HKT time
when us => EDT

Sample events:
6/10/22 9:39:00.000 AM   2022-06-10 15:39:00 emea
6/10/22 9:41:56.000 AM   2022-06-10 15:41:56 apac
6/10/22 9:41:56.000 AM   2022-06-10 15:41:56 us

Please help me with the query. Thank you in advance.
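A possible sketch (untested; the rex field names and the fixed offsets of CEST = UTC+2, HKT = UTC+8, EDT = UTC-4 are assumptions, fixed offsets ignore daylight-saving changes, and strptime interprets the string in the search-time zone rather than UTC):

    | rex field=_raw "(?<event_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?<region>emea|apac|us)"
    | eval epoch=strptime(event_time, "%Y-%m-%d %H:%M:%S")
    | eval offset_hours=case(region="emea", 2, region="apac", 8, region="us", -4) ``` offsets are assumptions ```
    | eval converted_time=strftime(epoch + offset_hours*3600, "%Y-%m-%d %H:%M:%S")
    | table event_time region converted_time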
Hi everyone, I'm trying to set up splunk-connect-for-kubernetes to get my cluster logs. I created two metrics indexes and one HEC token, but I don't know if they are set up correctly:
. My metrics indexes use the Search & Reporting application (I don't know if I should use something else).
. My HEC token doesn't have a specific sourcetype (it uses the automatic one) and is tied to the two created metrics indexes.
As I'm using Splunk Enterprise on my localhost, my values.yml file and its metrics section are configured as shown in the screenshots (not reproduced here), and the deployment with Helm seems to be correct. But I don't receive any data from my cluster. Where am I going wrong? Thank you in advance.
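A hedged sanity check (assuming the _internal index is searchable on the receiving instance; the splunkd component names below are assumptions about how HEC activity and errors are logged):

    index=_internal sourcetype=splunkd (component=HttpInputDataHandler OR component=HttpEventCollector*) ``` component names are assumptions ```
    | stats count BY component log_level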
Hi, I am getting this error, can anyone help?
WARN ManagedMonitorDelegate - Metric Reporter Queue full. Dropping metrics.
^ Post edited by @Ryan.Paredez: this post was split off into its own new post because it was a reply to a post that was over two years old. It's better to create a new conversation than to reply to a post that is over a year old.
Hey all, I'm trying to build a clickable dashboard. For that, I chose a line chart visualization. In the picture below (not reproduced here) you can see the line chart showing the count of events against some random ID. My idea is to make that ID or count clickable so that we are redirected to the original events, i.e. for the ID shown we should be able to see all 6,630 event logs in a new tab. Is that possible by any chance? Thanks in advance @ITWhisperer
Hi Team, we have been asked to integrate CloudWAN application logs using an API key/token (pull), i.e. pull logs from CloudWAN using an API token. The issue is that the CloudWAN API key is only valid for one day, so it is not feasible to generate an API key manually every day to pull the logs from CloudWAN. We are therefore looking for an automated script or method to generate the API key and refresh the data into Splunk.
Hi All, I have the below logs in one event:

AMQ8450I: Display queue status details. QUEUE(ECS.AU.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(00.28.06) LPUTTIME(00.28.06)
AMQ8450I: Display queue status details. QUEUE(ECS.HK.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(01.32.35) LPUTTIME(01.32.35)
AMQ8450I: Display queue status details. QUEUE(ECS.ID.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(01.26.46) LPUTTIME(01.26.46)
AMQ8450I: Display queue status details. QUEUE(ECS.MY.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(01.38.02) LPUTTIME(01.38.02)
AMQ8450I: Display queue status details. QUEUE(ECS.PH.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(23.12.07) LPUTTIME(23.12.07)
AMQ8450I: Display queue status details. QUEUE(ECS.SG.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(01.39.26) LPUTTIME(01.39.26)
AMQ8450I: Display queue status details. QUEUE(ECS.TH.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(01.28.20) LPUTTIME(01.28.20)
AMQ8450I: Display queue status details. QUEUE(ECS.VN.TO_KAFKA_RES.LISTEN) TYPE(QUEUE) CURDEPTH(0) LGETTIME(21.47.43) LPUTTIME(21.47.43)

I tried to create a table out of the data using the query:

    *** | rex field=_raw max_match=0 "QUEUE\((?P<Queue_Name>[^\)]+)\)"
    | rex field=_raw max_match=0 "CURDEPTH\((?P<Cur_Depth>[^\)]+)\)"
    | rex field=_raw max_match=0 "LGETTIME\((?P<Get_Time>[^\)]+)\)"
    | rex field=_raw max_match=0 "LPUTTIME\((?P<Put_Time>[^\)]+)\)"
    | table Queue_Name,Cur_Depth,Get_Time,Put_Time

But the table comes out as a single row with multivalue cells:

Queue_Name = ECS.AU.TO_KAFKA_RES.LISTEN, ECS.HK.TO_KAFKA_RES.LISTEN, ECS.ID.TO_KAFKA_RES.LISTEN, ECS.MY.TO_KAFKA_RES.LISTEN, ECS.PH.TO_KAFKA_RES.LISTEN, ECS.SG.TO_KAFKA_RES.LISTEN, ECS.TH.TO_KAFKA_RES.LISTEN, ECS.VN.TO_KAFKA_RES.LISTEN
Cur_Depth = 0, 0, 0, 0, 0, 0, 0, 0
Get_Time = 00.28.06, 01.32.35, 01.26.46, 01.38.02, 23.12.07, 01.39.26, 01.28.20, 21.47.43
Put_Time = 00.28.06, 01.32.35, 01.26.46, 01.38.02, 23.12.07, 01.39.26, 01.28.20, 21.47.43

The problem here is that all the data ends up in one row. I tried the mvexpand command to split it into individual rows but failed to do so. Please help me modify the query so that the output table has one row per queue. Thanks all!
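One possible approach (untested sketch, reusing the rex extractions from the question; the "|" delimiter is an arbitrary choice) is to zip the multivalue fields together, expand the zipped rows, and split each row back into separate fields:

    | rex field=_raw max_match=0 "QUEUE\((?P<Queue_Name>[^\)]+)\)"
    | rex field=_raw max_match=0 "CURDEPTH\((?P<Cur_Depth>[^\)]+)\)"
    | rex field=_raw max_match=0 "LGETTIME\((?P<Get_Time>[^\)]+)\)"
    | rex field=_raw max_match=0 "LPUTTIME\((?P<Put_Time>[^\)]+)\)"
    | eval row=mvzip(mvzip(mvzip(Queue_Name, Cur_Depth, "|"), Get_Time, "|"), Put_Time, "|")
    | mvexpand row
    | eval Queue_Name=mvindex(split(row, "|"), 0), Cur_Depth=mvindex(split(row, "|"), 1), Get_Time=mvindex(split(row, "|"), 2), Put_Time=mvindex(split(row, "|"), 3)
    | table Queue_Name Cur_Depth Get_Time Put_Time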
I am trying to parse data from three tables. One table has MAC_ADDR and HOST_NAME, the second has MAC_ADDR, IP_ADDR, NEIGHBOR_ADDR and PORT, and the third has IF_MAC and DEVICE_NAME. The field names are as above. I join the first two tables the following way:

    search router_table
    | join mac_addr [ search dhcp_table ]
    | table mac_addr host_name neighbor_mac ip_addr port

Now I want to search the third table (fields IF_MAC and DEVICE_NAME) where if_mac=neighbor_mac and append device_name. I tried appendcols, but I can't pass neighbor_mac as an argument to the third subsearch. Can anyone help me figure out a way to add the result of the third search?
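One possible approach (untested sketch; device_table stands in for whatever the actual third search is) is to rename if_mac inside the subsearch so a second join can match it against neighbor_mac:

    search router_table
    | join mac_addr [ search dhcp_table ]
    | join type=left neighbor_mac [ search device_table | rename if_mac AS neighbor_mac | fields neighbor_mac device_name ] ``` device_table is a placeholder ```
    | table mac_addr host_name neighbor_mac ip_addr port device_name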
Hi All, we are using the Splunk Add-on for VMware to monitor a vCenter device. It is installed on a virtual appliance. There was no issue until a reboot of the appliance, but since the reboot yesterday we have not been receiving data in the VMware indexes (VMware-inv, VMware-taskevent and VMware-perf). We can see that all the vCenter devices are connected in the Splunk add-on for vCenter, but we are still not receiving any data. I can see that the required ports are open. In the _internal logs I see the error below:

ERROR Application Updater - Error checking for update, URL=https://apps.splunk.com/api/apps:resolve/checkforupgrade: Connect Timeout

Could anyone please provide any input on why no data is being collected? Is the above error related to the VMware app?

Regards,
Manjunath R
Hi, I have a lookup containing some admin users and I need to add some text like "ADS_" before the username to distinguish them from normal users. I tried:

    index=myindex tag=authentication
    | lookup Ads.csv Utenza AS username OUTPUT Gruppo
    | fillnull value=NULL
    | eval username=if(Gruppo="NULL",username, ADS_.username)
    | eval action=if(match(details_message,"opened a Web Portal"),"success",action)
    | search action=success dest_host!="- -"
    | stats count by username
    | sort count desc

What am I missing? Thanks in advance!
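A likely culprit, offered as a hedged observation rather than a confirmed fix: in the eval, the ADS_ prefix needs quotes, otherwise it is treated as a field name rather than a string literal. A minimal sketch of just that line:

    | eval username=if(Gruppo="NULL", username, "ADS_".username)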
Is it possible to enable TLS for only one input? For example:
Checkpoint --> TLS --> SC4S --> Splunk
Cisco ASA --> UDP 514 --> SC4S --> Splunk
So far, I can only find information about enabling TLS for everything; I'm just wondering if I can set it per source. Thanks!
Hi, I have multiple timecharts with similar search queries sharing the same index; the only difference is that they are built from different metric names, i.e.

    | mstats max(my_Var) AS my_Var where index=* AND "internal_name"="A1" ...
    | timechart span=1w sum(Var) AS output

    | mstats max(my_Var) AS my_Var where index=* AND "internal_name"="A2" ...
    | timechart span=1w sum(Var) AS output

... and so on. I would like to have a panel where the various "output" series are summed into a combined timechart. I have seen some similar solutions involving tokens, but I am unfamiliar with how they work, so I hope that someone can walk me through what to do; any other solutions would be great too. Thanks!
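A possible alternative to tokens (untested sketch; it assumes the series differ only by the internal_name dimension and reuses the metric and dimension names from the question):

    | mstats max(my_Var) AS my_Var where index=* AND ("internal_name"="A1" OR "internal_name"="A2") span=1w BY internal_name
    | timechart span=1w sum(my_Var) AS combined_output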
We're using Splunk Cloud 8.2.2202.1 and have a data upload issue. I can upload a CSV using the Add Data button in the Settings menu into index=sandbox, but I can't upload to my newly created index=xxx: I get a "supplied index 'xxx' missing" error. After the data is loaded into sandbox (showing I have accounted for adding a timestamp) I can run index=sandbox | collect index=xxx sourcetype=hec testmode=false to put the data into xxx, which seems to me to prove that index=xxx exists despite the contents of the error message.
1. Can anyone suggest what I might be doing wrong in getting my CSV data into the correct index without taking a detour through index=sandbox?
2. I suspect there might be more log information about the failure somewhere, but I've looked in index=_internal and have not seen anything relevant. Is there somewhere else I can look?
3. We have been maddened by a string of silent failures in our use of Splunk in the past weeks, and I wonder if there isn't a logging verbosity control that we could use to make it clearer what is happening (or not happening) with our Splunk operations?
Kind regards, Sean
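A hedged check (assuming your role is permitted to run the rest command on Splunk Cloud) to see which servers actually report the new index:

    | rest /services/data/indexes
    | search title="xxx"
    | table splunk_server title disabled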
I am stuck on an integration.
Scenario: we have a PAS server that does the VA scans of the whole environment, and now we need to integrate it with Splunk.
Problem statement: the PAS server runs queries on the data and stores the results in a virtual directory on itself. The client has installed the Windows IIS web server and hosted this directory on an HTTPS URL, so the reports are exposed as e.g. url/report1, url/report2, etc. For security, the client has created a service account and password to log in to the website and access the reports.
So we have three things: the HTTPS URL, a username and a password. Which integration method should I go with?
Note: the client said no to a UF installation on the Windows server, and I checked Splunkbase and found no add-on.
Correct me on the possible solutions:
1) Use the built-in REST API method, but instead of a username and password should I ask for an API key?
2) If they can't provide an API key, go to Add-on Builder and create an add-on with username and password authentication? Has anybody worked on this? Is there any documentation available? I am not sure of this method.
3) Use a curl command with the credentials to download the reports to another server, and then use a UF on it to send them to Splunk. I am not sure of this method due to security concerns.
Help me out: all I have is the URL, username and password.
Hi, I want to store the earliest and latest times of my search in variables to use them in further operations, but I am unable to do so. I am trying the following:

    | makeresults
    | eval jobEarliestTime = $job.earliestTime$
    | eval jobLatestTime = $job.latestTime$

Could anyone help me with this? Thank you.
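A possible approach inside SPL itself (untested sketch): the addinfo command adds the search time range as the fields info_min_time and info_max_time, which can then be copied into the desired field names:

    | makeresults
    | addinfo
    | eval jobEarliestTime = info_min_time, jobLatestTime = info_max_time
    | table jobEarliestTime jobLatestTime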
Hi guys, I already have the query below, which gives me a table similar to the one at the bottom. I was wondering if there is a way to get it to display results only when the count of IP addresses is exactly 2, i.e. show a row when there are two IP addresses, otherwise don't show it. So the third entry should not show, but the first two should. Please let me know if you have any ideas. Thanks in advance for your help.

    index=EventLog source=security EventCode=4771
    | stats count values(source) AS IP_Address BY Account_Name EventID Message
    | where count > 20

Account_Name  EventID  Message                             Count  IP_Address
SmithA        4771     Kerberos pre-authentication failed  5000   1.1.1.1 2.2.2.2
JohnsonX      4771     Kerberos pre-authentication failed  6000   3.3.3.3 4.4.4.4
washingtonZ   4771     Kerberos pre-authentication failed  7000   5.5.5.5
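A possible tweak (untested sketch, extending the query from the question): also compute the distinct count of source values and filter on it:

    index=EventLog source=security EventCode=4771
    | stats count values(source) AS IP_Address dc(source) AS ip_count BY Account_Name EventID Message
    | where count > 20 AND ip_count=2
    | fields - ip_count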
We have a case where

    index=network_index host=xx.xx.xx.xx
    | eval lag_sec = (_indextime - _time)
    | stats count by lag_sec

shows that _time is current but _indextime is 37 minutes earlier. What can it be?
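An _indextime that is earlier than _time usually points to clock skew or a timestamp/time-zone parsing issue on the source, since it means events carry a timestamp in the indexer's future. A hedged diagnostic sketch to see whether the skew is constant per host and sourcetype:

    index=network_index host=xx.xx.xx.xx
    | eval lag_sec = _indextime - _time
    | stats min(lag_sec) AS min_lag max(lag_sec) AS max_lag avg(lag_sec) AS avg_lag count BY host sourcetype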
Good afternoon! I have a search (code example #1) that looks at the EventData_Xml field for programs being installed. I'm creating a report to show what, where, and when. I'm trying to cut out the unneeded data and show just the program name, such as Microsoft Edge, in the "Program Installed" column; the raw field looks like code example #2 below. Thank you in advance for any assistance, I appreciate it.

Code example #1:

    index=wineventlog EventData_Xml="*" AND EventID=11707
    | table host _time EventData_Xml
    | rename host as "Host", _time as "Time", EventData_Xml as "Program Installed"
    | convert ctime(Time)

Code example #2:

    <Data>Product: Microsoft Edge -- Installation completed successfully.</Data><Data>(NULL)</Data><Data>(NULL)</Data><Data>(NULL)</Data><Data>(NULL)</Data><Data>(NULL)</Data><Data></Data><Binary>7B34443639394544332D333539302D334635352D424638302D3732374546444242313032467D</Binary>
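A possible approach (untested sketch, based on the sample XML in the question; the regex assumes the "Product: ... -- Installation completed successfully" wording is consistent across EventID 11707 events):

    index=wineventlog EventID=11707 EventData_Xml="*"
    | rex field=EventData_Xml "Product: (?<Program_Installed>.+?) -- Installation completed successfully"
    | table host _time Program_Installed
    | rename host AS "Host", _time AS "Time", Program_Installed AS "Program Installed"
    | convert ctime(Time)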