All Topics

Hello there. I'm trying Dashboard Studio for the first time and it's awesome, but I can't figure out why the drilldown option only shows what's in the screenshot. Any idea? Thanks in advance.
Hello Community. Can you please tell me how to fix this? I don't understand why it is happening, and I have explored various topics without finding a solution. I have an application that is monitored by Splunk_TA_nix on remote servers, but not all servers are producing the CPU=all field. I first encountered this when a team contacted me about their dashboard. They had two standalone servers: on one of them the CPU field was extracted and the dashboard worked; on the other it did not. I set up a new server to forward the logs, but there was no CPU field on that one either, even after I installed the sysstat utility. I still can't figure it out, so I am asking for help. Regards to everyone.
Hi All, I am using two searches combined via an append to get data in the following format. Each row is a distinct event in the raw data.

_time                Status        owner       rule_ID
2022-08-03 23:00:00  <null>        unassigned  001
2022-08-03 23:35:00  Acknowledged  John        001
2022-08-03 23:40:00  Resolved      John        001

I need to calculate the time difference between each event, i.e. each row above. How can I add another column called "Difference" that shows the delta between these three events?

Desired output:

_time                Status        owner       rule_ID  Difference
2022-08-03 23:00:00  <null>        unassigned  001      0
2022-08-03 23:35:00  Acknowledged  John        001      0:35:00
2022-08-03 23:40:00  Resolved      John        001      0:05:00

Note: rule_ID is the only field common to all three events. I referred to other posts here where folks recommended the transaction command; unfortunately I don't have any specific field to use in startswith or endswith, so transaction won't work. Thank you in advance.
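A common append-free pattern for per-row deltas is streamstats, which carries the previous event's _time forward within each rule_ID. A sketch against the rows above (the base search placeholder stands for whatever produces them):

```
... your combined search ...
| sort 0 rule_ID, _time
| streamstats current=f window=1 last(_time) as prev_time by rule_ID
| eval Difference = if(isnull(prev_time), "0", tostring(_time - prev_time, "duration"))
| table _time, Status, owner, rule_ID, Difference
```

tostring(x, "duration") renders a difference in seconds as HH:MM:SS, which matches the 0:35:00 style in the desired output.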
I have this query in Splunk which gets me the src_ip along with different fields for a particular UserId, but I want to exclude logs whose src_ip starts with either 10 or 172. Could someone please help?

index=wineventlog $UserId sourcetype="WinEvt:ADFS" EventCode=120*
| rex "IpAddress\W(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| rex "Activity\sID\W(?<Activity_ID>\s.*)"
| table src_ip, Activity_ID, _time, UserAgent
| sort _time
| reverse
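A minimal way to drop those addresses is a regex filter right after the extraction. This sketch takes "starting with 10 or 172" literally as the first octet (not the full RFC 1918 172.16–172.31 range), and otherwise reuses the query above unchanged:

```
index=wineventlog $UserId sourcetype="WinEvt:ADFS" EventCode=120*
| rex "IpAddress\W(?<src_ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| regex src_ip!="^(10|172)\."
| rex "Activity\sID\W(?<Activity_ID>\s.*)"
| table src_ip, Activity_ID, _time, UserAgent
| sort _time
| reverse
```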
Hi, can you please help me write a query that calculates the time difference between two related logs? I want to calculate the difference for many such log pairs at once and then view the differences in tabular format.
Hello, I am trying to add annotations to a line chart where the x-axis is a simple Id (1, 2, 3, ...), a field named "RunId". On mouse hover, each event label should then display the content of the field "Info", which is extracted in the annotation search. Both the primary and annotation data sources include the field "RunId", but no event is displayed. Why?

"viz_jcq0L1f3": {
    "type": "viz.column",
    "dataSources": {
        "primary": "ds_bNZHw3eE",
        "annotation": "ds_annotation"
    },
    "encoding": {
        "annotationX": "annotation.RunId"
    },
    ...
},
...
"ds_annotation": {
    "type": "ds.search",
    "options": {
        "query": "index=someTestIndex source=*runids.txt | rex field=_raw \" (?<RunId>\\d+)\" | rex field=_raw \"info=\\\"(?<Info>[^\\\"]+)\" | dedup RunId | table RunId, Info"
    },
    "name": "Annotation Search"
},
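One hedged guess worth checking: Splunk's Dashboard Studio documents chart annotations against time-based x-axes, so an annotationX bound to a plain numeric field like RunId may simply never render. If the runs can be charted over _time instead, the documented encoding shape (with annotationLabel supplying the hover text) would look roughly like this — the Info field comes from your own annotation search:

```
"encoding": {
    "annotationX": "annotation._time",
    "annotationLabel": "annotation.Info"
}
```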
Hi All,

We have turned on the use case ESCU O365 Authentication Failures Alert. We need it turned on in order to assess risky logins; however, due to the nature of the company (a university) we have a lot of old, unused alumni accounts that we cannot get rid of. The issue we are running into is that the risk scores of some of these accounts are consistently rising, causing a steady volume of Highs.

We have found that the accounts are trying to log in via the user agent BAV2ROPC, which is the Azure user agent for legacy authentication (IMAP, POP3, etc.). We have tried adding these users to a conditional access policy in Azure to prevent these authentication attempts, but that has not worked.

My question is: is there any way in the use case search to specifically filter out BAV2ROPC so we do not constantly get these alerts? They are causing a lot of noise in our search, and it is making it difficult to find actual attempts to access user accounts.

This is our correlation search for this use case:

index=appext_o365 `o365_management_activity` Workload=AzureActiveDirectory UserAuthenticationMethod=* status=failure
| stats count earliest(_time) AS firstTime latest(_time) AS lastTime values(UserAuthenticationMethod) AS UserAuthenticationMethod values(UserAgent) AS UserAgent values(status) AS status values(src_ip) AS src_ip by user
| where count > 10
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `o365_excessive_authentication_failures_alert_filter`
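One lightweight way to cut that noise, sketched against the correlation search above, is to exclude the legacy-auth user agent in the base search. A more maintainable variant of the same idea is to put the condition inside the `o365_excessive_authentication_failures_alert_filter` macro, which exists precisely so the shipped search can stay unmodified:

```
index=appext_o365 `o365_management_activity` Workload=AzureActiveDirectory UserAuthenticationMethod=* status=failure UserAgent!="BAV2ROPC"
| stats count earliest(_time) AS firstTime latest(_time) AS lastTime values(UserAuthenticationMethod) AS UserAuthenticationMethod values(UserAgent) AS UserAgent values(status) AS status values(src_ip) AS src_ip by user
| where count > 10
```

Note that excluding the user agent entirely also hides genuine password-spray attempts that use BAV2ROPC, so it may be worth routing those events to a separate, lower-severity search instead.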
I have a field user-agent like this:

user-agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36\r\nHost: domain.com\r\nConnection: Keep-Alive\r\n"

What would the SPL query be if I just want to get "domain.com"? Thanks.
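A sketch for this, assuming the \r\n sequences are stored either as literal backslash-r text (as the sample suggests) or as real control characters — the character class below stops the capture at either:

```
| rex field=user-agent "Host:\s(?<host>[^\r\\\\]+)"
```

This captures everything after "Host: " up to the next carriage return or backslash, i.e. "domain.com" for the sample value.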
1st Query:

StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" HasError__c=false Transaction_Log__c="*"
| eval message = "200andNo matching records were found"
| where like(_raw, "%".message."%")
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "400andDealer Code provided is invalid" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "400andDealer Type provided is invalid" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "400andNo Dealer Code was provided" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "400andNo Dealer Type was provided" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "400andInvalid input data" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*" | eval message = "500andCannot deserialize request body" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:exception_log__c" ErrorCode__c=500 Interface_Name__c=StoreManagementAPI | eval message = "Unexpected character" | where like(_raw, "%".message."%")]
| append [search StoreManagementAPI index=b2cforce sourcetype="sfdc:exception_log__c" ErrorCode__c=500 Interface_Name__c=StoreManagementAPI | where Error_Description__c != "Unexpected character ('}' (code 125)): was expecting double-quote to start field name at [line:4, column:6]" | table _time, Error_Description__c | rename Error_Description__c as message]
| timechart span=30m count by message
| eval threshold = 25

2nd query:

StoreManagementAPI index=b2cforce sourcetype="*" "attributes.type"="*"
| stats count(sourcetype) as total_events
| where total_events > 480

3rd query:

StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*"
| eval _raw = Transaction_Log__c
| rex max_match=0 "timestamp[[:punct:]]+(?<timestamp>[^\\\"]+)"
| eval first_timestamp=mvindex(timestamp, 0), last_timestamp=mvindex(timestamp, -1)
| eval first_ts = strptime(first_timestamp, "%Y-%m-%dT%H:%M:%S.%3N%Z"), last_ts = strptime(last_timestamp, "%Y-%m-%dT%H:%M:%S.%3N%Z")
| eval diff = last_ts - first_ts
| stats avg(diff) as average
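The seven transaction_log__c sub-searches in the first query all scan the same sourcetype, so as a sketch (same index and marker strings as above, assuming each event contains at most one marker) they can usually be collapsed into one pass with case(), which avoids append's subsearch result limits; the two exception_log__c branches would still need their own append:

```
StoreManagementAPI index=b2cforce sourcetype="sfdc:transaction_log__c" Transaction_Log__c="*"
| eval message = case(
    like(_raw, "%200andNo matching records were found%"), "200andNo matching records were found",
    like(_raw, "%400andDealer Code provided is invalid%"), "400andDealer Code provided is invalid",
    like(_raw, "%400andDealer Type provided is invalid%"), "400andDealer Type provided is invalid",
    like(_raw, "%400andNo Dealer Code was provided%"), "400andNo Dealer Code was provided",
    like(_raw, "%400andNo Dealer Type was provided%"), "400andNo Dealer Type was provided",
    like(_raw, "%400andInvalid input data%"), "400andInvalid input data",
    like(_raw, "%500andCannot deserialize request body%"), "500andCannot deserialize request body")
| where isnotnull(message)
| timechart span=30m count by message
```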
Hi, what would be the best way to check that, after a user has been added to a group, they have not been removed from the same group within, say, 24 hours? I currently have a search against the wineventlog index that produces a table of group additions and group removals. What is the best way to find events where there has been an addition but no removal for the same group and user within 24 hours? I started to look at | transaction, but I don't think it is the right fit, since I am interested in the case where a removal has not happened after a time period. Failing this, if anyone has an alternate way to alert when a user has been added to a group and not removed within a time period, that would be much appreciated. Thanks.
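One transaction-free pattern for "added but never removed" is a single stats pass that pairs each addition with the latest removal for the same user and group. The sketch below assumes Windows security event codes 4728 (member added to a global group) and 4729 (member removed), and field names user and group — adjust both to match your actual extractions:

```
index=wineventlog (EventCode=4728 OR EventCode=4729) earliest=-25h
| eval action = if(EventCode=4728, "added", "removed")
| stats min(eval(if(action="added", _time, null()))) as add_time
        max(eval(if(action="removed", _time, null()))) as remove_time
        by user, group
| where isnotnull(add_time) AND (isnull(remove_time) OR remove_time - add_time > 86400)
```

Running this on a schedule over a window slightly longer than 24 hours lets it alert on additions whose removal never arrived in time.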
Hello, I have raw data that goes like this:

...
in[ 60: ]<3034>
in[ 62: ]<10>
in[ 62: ]<EC_CARDVER>
...

I want to extract EC_CARDVER into a field named msg. My rex is

| rex field=_raw "(in)\[ 62: \]\<(?P<msg>)\>"

but it doesn't seem to match. How do I write the rex so that it extracts only the EC_CARDVER and not the 10 above it?
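The capture group in the posted rex, (?P<msg>), is empty, so it can only ever match an empty string. Restricting the class to letters and underscores makes the numeric <10> entry fail to match, letting rex scan on to EC_CARDVER — a sketch assuming the values you want always have that letters-and-underscores shape:

```
| rex field=_raw "in\[ 62: \]<(?P<msg>[A-Za-z_]+)>"
```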
I have created a query to detect too much blocked traffic to a single destination. Somehow it doesn't work; please help me resolve this.

bin _time span = 5m as timespan
| eval start time = strptime(connection_start_time, "%Y-%m-%d %H:%M:%S")
| stats dc(D_IP) as num values(start_time) by src_ip
| search num>3
| sort num desc

I want to display the src_ip, hostname, destination IP, and count.
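Several pieces of that query are syntactically off: span=5m must have no spaces around the equals sign, a field name cannot contain a space (start time), the values(...) aggregation needs an alias, and there is no base search before bin. A corrected sketch — the base search and the field names D_IP and hostname are assumptions from your description, to be swapped for your real ones:

```
index=firewall action=blocked
| bin _time span=5m
| eval start_time = strptime(connection_start_time, "%Y-%m-%d %H:%M:%S")
| stats dc(D_IP) as num, values(D_IP) as dest_ips, values(hostname) as hostname by _time, src_ip
| where num > 3
| sort - num
| table src_ip, hostname, dest_ips, num
```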
Hi all, I am new to Splunk. Right now I am trying to make a table out of a log which contains different fields like Level = INFO, etc. There is a field:

Log = {"objects":[object1, object2 ...], "info": "some strings", "id1": someInt, "id2": someInt}
Log = {"objects":[object1, object2 ...], "info": "some other strings", "id1": someOtherInt, "id2": someOtherInt}
Log = { "info": "some log strings"}
Log = "some string"

I have tried a few rex and spath expressions, but they don't seem to work well. I would like to extract the "objects" field filtered by "info": sometimes I need the objects from the first Log above and sometimes from the second (for different panels in a dashboard), and the way to tell them apart is the "info" value. I then need to display the objects in a chart under a column. Any help/hints are appreciated!
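If the Log field holds valid JSON in the events you care about, spath with input= can pull nested paths out of it; events where Log is a bare string simply produce no extraction and fall out at the search step. A sketch, using the literal info value as a stand-in for whichever variant a given panel needs:

```
| spath input=Log path=info
| search info="some strings"
| spath input=Log path=objects{} output=object
| stats count by object
```

The objects{} path yields a multivalue field, and stats counts each value separately, which gives one chart row per object.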
query:

index=xxx host=xx sourcetype=xxx source=xxx
| rex field=_raw "\MeasuStatus\:(?<Status>.*?)\|"
| where isnotnull(Status)
| eval Success=if(Status="0", "Done", null())
| eval Failed=if(Status!="0", "notDone", null())
| stats count(Success) as SuccessC count(Failed) as FailedC count(Status) as overall
| eval SuccessPerc=(SuccessC/overall)*100
| eval SuccessPercentage=round(SuccessPerc,2)
| table SuccessPercentage

The above query works, but it takes a long time to return results, so I want to modify it to run faster. Can anyone suggest how?
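Two sketch-level speedups, assuming the same data as above: add a literal term from the raw event (here "MeasuStatus") to the base search so the indexer discards non-matching events before any parsing, and fold the eval/count pairs into a single conditional stats. Functionally this should produce the same percentage:

```
index=xxx host=xx sourcetype=xxx source=xxx "MeasuStatus"
| rex field=_raw "MeasuStatus\:(?<Status>.*?)\|"
| where isnotnull(Status)
| stats count(eval(if(Status="0", 1, null()))) as SuccessC, count as overall
| eval SuccessPercentage = round((SuccessC / overall) * 100, 2)
| table SuccessPercentage
```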
One of our clients recently performed a vulnerability scan on Splunk Enterprise 8.2.7, and the Apache Spark and Apache Hive packages were flagged as vulnerable:

bin\jars\vendors\spark\3.0.1\lib\spark-core_2.12-3.0.1.jar
\bin\jars\thirdparty\hive_3_1\hive-exec-3.1.2.jar

I see that version 9.0 uses a patched version of Hive (3.1.3) and does not use Spark. Has anyone else found this?
I am trying to set up anomaly detection based on the number of ModSecurity warnings in the log, in real time, to indicate an attack. I set up an experiment in the MLTK with the query below and configured the alerting. The app is used Mon-Fri, so we tried to account for the low traffic on Sat-Sun in the data model. Unfortunately, the alert is being triggered constantly. Can anyone assist with how to write the query or set up the experiment for my use case?

index="APP_NAME_production" source="<PATH TO LOG>/modsec_audit.log:*" "ModSecurity: Warning"
| bin _time span=15m
| stats sum(linecount) as total_lines by _time
| eval HourofDay=strftime(_time, "%H")
| eval DayofWeek=strftime(_time, "%A")
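One hedged direction with the MLTK, building on the search above: fit a DensityFunction grouped by hour and weekday, so the Sat-Sun lull gets its own distribution instead of tripping a single global threshold. The model name and threshold value here are placeholders to tune:

```
index="APP_NAME_production" source="<PATH TO LOG>/modsec_audit.log:*" "ModSecurity: Warning"
| bin _time span=15m
| stats sum(linecount) as total_lines by _time
| eval HourofDay=strftime(_time, "%H"), DayofWeek=strftime(_time, "%A")
| fit DensityFunction total_lines by "HourofDay,DayofWeek" into app:modsec_anomaly threshold=0.005
```

The scheduled alert search would then run the same base search followed by | apply app:modsec_anomaly and trigger on the IsOutlier(total_lines) output field.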
I just installed this app and found it simple to set up, but I must be doing something wrong. I've created trap information on my two UPS devices and haven't had any luck bringing them into Splunk. I enabled SNMP (all versions), then added the IPs for the traps to point to, including my Splunk Cloud DNS name, deployment IP, HF, and UF, and haven't seen anything come in. It also says the default sourcetype is snmp_ta.

Activation Key: using 14-day free trial
Log Level: INFO (I've also tried DEBUG)
SNMP Mode: Listen For Traps
SNMP Version: 2C

I left everything else blank, except in the SNMP trap listener settings I put the IP address of the UPS I'm trying to get the information from.
I saw that a new version of this add-on was released to support OAuth. The instructions for setting up the Client ID are truncated: "The Reporting Web Service should now appear in the list of applications that your app requires permissions for <blank". I added ReportingWebService.Read.All to the Client ID I already use for other O365 logs and configured the new TA, but I still get a 401 error. Are there additional permissions required?
So I have migrated to Splunk Cloud, but I still have a deployment server, UF, and HF. How do I find out what my IP is for Splunk Cloud? I'd like to be able to send trap information directly there, especially since the UPS keeps saying it failed to resolve the DNS name. Thank you.
Hi guys, I want to know how to configure multiple drilldowns on a table in Dashboard Studio. I need to put different links in rows and columns, but Dashboard Studio seems to offer only one way.