All Topics

Has anyone found a query or way to track what files have been moved onto or off of a USB drive? I can see that a USB device was plugged in, but I need to go one level deeper and see what has been moved onto or off of the USB drive, to and from the system.
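A sketch of a starting point, assuming Removable Storage object-access auditing is enabled on the endpoints and the Security event log lands in an index named wineventlog (the index, task-category text, and field names are assumptions that depend on the Windows TA in use). Windows only records these file operations, as event code 4663, when that audit subcategory is on:

index=wineventlog EventCode=4663 "Removable Storage"
| eval direction=case(match(Accesses, "WriteData"), "written to USB", match(Accesses, "ReadData"), "read from USB", true(), Accesses)
| table _time, host, user, Object_Name, direction
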
Hello, I'm having an issue getting a report of user Actions together with fullname and username (= email). Both sourcetypes have a username field, but one has username and fullname and no Action, and the other has the Action. Also, one sourcetype spells the field with an uppercase U and the other with a lowercase u. I am trying to get fullname, username, and Action. This is what I have tried:

index=hv_lastpass [search source=lastpass_users fullname="*,*" [search sourcetype="lastpass:activity" Action="Failed Login Attempt" | return fullname] | return Action, Time, Username]
| table fullname, Username, Action, Time

index=hv_lastpass
| join type=left username [search source=lastpass_users]
| join type=left Username [search sourcetype="lastpass:activity"]
| table fullname, Username, Action, Time

Thank you
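A sketch of a join-free alternative that normalizes the field name first (source, sourcetype, and field names are taken from the post; adjust to taste):

index=hv_lastpass (source=lastpass_users OR sourcetype="lastpass:activity")
| eval username_norm=lower(coalesce(Username, username))
| stats values(fullname) as fullname, values(Action) as Action, values(Time) as Time by username_norm
| table fullname, username_norm, Action, Time

Collapsing both event types onto the shared key with stats sidesteps both the case mismatch and the fact that each field lives in only one sourcetype.
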
OK, this is odd. The search index=myindex works and returns a field "Name", happily listing all values of Name as expected. However, any search on the Name field, e.g. index=myindex Name=Fred, returns the error:

Cannot expand lookup field 'Name' due to a reference cycle in the lookup configuration. Check search.log for details and update the lookup configuration to remove the reference cycle.

Unfortunately I have no idea what to search for in search.log. Splunk support has only pointed me to this discussion and told me to re-save a specific Cisco lookup: https://community.splunk.com/t5/Splunk-Cloud-Platform/Cannot-expand-lookup-field-due-to-a-reference-cycle-in-the/m-p/543455 and it isn't that, as we don't have that Cisco lookup table.
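One way to hunt for the cycle from the search bar (useful on Splunk Cloud, where there is no CLI access): enumerate the automatic lookups and look for a definition whose OUTPUT field feeds back into its own input. The REST endpoint exists; the exact result field names are an assumption and may vary by version:

| rest /servicesNS/-/-/data/props/lookups
| table title, eai:acl.app, stanza, transform, attribute, value

Any entry where 'Name' appears as both an input and an output of the same (or a mutually referencing) lookup is a candidate for the cycle.
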
I have a props.conf that is not parsing data as I expected. I can see in the raw log that the IIS log has the header information in it.

[sourcetype]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = w3c
LINE_BREAKER = ([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD = 32
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Web
description = W3C Extended log format produced by the Microsoft Internet Information Services (IIS) web server
detect_trailing_nulls = auto
disabled = false
pulldown_type = true

When I manually upload the log file and assign it to the sourcetype, the log parses out correctly with the same settings:

[sourcetypeA]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = w3c
LINE_BREAKER = ([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD = 32
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Web
description = W3C Extended log format produced by the Microsoft Internet Information Services (IIS) web server
detect_trailing_nulls = auto
disabled = false
pulldown_type = true

This props.conf is on the search head where I am searching the data. The IIS logs are being captured via a UF. Has anyone run into this before?
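A likely factor worth checking (this is the documented behavior of structured-data settings, though whether it is the cause here is an assumption): INDEXED_EXTRACTIONS is applied where the file is first read, i.e. on the universal forwarder, not on the search head or indexer. Manual upload works because the upload path parses the file on the instance that has the stanza. A minimal sketch of what needs to live on the UF itself:

# $SPLUNK_HOME/etc/apps/<your_app>/local/props.conf on the UF
[sourcetype]
INDEXED_EXTRACTIONS = w3c
# the rest of the stanza can be deployed alongside; restart the UF afterwards
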
Hello folks, how can I perform a CIDR/subnet match with the "ip_intel" lookup file that comes by default? This KV store lookup dataset has CIDR ranges and single IPs listed under the "IP" column. Basically, if the Dest_IP from my search results falls within a subnet range in the "IP" column of the lookup file, then it should display the result in a table. I am able to match against a single IP address but not against a CIDR range. How do you go about this? Thanks in advance.
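A sketch of the usual mechanism: CIDR matching is declared on the lookup definition in transforms.conf, not in the search itself. The stanza below follows the column name from the post; the real ip_intel definition shipped with ES may use different field and collection names, so treat these as assumptions:

# transforms.conf
[ip_intel]
external_type = kvstore
collection = ip_intel
fields_list = _key, IP, description
match_type = CIDR(IP)

Then at search time:

index=<your_index>
| lookup ip_intel IP as Dest_IP OUTPUT description
| where isnotnull(description)
| table _time, Dest_IP, description
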
Hi everyone, I am new to Splunk. Could someone help me and provide the search for the query below? That would be great! I have a search which compares last week against the current week and shows the count of each status per week:

Date      Active  Inactive  Deleted  Added
09/10/21  50      20        5        20

I send the above result to a lookup file every week to build the summary (| outputlookup append=true summary.csv). The lookup file looks like this after 3 weeks:

Date        Active  Inactive  Deleted  Added
09/10/2021  50      20        5        20
16/10/2021  55      15        10       30
23/10/2021  60      10        8        15

The Date column keeps growing dynamically each week, and I always need to calculate the difference between the last two columns and create a new column showing that difference. The result I am expecting is below.

Status    09/10/2021  16/10/2021  23/10/2021  Difference b/w last 2 columns
Active    50          55          60          5
Inactive  20          15          10          -5
Deleted   5           10          8           -2
Added     20          30          15          -15
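A sketch of one way to pivot the lookup and compute the last-versus-previous difference (assumes the lookup is summary.csv with the columns shown and dates in DD/MM/YYYY):

| inputlookup summary.csv
| eval epoch = strptime(Date, "%d/%m/%Y")
| sort epoch
| fields - epoch
| untable Date Status value
| stats list(Date) as Date, list(value) as value by Status
| eval "Difference b/w last 2 columns" = mvindex(value, -1) - mvindex(value, -2)

This keeps the per-date values as a multivalue column rather than one column per date; if separate date columns are required, a transpose or xyseries step can be layered on after the difference is computed.
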
Hello Splunkers, I need help with Network Security Group flow logs: each of the flowTuples should become a single event, carrying the other relevant data along with it.

Sample.log _raw:

{"time":"2021-10-25T16:17:50.8670851Z","systemId":"1c5751f4-8686-4ea5-82ee-173b64d401dd","macAddress":"xxxxxxxxxx","category":"NetworkSecurityGroupFlowEvent","resourceId":"/SUBSCRIPTIONS/A80612A2-33D6-47FF-817A-283E8BC8EDD2/RESOURCEGROUPS/C-SAP-EUS-NONPROD-01-INT-NETWORKING-RG/PROVIDERS/MICROSOFT.NETWORK/NETWORKSECURITYGROUPS/DATA-INT-SUBNET-NSG","operationName":"NetworkSecurityGroupFlowEvents","properties":{"Version":2,"flows":[{"rule":"DefaultRule_AllowVnetOutBound","flows":[{"mac":"000D3A57248C","flowTuples":["1635178607,,10.123.2.28,46058,9997,T,O,A,E,1,74,1,60","1635178607,10.115.34.31,10.123.2.18,29128,9997,T,O,A,E,19,7292,16,1227","1635178609,10.115.34.31,10.119.241.5,26540,9997,T,O,A,E,47,54806,64,4395","1635178612,10.115.34.31,13.69.239.72,56024,443,T,O,A,B,,,,","1635178613,10.115.34.31,13.69.239.72,56026,443,T,O,A,B,,,,","1635178614,10.115.34.31,10.192.124.221,56488,80,T,O,A,B,,,,","1635178618,10.115.34.31,13.69.239.72,56024,443,T,O,A,E,8,1158,8,4897"]}]},{"rule":"UserRule_AzAppSubnet_access_toAzDBSubnet_Catch-all","flows":[{"mac":"000D3A57248C","flowTuples":["1635178635,10.115.32.28,10.115.34.31,54322,33015,T,I,A,B,,,,"]}]}]}}

JSON structure:

category: NetworkSecurityGroupFlowEvent
macAddress: xxxxxxxxxx
operationName: NetworkSecurityGroupFlowEvents
properties:
  Version: 2
  flows:
    flows:
      flowTuples:
        1635172376,ip1,ip2,58636,443,T,O,A,E,6,1611,1,66
        1635172377,ip1,ip2,27910,443,T,O,A,B,,,,
        1635172377,ip1,ip2,59136,443,T,O,A,E,0,0,0,0
        1635172378,ip1,ip2,56756,9997,T,O,A,B,,,,
        1635172378,ip1,ip2,58686,9997,T,O,A,B,,,,
        1635172379,ip1,ip2,53684,9997,T,O,A,B,,,,

Desired result:

Event 1:
category: NetworkSecurityGroupFlowEvent
macAddress: xxxxxxxxxx
operationName: NetworkSecurityGroupFlowEvents
properties:
  Version: 2
  flows:
    flows:
      flowTuples:
        1635172376,ip1,ip2,58636,443,T,O,A,E,6,1611,1,66

Event 2:
category: NetworkSecurityGroupFlowEvent
macAddress: xxxxxxxxxx
operationName: NetworkSecurityGroupFlowEvents
properties:
  Version: 2
  flows:
    flows:
      flowTuples:
        1635172377,ip1,ip2,27910,443,T,O,A,B,,,,

Thanks
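If splitting at search time is acceptable (as opposed to re-breaking events at index time), a sketch using spath and mvexpand on the structure above; the sourcetype and extracted field names are placeholders:

sourcetype=<your_nsg_sourcetype>
| spath path=properties.flows{}.flows{}.flowTuples{} output=flowTuple
| mvexpand flowTuple
| rex field=flowTuple "^(?<ts>\d+),(?<src_ip>[^,]*),(?<dest_ip>[^,]*),(?<src_port>[^,]*),(?<dest_port>[^,]*)"
| table _time, macAddress, src_ip, dest_ip, dest_port, flowTuple

Note this flattens the per-rule grouping, since the spath path collects all tuples into one multivalue field before mvexpand splits them.
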
I was told that the first two Splunk Fundamentals courses were free for veterans, but it looks like those courses are deprecated as of today. Is there still any free training for veterans, or has that ship sailed?
Hello everyone, I have the following inputs.conf, which is working for the first two stanzas but not for the third one. Could someone please tell me why? I do not get any events from it.

[WinEventLog://Security]
disabled = 0
renderXml = 1
start_from = oldest
current_only = 0
evt_resolve_ad_obj = 1
checkpointInterval = 5
sourcetype = XmlWinEventLog
index = ad
whitelist1 = 4624,4769,4728,4732,4756,4761,4751,4746

# This stanza will send all events for the event_code 21
[WinEventLog://Microsoft-Windows-TerminalServices-LocalSessionManager/Operational]
disabled = 0
renderXml = 1
sourcetype = XmlWinEventLog
index = ad
source = "XmlWinEventLog:Microsoft-Windows-TerminalServices-LocalSessionManager/Operational"
whitelist2 = 21

# This stanza will send all events for the event_code 1024
[WinEventLog://Microsoft-Windows-TerminalServices-RemoteConnectionManager/Operational]
disabled = 0
renderXml = 1
sourcetype = XmlWinEventLog
index = ad
source = "XmlWinEventLog:Microsoft-Windows-TerminalServices-RemoteConnectionManager/Operational"
whitelist3 = 1024

Thank you very much for helping me!
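One detail that stands out (an inputs.conf spec point, though whether it explains the silent third stanza is an assumption, especially since the second stanza reportedly works): the numbered settings whitelist1 through whitelist9 expect the key=regex format, while the plain event-ID list form (21, 1024, ...) belongs on the unnumbered whitelist setting; numbering is also per stanza, so there is no need to keep incrementing across stanzas. A sketch of the third stanza with that changed:

[WinEventLog://Microsoft-Windows-TerminalServices-RemoteConnectionManager/Operational]
disabled = 0
renderXml = 1
sourcetype = XmlWinEventLog
index = ad
# plain event-ID lists belong on the unnumbered setting:
whitelist = 1024
# or equivalently, in key=regex form:
# whitelist1 = EventCode=%^1024$%
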
I've deployed an architecture with a centralized S3 bucket that forwards AWS logs to an SQS queue. On the Splunk side, I have Enterprise, have already installed the Splunk Add-on for AWS, and set the input as Custom > SQS, with the configuration as follows:
- account number, access keys
- IAM role with assume-role permissions
I still can't get logs into Splunk; any guidance for troubleshooting? Also, is it possible to share a reference example of SQS access policies?

Thanks
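A first troubleshooting step that usually narrows things down: check the add-on's own logs in _internal. The source pattern below matches the log files the AWS add-on writes, though exact file names vary by add-on version, so treat the pattern as an assumption:

index=_internal source=*splunk_ta_aws* (ERROR OR WARN)
| stats count by source
| sort - count

IAM permission problems (e.g. AccessDenied on sqs:ReceiveMessage, sqs:DeleteMessage, or s3:GetObject) typically appear here verbatim, which also points at the policy question: the queue policy must allow those SQS actions for the role the input assumes.
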
Hello all, I am a newbie to ES and need some help on a basic use case. We are ingesting our firewall logs into Splunk. How can I set up a search to check for connection attempts (as in dest_ip) going to malicious/C2 IP addresses? Roughly:

index=cisco | eval connection = if(dest_ip = <from threat intel list>), generate an alert or show data in table format

We don't want to rely on manually creating a lookup file and having to keep updating it by hand.
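In ES, downloaded threat feeds land in KV store collections that update automatically, so no hand-maintained lookup is needed. A sketch against the IP collection (assuming the standard ip_intel collection with a lowercase ip field; verify the names in your version):

index=cisco
| lookup ip_intel ip as dest_ip OUTPUT threat_key, description
| where isnotnull(threat_key)
| table _time, src_ip, dest_ip, threat_key, description

Also worth noting: the built-in "Threat Activity Detected" correlation search does essentially this across all threat collections, so enabling it may remove the need for a custom search.
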
I'm trying to use the map command, and it seems to fail when I use some functions within the subsearch (specifically cidrmatch()). This search returns a correctly populated table of all the fields except for the "matches" field, which is just empty:

index=my_index earliest=-5m
| table _time src_ip
| map search="| search index=my_other_index earliest=-6h
  | rename id as id2
  | dedup id2
  | eval searchip=$src_ip$
  | eval matches=if(cidrmatch(cidr_block, searchip), "true", "false")
  | table id2 searchip matches cidr_block"

Note: my goal is to join two searches, but not based on a common field; rather, by cidr-matching IPs from one search against the CIDR blocks in the other. I don't want to use lookup tables, as I want both sides to stay dynamic.
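A likely culprit (a known map quirk, though not a certain diagnosis here): the unescaped inner double quotes around "true" and "false" terminate map's search="..." string early, so everything after them is lost. A sketch with the inner quotes escaped and the token quoted:

index=my_index earliest=-5m
| table _time src_ip
| map search="search index=my_other_index earliest=-6h
  | rename id as id2 | dedup id2
  | eval searchip=\"$src_ip$\"
  | eval matches=if(cidrmatch(cidr_block, searchip), \"true\", \"false\")
  | table id2 searchip matches cidr_block"

Keep in mind map runs one full search per input row, which gets expensive fast; appending both datasets into a single search and evaluating cidrmatch across them is often the cheaper design.
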
Hello, I am new to Splunk and I am looking for a way to write a rule to detect SMB traffic.   Thanks
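A minimal sketch, assuming network traffic data (firewall, Zeek, etc.) with a dest_port field; SMB runs over TCP 445, with 139 for legacy NetBIOS sessions (index and field names are placeholders):

index=<your_network_index> (dest_port=445 OR dest_port=139)
| stats count by src_ip, dest_ip, dest_port

If the data is CIM-mapped, filtering the Network_Traffic data model on the same ports is the more portable version, and the result can be saved as an alert to act as the "rule".
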
I was wondering... can I get (probably from _internal) which reports and dashboards were executed by users? I suppose saved searches spawned by the scheduler can be found either in configuration (by checking their schedule) or by tracking scheduler logs. But ad-hoc ones? The use case is: users have created many different dashboards and reports, and we want to clean up the ones not used anymore. But first, of course, we need to find them.
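Two sketches covering the usual sources (the index names and sourcetypes are standard; the uri_path pattern is an assumption about app layout). Dashboard views, from the web access logs:

index=_internal sourcetype=splunk_web_access uri_path=*/app/*
| rex field=uri_path "/app/(?<app>[^/]+)/(?<dashboard>[^/?]+)"
| stats count, latest(_time) as last_viewed by app, dashboard, user

Report executions, from the audit index:

index=_audit action=search savedsearch_name=*
| stats count, latest(_time) as last_run by savedsearch_name, user

Anything that never shows up over a long enough window is a candidate for cleanup.
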
I am completely new to Splunk and have to deploy it in our environment. Can I get some guidance on best practices for deployment? I have 3 physical CentOS boxes. What would you set each one up with?

Splunk1 - configured RAID 10 - 5 TB SSD
Splunk2 - 500 GB SSD
Splunk3 - 500 GB SSD

Any advice is appreciated, thanks!
The query below produces the expected result only sometimes; it does not work for similar data on some other, random days.

Query:

index=my_summary source=app_response_status report=app_response_status ApiName=metadata
| timechart span=1d sum("200"), sum("404")

Working data:

10/24/2021 00:00:00 +0000, search_name=app_response_status, search_now=1635123600.000, info_min_time=1635033600.000, info_max_time=1635120000.000, info_search_time=1635124485.280, 200=7552, 404=7582, ApiName=metadata, info_sid=scheduler__gmrm_VkEtdm1lLXJ0bXMtc2g__RMD50cd89fe00e4c64f8_at_1635123600_39072, RowTotals=15134, info_max_time="1635120000.000", info_min_time="1635033600.000", info_search_time="1635124485.280", report=app_response_status

Not working data:

09/03/2021 00:00:00 +0000, search_name=app_response_status, search_now=1630717200.000, info_min_time=1630627200.000, info_max_time=1630713600.000, info_search_time=1630717575.202, 200=9483, 404=5287, ApiName=metadata, info_sid=scheduler__gmrm_VkEtdm1lLXJ0bXMtc2g__RMD50cd89fe00e4c64f8_at_1630717200_72746, RowTotals=14770, info_max_time="1630713600.000", info_min_time="1630627200.000", info_search_time="1630717575.202", report=app_response_status

I am not able to figure out the problem; both data sets look the same to me, but I'm not sure why the second is not working. Please help.
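One sketch worth trying (the root cause is an assumption here: purely numeric field names are easy for the search language to confuse with numeric literals, and a value that arrives as a string on some days aggregates differently): rename the fields before aggregating, then check the type on a failing day.

index=my_summary source=app_response_status report=app_response_status ApiName=metadata
| rename "200" as status_200, "404" as status_404
| timechart span=1d sum(status_200) as status_200, sum(status_404) as status_404

On a failing day, | eval t=typeof('200') will show whether the value came through as a String rather than a Number.
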
Hi all, I have IP flow-based information being ingested into Splunk, consisting of source_ip, source_port, destination_ip, destination_port. Occasionally, due to environmental factors, we get a duplicate log of the flow in the reverse direction. E.g.

source_ip  source_port  destination_ip  destination_port
1.1.1.1    42000        2.2.2.2         80                 <- keep this
2.2.2.2    80           1.1.1.1         42000              <- I would like to discard this
1.1.1.5    42300        2.2.2.2         80
3.3.3.3    134          5.5.5.5         80

My goal is to identify and ultimately filter out the duplicated entries. What I am having trouble with is coming up with a query to flag events where there is a duplicate entry in the reverse direction; I can then filter out the flagged duplicates where, say, source_port < destination_port. I am trying to avoid computationally heavy commands such as nested searches, as the data set is quite large. I would greatly appreciate some ideas or assistance on how this can be tackled.
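A sketch of the canonical-key approach, which needs only streaming eval plus dedup (field names from the post; the index is a placeholder):

index=<your_flow_index>
| eval endpoint_a = source_ip . ":" . source_port, endpoint_b = destination_ip . ":" . destination_port
| eval flow_key = if(endpoint_a < endpoint_b, endpoint_a . "|" . endpoint_b, endpoint_b . "|" . endpoint_a)
| dedup flow_key
| fields - endpoint_a, endpoint_b, flow_key

String comparison of dotted IPs is not numeric ordering, but it only needs to be consistent, not "correct", for both directions of one flow to yield the same key. dedup keeps the first event seen, so if direction matters, add a rule such as keeping the event where source_port > destination_port before deduplicating.
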
Hi Champions, given the dataset below, I want to create a conditional Splunk query. For example: I want to check first whether the rsyslog service was stopped; if it was stopped, then find who stopped it and on which server, and display the results in a table. Can you please help?

Oct 25 16:30:06 keybox sudosh: KHYJS6PxEI64zG Henry: service rsyslog start
Oct 25 16:30:02 keybox sudosh: KHYJS6PxEI64zG Joseph: #011service rsyslog stop
Oct 25 15:15:30 keybox sudosh: ssNjFZca22OvaB Henry: service rsyslog stop
Oct 25 15:08:26 keybox sudosh: ssNjFZla22OvaB Henry: #011service rsyslog start
Oct 25 15:07:46 keybox sudosh: ssNjFZla22OvaB Joseph: service rsyslog status
Oct 25 15:06:21 keybox sudosh: ssNjF0la22OvaB Asher: service rsyslog statutss
Oct 25 14:49:57 eqc-03-tpp sudosh: gkrMz1dLey0CS1 John: cat /etc/red#011#177#177#177#177#177#177#177#177#177#177#177#177#177#177#177r#177#177#177#177#177#177#177#177#177#177#177#177#177sys#177#177ervice rsyslog status
Oct 25 14:48:26 keybox sudosh: VSjTDhPH3iM5MY Ahser: service rsyslog status

Fields are:
Date and Time = Oct 25 16:30:06
host = keybox
index = sudosh_app
sourcetype = sudosh

I tried the query below, but I am unable to make it conditional:

index=sudosh_app_protected host=*
| eval "Critical Logging Events:" = "Rsyslog was Stopped on " + host, "Date and Time" = MonthDateTime, "User" = UserName, "Source" = sourcetype
| table "Date and Time", "Critical Logging Events:", "User", "Source"

Please help. Thank you in advance.
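A sketch that makes the search itself the condition, filtering to stop events before building the table (the rex pattern assumes the username is the word right before the colon that precedes the command, as in the samples; the #011 tab artifacts do not affect it):

index=sudosh_app sourcetype=sudosh "service rsyslog stop"
| rex "sudosh:\s+\S+\s+(?<UserName>[A-Za-z]+):"
| eval "Critical Logging Events:" = "Rsyslog was stopped on " . host
| table _time, "Critical Logging Events:", UserName, sourcetype

Only events containing the literal phrase survive the first line, which does the "if it was stopped" part; everything after it is presentation.
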
Hi Experts,

| search filed1=Enabled OR "Enabled" OR "Disabled" OR Disabled

The above search is returning four rows. If I try to sum them based on the status, it still returns four rows:

| stats sum(dc(servers)) by Status

Status --> dc(servers)
Enabled --> 10
Enabled --> 3
Disabled --> 23
Disabled --> 6

Thank you.
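A sketch of a fix, assuming the duplicate rows come from case or whitespace variants of the Status value (which the symptom suggests); note also that stats functions cannot be nested, so sum(dc(servers)) is not valid SPL:

... | eval Status = lower(trim(Status))
| stats dc(servers) as distinct_servers by Status

If the duplicates survive normalization, | eval len = len(Status) will expose invisible characters that make two "Enabled" values distinct.
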
Hello Splunkers, we are seeing this error while integrating a SQL Server database using the DB Connect add-on. Could someone explain what the error means?

Version: SQL Server 2014 EE

The driver could not establish a secure connection to SQL Server by using Secure Sockets Layer (SSL) encryption. Error: "Certificates do not conform to algorithm constraints". ClientConnectionId:xxxxxxxxxxxxxxx
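For context (a common cause, not a confirmed diagnosis for this system): the message usually means the SQL Server certificate is signed with an algorithm, often MD5 or SHA-1, that the Java runtime used by DB Connect rejects. The constraint lives in the JRE's java.security file; a sketch of the relevant setting, with the offending algorithm absent from the list (the path depends on the JAVA_HOME DB Connect is configured with, and the default list varies by Java version):

# $JAVA_HOME/jre/lib/security/java.security
# if the server certificate uses MD5, it must not appear in this list:
jdk.certpath.disabledAlgorithms=MD2, RSA keySize < 1024

Re-issuing the server certificate with SHA-256 is the cleaner long-term fix than loosening the JRE policy.
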