
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

Hi, I work for a company that uses Splunk on its servers. It is governed by a main team; however, installation of the Universal Forwarder is up to the individual teams, so the version needs updating from time to time. I am in the process of automating all software downloads for the platform I maintain and was wondering if there is a known way to connect to the Splunk site and download the latest version of the Universal Forwarder via script. I use PowerShell but could try translating other scripts if there is a method. Any info on the URL and any login header syntax I need is appreciated. Thank you!
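For reference, a minimal PowerShell sketch of the common approach, using the download.splunk.com URL pattern shown on the Universal Forwarder download page; the version and build hash below are placeholders to take from that page (the "Download via Command Line" wget link), and direct download links have not required login headers:

    # Placeholder values; copy the real version/build string from the UF download page.
    $version = "9.2.1"
    $build   = "78803f08aabb"   # build hash varies per release; this one is made up
    $url     = "https://download.splunk.com/products/universalforwarder/releases/$version/windows/splunkforwarder-$version-$build-x64-release.msi"

    # No authentication headers have been needed for direct download links.
    Invoke-WebRequest -Uri $url -OutFile "splunkforwarder-$version-x64.msi"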
I have the below sample from the botsv3 data set, which is Sysmon in XML format. I need to convert it into JSON-formatted events.

Sample current XML format - code block 1:

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'>
  <System>
    <Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385F-C22A-43E0-BF4C-06F5698FFBD9}' />
    <EventID>1</EventID>
    <Version>5</Version>
    <Level>4</Level>
    <Task>1</Task>
    <Opcode>0</Opcode>
    <Keywords>0x8000000000000000</Keywords>
    <TimeCreated SystemTime='2023-05-21T13:35:45.561534700Z' />
    <EventRecordID>36885</EventRecordID>
    <Correlation />
    <Execution ProcessID='3204' ThreadID='5508' />
    <Channel>Microsoft-Windows-Sysmon/Operational</Channel>
    <Computer>BGIST-L.froth.ly</Computer>
    <Security UserID='S-1-5-18' />
  </System>
  <EventData>
    <Data Name='UtcTime'>2023-05-21 15:17:59.931</Data>
    <Data Name='ProcessGuid'>{EBF7A186-1A1C-5B59-0000-0010732E0200}</Data>
    <Data Name='ProcessId'>2684</Data>
    <Data Name='Image'>C:\Windows\System32\svchost.exe</Data>
    <Data Name='FileVersion'>10.0.17134.1 (WinBuild.160101.0800)</Data>
    <Data Name='Description'>Host Process for Windows Services</Data>
    <Data Name='Product'>Microsoft® Windows® Operating System</Data>
    <Data Name='Company'>Microsoft Corporation</Data>
    <Data Name='CommandLine'>c:\windows\system32\svchost.exe -k networkservice -p -s CryptSvc</Data>
    <Data Name='CurrentDirectory'>C:\Windows\system32\</Data>
    <Data Name='User'>NT AUTHORITY\NETWORK SERVICE</Data>
    <Data Name='LogonGuid'>{EBF7A186-1A19-5B59-0000-0020E4030000}</Data>
    <Data Name='LogonId'>0x3e4</Data>
    <Data Name='TerminalSessionId'>0</Data>
    <Data Name='IntegrityLevel'>System</Data>
    <Data Name='Hashes'>MD5=32569E403279B3FD2EDB7EBD036273FA,SHA256=C9A28DC8004C3E043CBF8E3A194FDA2B756CE90740DF2175488337281B485F69</Data>
    <Data Name='ParentProcessGuid'>{EBF7A186-1A18-5B59-0000-0010CEA80000}</Data>
    <Data Name='ParentProcessId'>608</Data>
    <Data Name='ParentImage'>C:\Windows\System32\services.exe</Data>
    <Data Name='ParentCommandLine'>C:\Windows\system32\services.exe</Data>
  </EventData>
</Event>

Expected JSON format as below; I have only included fields that are actually needed. Code block 2:

{
  "ID": 1,
  "Timestamp": "2023-05-18T05:07:59.940594300Z",
  "EventData": {
    "FileVersion": "10.0.17134.1 (WinBuild.160101.0800)",
    "Company": "Microsoft Corporation",
    "TerminalSessionId": 0,
    "UtcTime": "2018-08-20 15:18:59.929",
    "Product": "Microsoft® Windows® Operating System",
    "LogonId": "0x3e7",
    "Description": "Find String (QGREP) Utility",
    "OriginalFileName": "findstr.exe",
    "Hashes": "MD5=BCC8F29B929DABF5489C9BE6587FF66D,SHA256=40F83CE0B6E1C894AB766591574ABD5B6780028C874410F2EC224300DF443C81",
    "ParentProcessId": "5428",
    "ParentCommandLine": "C:\\Windows\\system32\\cmd.exe /c netstat -nao | findstr /r \"LISTENING\"",
    "ProcessGuid": "{EBF7A186-0B7B-5B59-0000-001044A3CD01}",
    "ProcessId": "6236",
    "Image": "C:\\Windows\\System32\\findstr.exe",
    "User": "NT AUTHORITY\\SYSTEM",
    "LogonGuid": "{EBF7A186-AB15-5B58-0000-0020E7030000}",
    "IntegrityLevel": "System",
    "ParentProcessGuid": "{EBF7A186-0B7B-5B59-0000-0010249FCD01}",
    "ParentImage": "C:\\Windows\\System32\\cmd.exe",
    "RuleName": "",
    "CommandLine": "findstr /r \"LISTENING\"",
    "CurrentDirectory": "C:\\Windows\\system32\\"
  },
  "Hostname": "BGIST-L.froth.ly"
}

The key fields are EventID, Computer, and the entire EventData block.

What I have tried so far: the | tojson command, but it didn't create the nested EventData block.
It just extracted all field-value pairs as individual objects. I then tried the below SPL, but it has the fields hardcoded, which is not desirable; we want them added dynamically to the EventData block. The SPL below also produced a lot of fields with null values, since not all Event IDs have the same fields, for obvious reasons.

index=main
| rename EventID as ID Computer as Hostname eventtype as EventType
| fillnull value=""
| eval EventData=json_object("FileVersion", FileVersion, "Company", Company, "TerminalSessionId", TerminalSessionId, "UtcTime", UtcTime, "Product", Product, "LogonId", LogonId, "Description", Description, "Hashes", Hashes, "ParentProcessId", ParentProcessId, "ParentCommandLine", ParentCommandLine, "ProcessGuid", ProcessGuid, "ProcessId", ProcessId, "Image", Image, "User", User, "LogonGuid", LogonGuid, "IntegrityLevel", IntegrityLevel, "ParentProcessGuid", ParentProcessGuid, "ParentImage", ParentImage, "CommandLine", CommandLine, "CurrentDirectory", CurrentDirectory),
       _raw=json_set_exact(json_object(), "ID", ID, "Hostname", Hostname, "EventData", json_extract(EventData))

I tried the below query as well:

index=botsv3 sourcetype=xmlwineventlog EventID=1
| spath
| tojson

This gave a result where EventData at least is still a block, but the field names appear as values of the Event.EventData.Data{@Name} field and the field values as values of Event.EventData.Data. I need each value in Event.EventData.Data to appear as the value of the corresponding field named in Event.EventData.Data{@Name}, all inside a nested EventData JSON block, basically as shown in code block 2.

Any help on this would be highly appreciated!
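For what it's worth, a sketch that builds EventData dynamically from the two multivalue fields spath produces, assuming they stay in matching order and the instance is on Splunk 8.1+ (for foreach mode=multivalue and the json_* eval functions); the "=" join is safe for values that themselves contain "=" because substr takes everything after the first one:

index=botsv3 sourcetype=xmlwineventlog EventID=1
| spath
| eval pairs=mvzip('Event.EventData.Data{@Name}', 'Event.EventData.Data', "=")
| eval EventData=json_object()
| foreach mode=multivalue pairs
    [ eval n=mvindex(split(<<ITEM>>, "="), 0),
           v=substr(<<ITEM>>, len(n) + 2),
           EventData=json_set(EventData, n, v) ]
| eval _raw=json_set_exact(json_object(),
      "ID", tonumber('Event.System.EventID'),
      "Hostname", 'Event.System.Computer'),
  _raw=json_set_exact(_raw, "EventData", json_extract(EventData))

Because the names are read from Data{@Name} at runtime, whatever fields each Event ID carries end up in EventData, with no hardcoded list and no null padding.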
Is there a basic cheat sheet for setting up a new small-scale distributed deployment?
The Qualys TA does not provide CIM parsing.
I have two JSON arrays of strings. I would like to see which values of array A are not present in array B, and display them as a table.
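A sketch of one approach, with hypothetical field names a_json and b_json for the two arrays, and assuming the string values contain no regex metacharacters (mvfind takes a regex):

| eval a=json_array_to_mv(a_json), b=json_array_to_mv(b_json)
| eval only_in_a=mvmap(a, if(isnull(mvfind(b, "^" . a . "$")), a, null()))
| table only_in_a

Inside mvmap, a refers to the current item; items that do appear in b map to null and are dropped, leaving only the values unique to array A.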
How do I perform a lookup on a CSV file from a search on an index? For example:

I want to find out if "name" in employee.csv exists as "name" in testindex. With the data below, the result should be: name3 addr3 phone3

index=testindex
| inputlookup employee.csv
| field name   ???  ==> does not work

result:
name  | address | phone
name1 | addr1   | phone1
name3 | addr3   | phone3

employee.csv:
name  | position  | company
name3 | position3 | company3

Please help. Thank you!!
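A sketch of the usual pattern: start from the index, use the lookup command (rather than inputlookup, which replaces your events) to pull a CSV field in, and keep only events that found a match. This assumes employee.csv is uploaded as a lookup file:

index=testindex
| lookup employee.csv name OUTPUT position
| where isnotnull(position)
| table name address phone

With the sample data this returns the name3 / addr3 / phone3 row, since name3 is the only name present in both.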
I am making a trend chart of a specific data set. What I am looking for is (generic example):

index=nessus
| eval Month=strftime(firstSeen, "%b")
| chart count by severity Month

So the end result would be months on the X axis and the count of each severity (critical, high, medium) for each month. Each month would have a count of each severity, but using the firstSeen date from the event.
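A sketch under the assumption that firstSeen is either epoch seconds or an ISO date string (adjust the strptime format to the real field): overwriting _time with firstSeen lets timechart do the monthly bucketing and keeps the months in calendar order.

index=nessus
| eval _time=if(isnum(firstSeen), firstSeen, strptime(firstSeen, "%Y-%m-%d"))
| timechart span=1mon count by severity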
Sample data: I have 2 types of data and the props below. I am seeing internal logs like:

ERROR JsonLineBreaker - JSON StreamId:13457545565443322455 had parsing error: Unexpected character: 'a' - data_source........

Do I need to modify props to capture the 2 formats of logs?

props:

[sourcetype]
INDEXED_EXTRACTIONS=json
KV_MODE=none
SHOULD_LINEMERGE=true
TIMESTAMP_FIELDS=timestamp
LINE_BREAKER=([\r\n]+)

Format 1:

{[-]
  UserID: Null
  host: apl-45678
  level: medium
  message: cliendid: null, secondaryClientid: null, userid: unknown, respinsetime:1.34455
  timestamp: 2022-01-22T21:23:44.897Z
}

Format 2:

{"timestamp": "2022-01-22T21:23:44.897Z", "level":"applevel", "host":"apl-12345", "userid": "NA", "message": apl-12345-20144 - unknown - GET - / - REQ-NAMES - {"accept": "text/plain, application/json:*************************************************************************, "host:""apl-12345", "connection":"unknown"}"}
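For what it's worth, the posted props combine INDEXED_EXTRACTIONS=json (index-time structured parsing) with SHOULD_LINEMERGE=true, and the second format's message value is not valid JSON, which is exactly what the JsonLineBreaker error complains about. A sketch of a search-time alternative, assuming events arrive one per line and the timestamp key is quoted in the raw data:

[sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
KV_MODE = json
TIME_PREFIX = "timestamp"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3NZ

KV_MODE=json extracts fields at search time for the well-formed events and simply extracts nothing for the malformed ones, instead of throwing parse errors at index time.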
I have a cron job that creates a lookup file under $SPLUNK_HOME/etc/apps/search/lookups on one of the search heads. How do I get that file replicated to the other search heads? I've created a lookup definition for it, and it works great the first time, but after the file is updated, the new results are only available on the local search head.
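Files dropped onto disk by an outside cron job bypass search head cluster replication; only changes made through Splunk itself (outputlookup, the lookup-editing REST endpoints, or apps pushed by the deployer) get replicated. A sketch of moving the job into a scheduled search, with a hypothetical source search standing in for whatever the cron job computes:

index=some_index earliest=-24h@h
| stats latest(status) AS status by host
| outputlookup my_cron_lookup.csv

Saving that as a scheduled search on the cluster lets outputlookup write the file, and the change then replicates to the other members.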
I am trying to use a lookup we use to track usage of exceptions in one of our platforms, so that we can remove unneeded exceptions as needed. In my search, I bring in my log data representing usage and take the domain out of the email address to compare against the values in my lookup. The issue I am running into is that my log will show a domain of "test.example.com", but the exception that would be used is "*.example.com". I am looking for an elegant way to add the usage of "test.example.com" to the counter for "*.example.com". My lookup has headers of OBJECT, CATEGORY, USAGE, OBJECT being where the domain goes. Once I get these counters sorted, I would write back to the lookup table with the new value for USAGE.

Including the part of my search I am struggling with:

| eval domain=replace(emailAddress,".*?@","")
| stats count(domain) as USAGE by domain
| eval CATEGORY="domain"
| rename domain as OBJECT
| table OBJECT,CATEGORY,USAGE
| append [| inputlookup exceptions.csv]
| stats sum(USAGE) AS USAGE by OBJECT, CATEGORY

Any help is appreciated!
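One relatively elegant route is to make the lookup itself wildcard-aware, so test.example.com matches the *.example.com row and comes back as that row's OBJECT value to count by. A sketch, assuming a lookup definition named exceptions over exceptions.csv with WILDCARD matching enabled in transforms.conf (or in the definition's advanced options):

[exceptions]
filename   = exceptions.csv
match_type = WILDCARD(OBJECT)

| eval domain=replace(emailAddress,".*?@","")
| lookup exceptions OBJECT AS domain OUTPUT OBJECT AS matched_object CATEGORY
| stats count AS USAGE by matched_object CATEGORY
| rename matched_object AS OBJECT

Usage rows that match a wildcard exception are counted under the wildcard pattern itself, which then sums cleanly with the existing inputlookup/append/stats tail.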
Hello All, I'm trying to do a search for "found ANC VITC in source 01:00:00;00", which works just fine, but I would like to omit these errors during the UTC times of 01:00:00;00 - 01:00:00;05, because between those times the 01:00:00;00 timecode is legit. Is this possible? A co-worker believes there is a result object "called_time", but I'm unclear on the syntax to use.
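Assuming the window refers to the event's own UTC wall-clock time, a sketch that filters on time of day; note strftime renders _time in the search-time timezone, so this presumes that is UTC, and I can't confirm the "called_time" field, so this doesn't rely on it:

"found ANC VITC in source 01:00:00;00"
| eval tod=strftime(_time, "%H:%M:%S")
| where NOT (tod >= "01:00:00" AND tod <= "01:00:05")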
Can you review how to set up custom thresholds for alerts? Once you show me what I'm missing, I should be fine. (Edited to address a typographical error in the title. C.Landivar)
For the following two events:

{ "people": { "bob": 172, "maria": 161 } }
{ "people": { "bob": 172, "garth": 180 } }

I want to report the number of occurrences of each person's name in the "people" object (the field name itself, not its value). The desired result of my query against the two events above would be:

bob: 2
maria: 1
garth: 1

What SPL syntax/commands could I use to achieve this? I have tried a variety of things found online, to no avail.
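A sketch using the json_keys eval function (Splunk 8.1+): pull the people object out raw, turn its key list into a multivalue field, expand, and count:

| spath path=people output=people_json
| eval name=json_array_to_mv(json_keys(people_json))
| mvexpand name
| stats count by name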
I can't use the field extractor because the field configurations are frequently very different and it gives me errors, so I've been using "| rex" instead. Can someone help me adjust my regex to capture only "P3820 Houston to A345 Atlanta Line Down" for the field "Details" every time?

| rex field= "(?<Details>.*)\s-\s\d{4}[Z]\s\d{2}\s[a-zA-Z]{3}\s-\s(\d{4}Z\s\d{2}\s[a-zA-Z]{3}|On)"

Field examples:

P3820 Houston to A345 Atlanta Line Down - 1339Z 19 May - On-going - TKT39390423
P3820 Houston to A345 Atlanta Line Down - 1339Z 19 May - 0834Z 20 May - TKT39390423
P3820 Houston to A345 Atlanta Line Down - 1339Z 19 MAY - Ongoing - TKT39390423 - 1339Z 19 May - On-going - TKT39390423
P3820 Houston - A345 Atlanta Line Down - 1339Z 19 MAY - Ongoing - INC39390423, DIRJ LLO MM#:394039 - 1339Z 19 May - On-going - TKT39390423
P3820 Houston - A345 Atlanta Line Down - 1339Z 19 MAY - 1834Z MAY - INC39390423, DIRJ LLO MM#:394039 - 1339Z 19 May - 0834Z 20 May - TKT39390423

I don't have any issue with the first two, but when the date/time range is repeated, I end up with everything before the second "1339Z 19 May" included in the "Details" field.
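A sketch of one fix: anchor at the start and make the capture lazy, so Details stops at the first " - <time>Z <day> <month> - " it can hand off to instead of the last one. Note that rex needs a real field name after field=; this assumes the text lives in _raw:

| rex field=_raw "^(?<Details>.+?)\s-\s\d{4}Z\s\d{2}\s[A-Za-z]{3}\s-\s"

Against all five examples this yields "P3820 Houston to A345 Atlanta Line Down" or "P3820 Houston - A345 Atlanta Line Down", since the embedded " - " between locations is never followed by the timestamp pattern.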
I am looking to have a time chart with a dropdown menu based on a token, which can show all of the values of the dropdown menu as its first option and have that displayed as the default. Each value currently shows only one line, and I'm looking for all of the lines to populate the chart in the All selection. This is my current time chart. These are the current values based on the token. This is the current search:

index=main host=$token$* sourcetype=syslog process=elcsend "\"config " CentOS
| rex "([^!]*!){2}(?P<type>[^!]*)!([^!]*!){4}(?P<role>[^!]*)!([^!]*!){23}(?P<vers>[^!]*)"
| search role=std-dhcp
| eval location=$token|s$
| timechart span=1d count by location

If that's not possible, I am also open to removing the dropdown menu and only having the default time chart showing all of the values.
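A sketch of the usual Simple XML pattern, assuming the underlying events carry the location in the host field: give the dropdown a static All choice whose value is *, default to it, and split the timechart by the data's own field instead of by the token, so the All selection draws one line per value.

<input type="dropdown" token="token">
  <label>Location</label>
  <choice value="*">All</choice>
  <default>*</default>
  <fieldForLabel>host</fieldForLabel>
  <fieldForValue>host</fieldForValue>
  <search>
    <query>index=main sourcetype=syslog | stats count by host</query>
  </search>
</input>

In the panel search, keep host=$token$* but replace | eval location=$token|s$ | timechart span=1d count by location with | timechart span=1d count by host.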
ERROR: could not remove all contents of '/opt/data/kvstore': 1 errors occurred. Description for first 1: [{operation: "failed to remove directory", error: "directory not empty", file: "/opt/data/kvstore/mongo"}]
Hi, I set up a Splunk Enterprise instance on a Windows VM to collect Active Directory logs. I wanted to forward these logs to both Splunk Cloud and a syslog server. I set up a universal forwarder on the VM and installed the credentials on both the Splunk Enterprise instance and the universal forwarder. I also made the changes to outputs.conf to send data out on port 514. I see the events coming into Splunk Cloud, but there don't seem to be events leaving port 514. I also noticed that the Splunk Enterprise instance isn't receiving any events after setting this up; is that meant to happen? Do I need more than one universal forwarder to forward the logs to multiple places?
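For reference, a sketch of the outputs.conf shape for two destinations, with placeholder group names and a documentation IP; note that syslog output needs parsed data, so the syslog leg is normally configured on a heavy forwarder or the Splunk Enterprise instance itself rather than on a universal forwarder, and a second UF is not required:

[tcpout]
defaultGroup = splunkcloud

# The [tcpout:splunkcloud] stanza normally comes from the Splunk Cloud
# UF credentials app, so it is not repeated here.

[syslog]
defaultGroup = my_syslog

[syslog:my_syslog]
server = 192.0.2.10:514
type = udp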
Hi all. I have recently started working on my workplace's Splunk and I got a request: to display all alerts that have the "notable" feature turned on. I tried googling it and came across a close answer with "index=notable", but it's not enough, because I want to get all such alerts, not only those that were triggered and generated notables. Thanks in advance.
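If "notable" means the Enterprise Security notable-event alert action, a sketch over the REST endpoint lists every saved search with that action enabled, whether or not it has ever fired (assumes ES, where correlation searches expose action.notable):

| rest /services/saved/searches splunk_server=local
| search action.notable=1
| table title search cron_schedule disabled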
I want to build a machine learning model to detect anomalies on a high-volume ingestion index. The problem I'm facing is with the small indexes when I fit the model with DensityFunction. How do I overcome this scenario?
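DensityFunction needs enough points per group before a fit is meaningful, so one common sketch is to aggregate and drop the groups that are too small before fitting; the one-hour span, 50-sample floor, and model name below are assumptions to tune, not MLTK constants:

| tstats count where index=* by index _time span=1h
| eventstats count as samples by index
| where samples >= 50
| fit DensityFunction count by "index" into ingest_volume_model

Small indexes can instead be pooled into a single catch-all group, or handled with a simpler detector such as a quantile threshold.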
Hello, I have a syslog server ingesting device logs, which are sent from the deployment server and then to the indexer. My ESXi host as well as other devices are sending logs every minute. However, my firewall logs are only ingested on the indexer every 4 hours. Could this be a latency issue, or is the firewall causing the problem? Thank you
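One quick way to separate transport delay from the firewall batching its own output is to compare index time with event time: a steadily growing gap points at delivery, while events timestamped in 4-hour clumps point at the firewall itself. A sketch (substitute the real index and sourcetype):

index=your_index sourcetype=your_firewall_sourcetype
| eval lag_seconds=_indextime - _time
| timechart span=15m avg(lag_seconds) max(lag_seconds)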