
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Posts

Hi, could someone please help me with a lateral movement search that looks for a user with remote NTLM (type 3) logons on an abnormal number of destinations? Thanks
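A minimal sketch of such a search, assuming Windows Security logs with EventCode 4624 and extracted Logon_Type, Authentication_Package, user, and dest fields; the index name and the three-standard-deviation threshold are assumptions to adjust, not a definitive detection:

index=wineventlog EventCode=4624 Logon_Type=3 Authentication_Package=NTLM
| stats dc(dest) AS dest_count values(dest) AS destinations BY user
| eventstats avg(dest_count) AS avg_dest stdev(dest_count) AS stdev_dest
| where dest_count > avg_dest + 3 * stdev_dest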
Thank you Giuseppe, I have found the static Splunk Enterprise Security demo site. I appreciate your help.
My apps running in Docker containers currently use the 8.2.9 Splunk universal forwarder, which works fine. My images are based on the Linux Alpine image. I have for some time been trying to get a 9.x.x UF working instead, but I cannot get it to work. When it boots, it prints the following error:

"/opt/splunkforwarder/bin/splunk" start --accept-license --answer-yes --no-prompt
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R splunk:splunk /opt/splunkforwarder"
This appears to be your first time running this version of Splunk.
Creating unit file...
Error calling execve(): No such file or directory
Error launching command: No such file or directory
Failed to create the unit file. Please do it manually later.

Splunk> The Notorious B.I.G. D.A.T.A.

Checking prerequisites...
Checking mgmt port [8089]: open
Creating: /opt/splunkforwarder/var/lib/splunk
Creating: /opt/splunkforwarder/var/run/splunk
Creating: /opt/splunkforwarder/var/run/splunk/appserver/i18n
Creating: /opt/splunkforwarder/var/run/splunk/appserver/modules/static/css
Creating: /opt/splunkforwarder/var/run/splunk/upload
Creating: /opt/splunkforwarder/var/run/splunk/search_telemetry
Creating: /opt/splunkforwarder/var/run/splunk/search_log
Creating: /opt/splunkforwarder/var/spool/splunk
Creating: /opt/splunkforwarder/var/spool/dirmoncache
Creating: /opt/splunkforwarder/var/lib/splunk/authDb
Creating: /opt/splunkforwarder/var/lib/splunk/hashDb
Checking conf files for problems...
Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-9.1.2-b6b9c8185839-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.

Starting splunk server daemon (splunkd)...
PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security

However, it seems to start a background process, but I don't see the logs in Splunk. Using the status command kills the background process:

"/opt/splunkforwarder/bin/splunk" status
Warning: Attempting to revert the SPLUNK_HOME ownership
Warning: Executing "chown -R splunk:splunk /opt/splunkforwarder"
splunkd 165 was not running.
Stopping splunk helpers...

I have tried disabling boot start, but that gives me a similar error:

"/opt/splunkforwarder/bin/splunk" disable boot-start
Error calling execve(): No such file or directory
Error launching command: No such file or directory
execve: No such file or directory while running command /sbin/chkconfig

After researching this, I think it could be related to systemd, perhaps? I don't think Alpine includes it; it uses OpenRC instead. However, I don't really have any use for this autostart feature anyway. Is there a way to ignore/skip it somehow?
Hi @ITWhisperer need help, how many ways to show up in the dashboard where the eventids index=foo_win*  (host="PC*" EventID=1068) OR (host="PR**" EventID="1") OR (host="PR*" EventID="1") OR (ho... See more...
Hi @ITWhisperer, I need help: how can I show these EventIDs in a dashboard? index=foo_win*  (host="PC*" EventID=1068) OR (host="PR**" EventID="1") OR (host="PR*" EventID="1") OR (host="PR*" EventID="1")....... The table should show _time, server (host), eventid, and severity (warning, critical, info). I would like to achieve something like the snapshot below.
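One possible approach, sketched with only the first two host/EventID clauses from the post; the severity-by-EventID mapping is an assumption you would replace with your own definitions:

index=foo_win* ((host="PC*" EventID=1068) OR (host="PR*" EventID="1"))
| eval severity=case(tonumber(EventID)==1068, "critical", tonumber(EventID)==1, "warning", true(), "info")
| rename host AS server, EventID AS eventid
| table _time server eventid severity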
Hi @thanh_on, do you have access to Splunk Show (https://show.splunk.com/login/?redirect=/)? There you can find a complete environment to test Enterprise Security with a relevant set of data. Ciao. Giuseppe
Hi @hank72, As I said, this way you are sure to load into the Data Models all the data from all indexes, even if they aren't in the default search path. If you want, you could limit the indexes read by each Data Model, but there's no reason to do this, so my hint is to leave it as it is. Let us know if we can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking, Giuseppe P.S.: Karma Points are appreciated by all the contributors.
@gcusello Could you please reply to my query as mentioned previously?  
The question has been answered many times before. @isoutamo already pointed you to a trove of resources for writing such a search. If you don't understand some specifics about anything said in the other threads with solutions, don't hesitate to ask for an explanation. But don't expect people to jump in and do your job for you: the issue is well known and has a well-known method of dealing with it, explained many times. So all you need to do is dig into those resources, read the solutions provided there, and try to construct your own. If you encounter obstacles along the way, ask away.
I did try to find documentation stating that summaryindex is an alias for collect, but it's not documented as far as I can see. However, if you start typing | summaryin, Splunk will show info for the collect command. So yes, it's the same command.
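For reference, a typical use of collect; the summary index name here is just an example and must already exist:

index=_internal sourcetype=splunkd log_level=ERROR
| stats count BY component
| collect index=my_summary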
I need a solution from scratch. Could someone help here?
You should accept ptrsnks's answer, not your own reply.
Hi all, I don't have many resources to build an ideal network environment to forward logs to Splunk, so I'm looking for a way to simulate, or a source from which to obtain, many common data sources in Splunk (like some SIEM solutions that have scripts to forward syslog through port 514). Any answer will be highly appreciated. Regards.
  I couldn't get "cird_address=remoteIP ."/32"" to work in my search. I created a more simple search and it worked fine.  Your suggestion was correct.  I need to do more work on my search. Thanks f... See more...
  I couldn't get "cird_address=remoteIP ."/32"" to work in my search. I created a more simple search and it worked fine.  Your suggestion was correct.  I need to do more work on my search. Thanks for your help!   Peter  
Being somewhat of a journeyman myself, the proper way to use timewrap was a bit of a mystery to me.  So, while the answer may be apparent to many, I was not sure how to wield the information. Thank you for the response.  I will give it a go on Monday.
Yes I tried the .(dot) | eval  cird_address=remoteIP ./32 Error in 'EvalCommand': The expression is malformed. An unexpected character is reached at '/32'. | eval  cird_address=remoteIP ."/32" Th... See more...
Yes, I tried the . (dot):

| eval cird_address=remoteIP ./32
Error in 'EvalCommand': The expression is malformed. An unexpected character is reached at '/32'.

| eval cird_address=remoteIP ."/32"
This one does NOT show an error, but I get no results.

Maybe there is something farther down in the search that's not correct. I'll check that and respond again. Thanks for your suggestion.
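For reference, a minimal standalone check of the dot-concatenation syntax (the test IP is made up; the field names follow the post):

| makeresults
| eval remoteIP="10.1.2.3"
| eval cird_address = remoteIP . "/32"

If this returns "10.1.2.3/32" but your real search returns nothing, remoteIP is probably null or missing at that point in the pipeline, since concatenating with a null field yields null.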
Note that I used System.Web.HttpUtility.JavaScriptStringEncode as a shortcut for encoding/escaping strings. KV_MODE = auto_escaped only handles a few escape sequences. If you prefer, you can simply replace \ and " with \\ and \", respectively, in strings before writing them.
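If you take that manual route, a minimal PowerShell sketch of the replacement (the variable name is illustrative):

# Escape backslashes first, then double quotes, before writing the value
$escaped = $value.Replace('\', '\\').Replace('"', '\"')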
Hi @Ismail_BSA, We can use the SqlServer PowerShell module to read SQL Server audit files. As an administrator, install the SqlServer PowerShell module under PowerShell 5.1, which should be installed by default on all modern Windows releases:

PS> Install-Module SqlServer

With the module installed, we can read .sqlaudit files created by SQL Server using Read-SqlXEvent. Column/field information is available at https://learn.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-records. Columns with type bigint or varbinary will be read as byte arrays and must be converted to strings using a .NET object of the appropriate type.

We can write a small PowerShell script to act as a stream reader for .sqlaudit files read by Splunk's archive processor (see below). Note that Read-SqlXEvent uses System.IO.Stream internally and calls Stream.Length, which throws "Stream does not support seeking" for forward-only streams. We'll work around this issue by copying the stream to a temporary file, reading the temporary file, and finally, deleting the temporary file.

C:\Temp\Stream-SqlAudit.ps1

$file = New-TemporaryFile
$output = $file.Open([System.IO.FileMode]::Append, [System.IO.FileAccess]::Write)
$stdin = [System.Console]::OpenStandardInput()
$stdout = [System.Console]::Out
$buffer = New-Object byte[] 16384
[int]$bytes = 0
while (($bytes = $stdin.Read($buffer, 0, $buffer.Length)) -gt 0) {
    $output.Write($buffer, 0, $bytes)
}
$output.Flush()
$output.Close()
Read-SqlXEvent -FileName "$($file.DirectoryName)\$($file.Name)" | %{
    $event = $_.Timestamp.UtcDateTime.ToString("o")
    $_.Fields | %{
        if ($_.Key -eq "permission_bitmask") {
            $event += " permission_bitmask=`"0x$([System.BitConverter]::ToInt64($_.Value, 0).ToString("x16"))`""
        }
        elseif ($_.Key -like "*_sid") {
            $sid = $null
            $event += " $($_.Key)=`""
            try {
                $sid = New-Object System.Security.Principal.SecurityIdentifier($_.Value, 0)
                $event += "$($sid.ToString())`""
            }
            catch {
                $event += "`""
            }
        }
        else {
            $event += " $($_.Key)=`"$([System.Web.HttpUtility]::JavaScriptStringEncode($_.Value.ToString()))`""
        }
    }
    $stdout.WriteLine($event)
}
$file.Delete()

We can use the invalid_cause and unarchive_cmd props.conf settings to call the PowerShell script. Note that unarchive_cmd strips or escapes quotes depending on the value of unarchive_cmd_start_mode, so we've stored the PowerShell script in a path without spaces to avoid the use of quotes. If PowerShell can't find the path specified in the -File argument, it will exit with code -196608.

Sample props.conf on forwarders, receivers (heavy forwarders or indexers), and search heads:

[source::....sqlaudit]
unarchive_cmd = powershell.exe -ExecutionPolicy RemoteSigned -File C:\Temp\Stream-SqlAudit.ps1
unarchive_cmd_start_mode = direct
sourcetype = preprocess-sqlaudit
NO_BINARY_CHECK = true

[preprocess-sqlaudit]
invalid_cause = archive
is_valid = False
LEARN_MODEL = false

[sqlaudit]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%N%Z
MAX_TIMESTAMP_LOOKAHEAD = 30
KV_MODE = auto_escaped

We can use a batch or monitor stanza to monitor the directory containing .sqlaudit files. Use a batch stanza if the files are moved to the monitored directory atomically to allow Splunk Universal Forwarder to delete the files after they're indexed.
Sample inputs.conf:

[monitor://C:\Temp\*.sqlaudit]
index = main
sourcetype = sqlaudit

The script can be refactored as a scripted input; however, using the archive processor allows Splunk to perform file and change tracking on our behalf.
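For completeness, a sketch of the batch variant under the same assumptions (index and path are illustrative); move_policy = sinkhole tells Splunk to delete each file once it has been indexed:

[batch://C:\Temp\*.sqlaudit]
move_policy = sinkhole
index = main
sourcetype = sqlaudit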
Have you tried using the other concatenation operator - dot vs plus?
Hi Jason, Did you find a solution for this? 
Add observation dashboard risks to the Splunk integration as an incident. I'm in AU: https://portal.XX.xdr.trendmicro.com/#/app/sase. Please advise. Thanks.