Last month, the Splunk Threat Research Team had two releases of new security content via the Enterprise Security Content Update (ESCU) app (v4.24.0 and v4.25.0). With these releases, there are 27 new analytics, 5 new analytic stories, 110 updated analytics, and 1 updated analytic story now available in Splunk Enterprise Security via the ESCU application update process.

Content highlights include:

New content to help detect Midnight Blizzard (also known as Nobelium and APT29) attack techniques, improving detection coverage against sophisticated threats targeting Microsoft 365 environments. To learn more about Midnight Blizzard and the security content created by the Splunk Threat Research Team, check out the blog "Hunting M365 Invaders: Navigating the Shadows of Midnight Blizzard."

A new analytic story with content that can be used to help detect Phemedrone Stealer activities. To learn more about Phemedrone Stealer and the security content created by the Splunk Threat Research Team, check out the blog "Unveiling Phemedrone Stealer: Threat Analysis and Detections."

New content to help detect the critical ConnectWise ScreenConnect CVE-2024-1709 vulnerability, which allows an attacker to bypass authentication using an alternate path or channel.
New Analytics (27)

Azure AD Admin Consent Bypassed by Service Principal
Azure AD FullAccessAsApp Permission Assigned
Azure AD Multiple Service Principals Created by SP
Azure AD Multiple Service Principals Created by User
Azure AD Privileged Graph API Permission Assigned
Azure AD Service Principal Authentication
O365 Admin Consent Bypassed by Service Principal
O365 FullAccessAsApp Permission Assigned
O365 Multiple Mailboxes Accessed via API
O365 Multiple Service Principals Created by SP
O365 Multiple Service Principals Created by User
O365 OAuth App Mailbox Access via EWS
O365 OAuth App Mailbox Access via Graph API
O365 Privileged Graph API Permission Assigned
Network Traffic to Active Directory Web Services Protocol
Windows Privilege Escalation Suspicious Process Elevation
Windows Privilege Escalation System Process Without System Parent
Windows Privilege Escalation User Process Spawn System Process
Windows SOAPHound Binary Execution
Ivanti Connect Secure SSRF in SAML Component
ConnectWise ScreenConnect Path Traversal
ConnectWise ScreenConnect Path Traversal Windows SACL
Windows Non Discord App Access Discord LevelDB
Windows Time Based Evasion via Choice Exec
Windows Unsecured Outlook Credentials Access In Registry
ConnectWise ScreenConnect Authentication Bypass
WordPress Bricks Builder plugin RCE

New Analytic Stories (5)

Office 365 Collection Techniques
Phemedrone Stealer
ConnectWise ScreenConnect Vulnerabilities
Snake Keylogger
WordPress Vulnerabilities

Updated Analytics (110)

Splunk unnecessary file extensions allowed by lookup table uploads
Azure AD High Number Of Failed Authentications From Ip
Azure AD Multi-Source Failed Authentications Spike
Azure AD Privileged Role Assigned
Azure AD Privileged Role Assigned to Service Principal
Azure AD Service Principal Created
Azure AD Service Principal New Client Credentials
Azure AD Service Principal Owner Added
Azure AD Tenant Wide Admin Consent Granted
O365 Added Service Principal
O365 Application Registration Owner Added
O365 ApplicationImpersonation Role Assigned
O365 Mailbox Inbox Folder Shared with All Users
O365 Mailbox Read Access Granted to Application
O365 Multi-Source Failed Authentications Spike
O365 Multiple Users Failing To Authenticate From Ip
O365 Service Principal New Client Credentials
O365 Suspicious Admin Email Forwarding
O365 Suspicious Rights Delegation
O365 Suspicious User Email Forwarding
O365 Tenant Wide Admin Consent Granted
Correlation by Repository and Risk
Correlation by User and Risk
Any Powershell DownloadFile
Any Powershell DownloadString
Attacker Tools On Endpoint
Create local admin accounts using net exe
Create Remote Thread In Shell Application
Creation of Shadow Copy
Detect Certify Command Line Arguments
Detect Certify With PowerShell Script Block Logging
Detect Excessive Account Lockouts From Endpoint
Detect New Local Admin account
Detect Regasm with Network Connection
Detect Regsvcs with Network Connection
Detect Use of cmd exe to Launch Script Interpreters
Disable Show Hidden Files
Disable Windows SmartScreen Protection
Disabling ControlPanel
Disabling SystemRestore In Registry
Download Files Using Telegram
Elevated Group Discovery with PowerView
Executable File Written in Administrative SMB Share
Executables Or Script Creation In Suspicious Path
Execute Javascript With Jscript COM CLSID
Execution of File with Multiple Extensions
Extraction of Registry Hives
Hiding Files And Directories With Attrib exe
Linux Account Manipulation Of SSH Config and Keys
Linux Deletion Of Cron Jobs
Linux Deletion Of Init Daemon Script
Linux Deletion Of Services
Linux Deletion of SSL Certificate
Linux High Frequency Of File Deletion In Boot Folder
Linux High Frequency Of File Deletion In Etc Folder
MacOS LOLbin
MacOS plutil
Network Discovery Using Route Windows App
Non Chrome Process Accessing Chrome Default Dir
Non Firefox Process Access Firefox Profile Dir
Overwriting Accessibility Binaries
PowerShell - Connect To Internet With Hidden Window
Rundll32 Process Creating Exe Dll Files
Scheduled Task Deleted Or Created via CMD
Schtasks scheduling job on remote system
Spoolsv Spawning Rundll32
Spoolsv Writing a DLL
Spoolsv Writing a DLL - Sysmon
Suspicious Driver Loaded Path
Suspicious mshta child process
Suspicious Process DNS Query Known Abuse Web Services
Suspicious Process File Path
System Processes Run From Unexpected Locations
Trickbot Named Pipe
Windows Account Discovery for None Disable User Account
Windows AD Replication Request Initiated by User Account
Windows AD Replication Request Initiated from Unsanctioned Location
Windows Admin Permission Discovery
Windows Alternate DataStream - Base64 Content
Windows Alternate DataStream - Executable Content
Windows Credentials from Password Stores Chrome Extension Access
Windows Credentials from Password Stores Chrome LocalState Access
Windows Credentials from Password Stores Chrome Login Data Access
Windows Gather Victim Network Info Through Ip Check Web Services
Windows Process Injection Remote Thread
Windows Registry Payload Injection
Windows Replication Through Removable Media
Windows Rundll32 WebDav With Network Connection
Windows Scheduled Task Created Via XML
Windows Scheduled Task Service Spawned Shell
Windows Security Account Manager Stopped
Windows Suspect Process With Authentication Traffic
Windows UAC Bypass Suspicious Child Process
Windows UAC Bypass Suspicious Escalation Behavior
Windows WinLogon with Public Network Connection
WinEvent Scheduled Task Created Within Public Path
Detect DGA domains using pretrained model in DSDL
Multiple Archive Files Http Post Traffic
Plain HTTP POST Exfiltrated Data
DNS Query Length With High Standard Deviation
Detect Regasm Spawning a Process
High Process Termination Frequency
Linux Edit Cron Table Parameter
Processes launching netsh
Registry Keys Used For Persistence
Suspicious Process Executed From Container File
Windows File Transfer Protocol In Non-Common Process Path
Windows Phishing PDF File Executes URL Link
Windows System Network Connections Discovery Netsh
Windows User Execution Malicious URL Shortcut File

Updated Analytic Stories (1)

NOBELIUM Group

The team also published the following 3 blogs:

Unveiling Phemedrone Stealer: Threat Analysis and Detections
Hunting M365 Invaders: Navigating the Shadows of Midnight Blizzard
Another Year of RATs and Trojan Stealer: Detection Commonalities and Summary

Plus, Principal Threat Researcher Michael Haag hosted the Tech Talk "Using the Splunk Threat Research Team's Latest Security Content." During this Tech Talk, Michael provided:

Best practices for accessing and using the team's content in the ESCU app
An overview of the team's content updates between November and January
Deeper dives into new content for detecting DarkGate malware, Office 365 account takeover, and Windows Attack Surface Reduction events

You can watch the Tech Talk on-demand here. For all our tools and security content, please visit research.splunk.com.

— The Splunk Threat Research Team
Is there a way to send all log data to an NFS file system for required log retention from the Splunk OpenTelemetry Collector?
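One option, sketched below under assumptions (the NFS share is mounted locally at /mnt/nfs/logs, and you are running a Collector logs pipeline), is the OpenTelemetry Collector's file exporter, which writes received telemetry to a local path; pointing it at an NFS mount gives file-based retention alongside the normal Splunk export. Paths and pipeline names here are placeholders, not a validated config:

```yaml
exporters:
  file:
    path: /mnt/nfs/logs/otel-logs.json   # NFS-mounted directory (placeholder path)
    rotation:
      max_megabytes: 100                 # rotate each file at ~100 MB
      max_backups: 30                    # keep 30 rotated files for retention

service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: [batch]
      # keep sending to Splunk while also writing a copy to NFS
      exporters: [splunk_hec, file]
```

Verify the exporter and its rotation options against the Collector version you are actually running.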
I have JSON that looks like this:

{
  "Field1": [
    {
      "id": 1234,
      "name": "John"
    },
    {
      "id": 5678,
      "name": "Mary",
      "occupation": {
        "title": "lawyer",
        "employer": "law firm"
      }
    }
  ]
}

I want to extract the value of the "name" field from the object that contains an "occupation" field (it could be any of the objects). In this case I want to get "Mary" and store it in a variable. How would I do this using the Splunk search language?
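A sketch of one approach in SPL, assuming the whole JSON document is in _raw and that the spath() eval function returns null when the requested path is absent from an array element (worth verifying on your events):

```
... base search ...
| spath path=Field1{} output=entry
| mvexpand entry
| eval has_occupation=spath(entry, "occupation.title")
| where isnotnull(has_occupation)
| eval extracted_name=spath(entry, "name")
| table extracted_name
```

SPL has no variables as such; the usual pattern is to carry the value in a field like extracted_name, or hand it to an outer search with `return $extracted_name` inside a subsearch.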
I configured a macro named securemsg(1), and I use this macro in the following search:

....| eval log_info=_raw | 'securemsg(log_info)' | ....

When I run this search I get the following error:

Error in 'SearchParser': Missing a search command before '''. Error at position '264' of search query 'search index="linuxos" sourcetype="syslog" host="C...{snipped} {errorcontext = fo=_raw | 'securemsg(}'.

Please help. Thanks.
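One likely cause, judging from the error text: macros are invoked with backticks (`), not straight single quotes ('). In SPL, single quotes dereference field values, so 'securemsg(log_info)' is parsed as a bare value where a search command is expected. A sketch, with a hypothetical macros.conf definition (your actual definition will differ):

```
# macros.conf -- hypothetical one-argument macro
[securemsg(1)]
args = field
definition = eval masked = sha256($field$)
```

Invoked with backticks:

```
index="linuxos" sourcetype="syslog" | eval log_info=_raw | `securemsg(log_info)`
```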
How would I add a permanent search or field to a sourcetype? For example, I have a set of data from which I have been able to extract a field using this search:

sourcetype="collectedevents" | rex field=_raw "<Computer>(?<Computer>[^<]+)</Computer>"

Our sourcetype is "collectedevents", and I found a way to pull the <Computer> element from the XML data into a field named "Computer". What I would like is for that field to be permanent, or to change "host=" so it is not the host of the WEC but the host of the origin server the event came from.

Long story short, we have servers we don't want the Splunk forwarder on, because we know it can execute scripts, creating a vulnerability on those servers. Any help is appreciated, thank you!
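Two hedged sketches. Making the extraction permanent is a search-time EXTRACT on the sourcetype in props.conf; rewriting the host metadata itself requires an index-time transform, which must live on the first parsing tier (heavy forwarder or indexer) that touches the data. Stanza names below are hypothetical:

```
# props.conf -- permanent search-time extraction (search head / app)
[collectedevents]
EXTRACT-computer = <Computer>(?<Computer>[^<]+)</Computer>

# props.conf -- index-time host rewrite (parsing tier only)
[collectedevents]
TRANSFORMS-set_host = set_host_from_xml
```

```
# transforms.conf -- set host from the <Computer> element (parsing tier only)
[set_host_from_xml]
REGEX = <Computer>([^<]+)</Computer>
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

Note the host rewrite only applies to data as it is indexed; already-indexed events keep their original host.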
Hello, we have been investigating roughly 30% of Splunk logs missing in our production environment. I think it may be due to TIME_FORMAT or to the high log volume in production. Can you please let me know what the key-value for TIME_FORMAT in props.conf should be? The lag_sec value is 1.5 seconds on the source logs, and the Splunk forwarder log sourcetype we are checking has 1.13 s. Additionally, the source logs have the format 05/Mar/2024, while the SplunkForwarder logs have the format 2024-03-05. We have 2048 KBps set in both the dev and prod config files. We also have ignoreOlderThan = 1d, so we are looking to remove this parameter, fix TIME_FORMAT, and check again. Can you please help or provide additional information to check?
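For a timestamp like 05/Mar/2024, a props.conf sketch (the stanza name and TIME_PREFIX are assumptions; adjust them to where the timestamp actually sits in your events):

```
# props.conf on the parsing tier, for the sourcetype with 05/Mar/2024 timestamps
[your_source_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%b/%Y
MAX_TIMESTAMP_LOOKAHEAD = 16
```

Two caveats: with a date-only format, every event in a day gets the same parsed timestamp unless a time component follows the date; and ignoreOlderThan = 1d permanently skips files whose modification time is older than a day, which fits a missing-data symptom, so removing it as you plan is a reasonable first step.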
Thanks in advance.

1. I have a JSON object as content.payload{} and need to extract the values inside the payload. Splunk already extracts the field content.payload{}, and the result is "AP Import flow related results : Extract has no AP records to Import into Oracle". But I want to extract all the details inside content.payload. How can I extract them in a Splunk query or via props.conf? I tried spath but wasn't able to get it.

2. How do I rename the wildcard values of content.payload{}* ?

"content" : {
  "jobName" : "AP2",
  "region" : "NA",
  "payload" : [
    {
      "GL Import flow processing results" : [
        {
          "concurBatchId" : "4",
          "batchId" : "6",
          "count" : "50",
          "impConReqId" : "1",
          "errorMessage" : null,
          "filename" : "CONCUR_GL.csv"
        }
      ]
    },
    "AP Import flow related results : Extract has no AP records to Import into Oracle"
  ]
},
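A sketch for (1), assuming the event is valid JSON in _raw: pull the payload array out, expand it, then run spath on each element so its nested keys become fields:

```
| spath path=content.payload{} output=payload
| mvexpand payload
| spath input=payload
```

For (2), rename supports wildcards, so something like the following may work for flattened auto-extracted names (the payload_* prefix is just an example):

```
| rename "content.payload{}.*" AS "payload_*"
```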
Hi, we are using the ITSI service map/Service Analyzer to monitor services. We have a use case where, for the same service, we need to add multiple KPIs, and those KPIs depend on different entities. For example, we have an infrastructure-related KPI which uses host as the entity; another KPI is "service up", which checks whether the service is up, and in this case the entity is the process name. We also have a KPI for garbage collection, which again has a different entity. Question: I am trying to understand the best way to handle such a scenario, so that we can add all these KPIs without making the service map too complex.
I want to pass a dynamic value from my search results into an email alert subject. I tried $result.fieldname$ but it is not coming through in the email alert. Can someone help me? Thanks.
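$result.fieldname$ is filled from the first row of the final result set, and the field must survive to the end of the search (not be dropped by a trailing fields/table clause). A sketch with hypothetical field names, making sure the field you reference is an explicit output column:

```
... base search ...
| stats count AS error_count BY error_code
| table error_code error_count
```

With that, a subject line such as "Alert: $result.error_code$ ($result.error_count$ events)" should populate from the first result row.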
Hi all, I set a cron schedule on an alert. My alert should not trigger between 9pm and 7am. I used the cron expression below, but I am still receiving alerts after 9pm:

0 0-21, 7-23 5-9 3 1-7

Is this cron expression correct? Do I need to make any changes?
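That expression is not valid cron: the five fields are minute, hour (0-23), day of month, month, day of week, there must be no space inside a field list, and your hour ranges (0-21, 7-23) actually include the overnight hours you want to exclude. To run hourly only from 7am through 8pm (so nothing fires between 9pm and 7am), a sketch:

```
0 7-20 * * *
```

The last trigger is at 20:00 (8pm); the 21 hour (9pm) is excluded. For a 15-minute cadence in the same window, `*/15 7-20 * * *` follows the same pattern.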
Hello everyone, I followed the steps to install DSDL (https://docs.splunk.com/Documentation/DSDL/5.1.1/User/InstallDSDL) and then this scenario: https://www.sidechannel.blog/en/detecting-anomalies-using-machine-learning-on-splunk/. But when I try to start the container, I get a 403 error. I checked roles and capabilities, I checked all kinds of posts from the community, and I checked the global permissions on DSDL. Is there a known issue about this? Have a good day all, Betty
Hi, I'd like to create a detailed report with info including the type of forwarder, the average KB/s, the OS, the IP, and the Splunk version, but also with information about which exact index each forwarder forwards to. Is it possible to recreate the search from the monitoring console for forwarder instances and somehow connect it to each index?

`dmc_get_forwarder_tcpin` hostname=*
| eval source_uri = hostname.":".sourcePort
| eval dest_uri = host.":".destPort
| eval connection = source_uri."->".dest_uri
| stats values(fwdType) as fwdType, values(sourceIp) as sourceIp, latest(version) as version, values(os) as os, values(arch) as arch, dc(dest_uri) as dest_count, dc(connection) as connection_count, avg(tcp_KBps) as avg_tcp_kbps, avg(tcp_eps) as avg_tcp_eps by hostname, guid
| eval avg_tcp_kbps = round(avg_tcp_kbps, 2)
| eval avg_tcp_eps = round(avg_tcp_eps, 2)
| `dmc_rename_forwarder_type(fwdType)`
| rename hostname as Instance, fwdType as "Forwarder Type", sourceIp as IP, version as "Splunk Version", os as OS, arch as Architecture, guid as GUID, dest_count as "Receiver Count", connection_count as "Connection Count", avg_tcp_kbps as "Average KB/s", avg_tcp_eps as "Average Events/s"

And probably somehow join it with:

| tstats count values(host) AS host WHERE index=* BY index

The issue I see is that it searches dmc_get_forwarder_tcpin, which is equivalent to index=_internal sourcetype=splunkd group=tcpin_connections (connectionType=cooked OR connectionType=cookedSSL) fwdType=* guid=*, and I cannot find the indexes there. How can I connect it to each index?
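The forwarder-to-index mapping is indeed not in tcpin_connections, but the indexed events themselves carry host, which for directly-forwarded data usually matches the forwarder's hostname (intermediate forwarders break this assumption, so treat it as a heuristic). A hedged sketch joining on that:

```
`dmc_get_forwarder_tcpin` hostname=*
| stats values(fwdType) AS fwdType, values(sourceIp) AS sourceIp, latest(version) AS version, values(os) AS os BY hostname
| join type=left hostname
    [| tstats count WHERE index=* BY index, host
     | stats values(index) AS indexes BY host
     | rename host AS hostname]
```

The subsearch produces one row per event host with the list of indexes it appears in; the left join keeps forwarders even when no matching host is found.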
So, I have a chart command that works perfectly:

| chart sum(transactionMade) over USERNUMBER by POSTDATE

But I want my chart to have USERNUMBER and USERNAME. They are correlated, so that should not be an issue. I also want to add the team number, which has no correlation to USERNUMBER and USERNAME. Is it possible to have multiple fields after "over"? I can concatenate all the fields into one string, but it would be easier if they were separate columns. Thank you!
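chart accepts only a single field after over, but a common workaround is stats plus xyseries, carrying the extra columns through a combined row key and then splitting them back out (TEAMNUMBER is a stand-in for your actual field name):

```
| stats sum(transactionMade) AS total BY USERNUMBER USERNAME TEAMNUMBER POSTDATE
| eval row_key = USERNUMBER . "|" . USERNAME . "|" . TEAMNUMBER
| xyseries row_key POSTDATE total
| eval USERNUMBER=mvindex(split(row_key,"|"),0),
       USERNAME=mvindex(split(row_key,"|"),1),
       TEAMNUMBER=mvindex(split(row_key,"|"),2)
| fields - row_key
```

The three identifier fields come back as separate columns (appended after the POSTDATE columns; reorder with table if needed).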
We are currently migrating our Splunk server to a new one, and during the change there was a mix-up and data was sent to the old instance (about 12 hours' worth), which we would like to transfer to the new Splunk instance. My thought was to run a search on the old one and export the results. When I export in RAW format and then import into the new one, the data looks good, but the field extractions for WinEventLog are not applied as they should be (even though I use the same event type). How can I solve this? I've also tried exporting as XML, JSON, and CSV, but the data looks worse than with RAW.
How do I create a dataset inside a data model and add new fields to that dataset using the Splunk SDK for Java?
We need to monitor the Azure API Management self-hosted gateway and get all the traces. The gateway is an AKS container with image mcr.microsoft.com/azure-api-management/gateway:v2. Inside is a .NET application:

/app $ dotnet --info
Host:
  Version: 6.0.26
  Architecture: x64
  Commit: dc45e96840

.NET runtimes installed:
  Microsoft.AspNetCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
  Microsoft.NETCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.NETCore.App]

We would expect to monitor it the same way as any other .NET application, but we don't capture any business transactions (BTs) or traces. We also inject the following environment variables:

- name: CORECLR_PROFILER
  value: "{57e1aa68-2229-41aa-9931-a6e93bbc64d8}"
- name: CORECLR_ENABLE_PROFILING
  value: "1"
- name: CORECLR_PROFILER_PATH
  value: "/opt/appdynamics-dotnetcore/libappdprofiler.so"
- name: LD_DEBUG
  value: all
- name: LD_LIBRARY_PATH
  value: /opt/appdynamics-dotnetcore/dotnet
- name: IIS_VIRTUAL_APPLICATION_PATH
  value: "/"

Please help.
I'm using Splunk Enterprise v9.0.2.1 and the MQTT Modular Input app is installed. When receiving JSON input via the MQTT modular input, I get:

ERROR JsonLineBreaker [1946662 parsing] - JSON StreamId:11645015375736311559 had parsing error:Unexpected character while looking for value: 'W' - data_source="mqtt", data_host="local", data_sourcetype="jklg_json".

jklg_json props.conf:

DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = (\{\"message\":.*\})
NO_BINARY_CHECK = true
category = Custom
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = 1
SHOULD_LINEMERGE = false

Sample JSON:

{"message":"hi","name":"jklg"}

How can I resolve this issue?
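One thing worth checking: INDEXED_EXTRACTIONS = json makes the structured-data parser try to parse every event on this sourcetype as JSON, and "Unexpected character while looking for value: 'W'" suggests some payloads on the topic are not JSON, or that the custom LINE_BREAKER (which consumes the entire JSON object as the break capture group) leaves non-JSON fragments. A hedged alternative is to drop indexed extractions and use a plain newline breaker with search-time JSON parsing instead:

```
# props.conf -- sketch: search-time JSON parsing for MQTT payloads
[jklg_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
KV_MODE = json
NO_BINARY_CHECK = true
```

If non-JSON messages also arrive on the same topic, routing them to a different sourcetype keeps this one clean.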
Unable to import splunk-sdk and splunklib in Python. Here are the errors I'm getting while installing. Any suggestions?

splunklib:

error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.39.33519\\bin\\HostX86\\x64\\cl.exe' failed with exit code 2
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pycrypto
Running setup.py clean for pycrypto
Failed to build pycrypto
ERROR: Could not build wheels for pycrypto, which is required to install pyproject.toml-based projects

splunk-sdk:

line 18, in <module>
from splunklib.six.moves import map
ModuleNotFoundError: No module named 'splunklib.six.moves'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
Hello Splunkers!! I need your help to fix this issue. I am monitoring these files:

walmart_2.xml
walmart_3.xml
walmart_4.xml

Scenario 1: with the following configuration in inputs.conf, we are able to monitor the file in Splunk:

[monitor://D:\scada_server\walmart_2.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Scenario 2: with the wildcard configuration below in inputs.conf, we are not able to monitor the files in Splunk:

[monitor://D:\scada_server\walmart_*.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Please suggest some workaround.
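A workaround worth trying, on the assumption that the wildcard in the monitor path is what's failing: monitor the directory instead, and restrict the files with a whitelist, which is a regular expression rather than a glob:

```
# inputs.conf -- sketch: directory monitor with a regex whitelist
[monitor://D:\scada_server]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
whitelist = walmart_\d+\.xml$
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5
```

Note that `walmart_\d+\.xml$` matches only files named walmart_<digits>.xml; loosen the regex if other name patterns should also be picked up.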
Hi Team, we have a search head cluster and an indexer cluster in our current Splunk environment. Previously, data was sent to the indexer by multiple forwarders, each configured with that indexer's endpoint. Now, since it is a multi-indexer architecture, we need a common point for the forwarders to send their data to. Please provide suggestions on how to set up the forwarders -> deployment server -> cluster master architecture. I came across this post, but I am confused about the meaning of "deployment client": https://community.splunk.com/t5/Deployment-Architecture/How-to-set-up-new-deployment-server-in-a-clustered-environment/m-p/514847 Thanks in advance!
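For spreading forwarder traffic across an indexer cluster, the usual pattern is outputs.conf with indexer discovery: the cluster master hands each forwarder the current list of peer indexers, so there is no hard-coded endpoint. A "deployment client" is simply any Splunk instance that polls the deployment server for configuration apps, enabled via deploymentclient.conf. A sketch with placeholder hostnames and keys:

```
# outputs.conf on each forwarder -- indexer discovery via the cluster master
[indexer_discovery:cluster1]
master_uri = https://cluster-master.example.com:8089
pass4SymmKey = <discovery_key>

[tcpout:cluster1_peers]
indexerDiscovery = cluster1
useACK = true

[tcpout]
defaultGroup = cluster1_peers

# deploymentclient.conf on each forwarder -- makes it a deployment client
[target-broker:deploymentServer]
targetUri = deployment-server.example.com:8089
```

In practice both files are distributed as an app from the deployment server itself, so forwarders only need the deployment client stanza at install time.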