Hi all, urgent help needed here. I am checking whether any activity was done by a user on a client machine, so I use this query in Splunk search: [host="domain controller's server name" "user's account name"]. From the results, I see multiple login/logout sessions within seconds, and multiple Kerberos events. I can confirm that no physical user is using the client machine. So may I ask why there are still WinEventLog (login/logout) events?
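For what it's worth, rapid logon/logoff pairs with Kerberos traffic on an "idle" machine are very often network or service logons (machine accounts, scheduled tasks, SMB access) rather than a person at the console. One way to check is to break the events down by logon type; this sketch assumes the field names extracted by the Splunk Add-on for Microsoft Windows (EventCode, Logon_Type, Account_Name, Workstation_Name), so adjust to your environment:

```spl
host="<domain controller's server name>" source="WinEventLog:Security"
    (EventCode=4624 OR EventCode=4634) "<user's account name>"
| stats count by EventCode, Logon_Type, Account_Name, Workstation_Name
| sort - count
```

Logon_Type 2 is an interactive (console) logon; types 3 (network) and 5 (service) usually indicate automated activity such as file-share access or services authenticating, not a user at the keyboard.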
Hello Splunk Community! Are you making the most of the Splunk Education training units provided by your company? If you're not, we've got a fun program to incentivize you: the Splunk Learning Rewards Program.

While we believe life-long learning has its own rewards, the Splunk Learning Rewards Program offers current and new learners points for each class they complete, which can then be redeemed for super-fun Splunk swag. And, if you need more reasons to earn those points now, you can choose from limited-time-only Summer-themed swag on the Learning Rewards site. We think getting decked out in some Splunk swag is the perfect incentive to register and complete the courses before those training units expire!

How It Works
Participating in the Splunk Learning Rewards Program is a breeze. Here's what you need to do:
- Complete any paid-for Splunk Education course: Choose from a variety of options, including eLearning with Labs and Instructor-Led Training, tailored to your preferences and learning style.
- Feel accomplished: As you finish each course, take pride in your new knowledge and skills, knowing that you're one step closer to earning cool rewards.
- Access the Learning Rewards site: Head over to the Learning Rewards site and log in with your Splunk.com credentials. There, you can view the points you've earned and redeem them for fantastic Splunk swag. Newly accrued points show up within 48-72 hours after completing a course.
- Repeat: Don't stop with just one course! Keep participating, learning, and earning points for more exciting rewards.
- Optional bonus: Wear a new Splunk T-shirt to proudly validate your newly acquired skills!

We've made it even easier for you to access the Splunk Learning Rewards Program. You can now find a link to it in the Training and Certification main navigation. So, no more hunting around: just click and get ready to explore! Please note that the Splunk Learning Rewards Program is exclusively available for customer-learners.
Unfortunately, partners, Splunkers, SplunkWork+ Programs, or subscription customers are not eligible for this program. Want to Learn More? If you have any questions or need further details about the Splunk Learning Rewards Program, don't worry! We've got you covered with our comprehensive Splunk Learning Rewards FAQs. Additionally, you can check out this fun video that explains how the program works in a delightful way.   Don't miss out on this amazing opportunity to level up your Splunk skills and earn exciting rewards through the Splunk Learning Rewards Program. Start your learning journey today, and let the rewards come rolling in!      Happy Learning (and earning!)    - Callie Skokos on Behalf of the Splunk Education Crew
Hi, I have created a Splunk email alert and it seems to be triggering twice. Below are the query and the alert configuration. Query: index="liquidity" AND cf_space_name="pvs-ad00008034" AND (msg.Extended_Fields.ValueAmount = "0" OR msg.Extended_Fields.ValueAmount = "NULL" OR msg.Results.Message="EWI Load process is completed*") | table _time, msg.Extended_Fields.DataSource, msg.Extended_Fields.ValueAmount, msg.Results.Message | sort _time | rename msg.Extended_Fields.ValueAmount as ValueAmount | rename msg.Results.Message as Message | rename msg.Extended_Fields.DataSource as DataSource   Trigger condition: search Message = "EWI Load process is completed*" | stats count as Total | search Total > 0
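One hedged suggestion (a sketch, not a diagnosis of your exact setup): duplicate emails usually come from the trigger configuration rather than the SPL, e.g. the alert firing once per result instead of once per run, or the same alert being saved/scheduled twice. Since your custom trigger condition only checks for at least one "EWI Load process is completed*" row, you can move that filter into the search itself and use the simpler built-in trigger "Number of Results > 0" with "Trigger: Once", adding throttling if repeats within a window are unwanted:

```spl
index="liquidity" cf_space_name="pvs-ad00008034"
    msg.Results.Message="EWI Load process is completed*"
| rename msg.Extended_Fields.ValueAmount as ValueAmount,
         msg.Results.Message as Message,
         msg.Extended_Fields.DataSource as DataSource
| table _time, DataSource, ValueAmount, Message
| sort _time
```

With the filter in the base search, the trigger condition no longer re-evaluates anything, which removes one common source of double-firing; also check under Settings > Searches, reports, and alerts that the alert exists only once.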
Configure automatic redaction of log message fields or patterns before indexing and storage.

Announcing the release of the Sensitive Data Masking capability for logs as part of Data Security on Cloud Native Application Observability, powered by the Cisco FSO Platform. Why Sensitive Data Masking for logs? | How does the Cisco FSO Platform power Sensitive Data Masking for logs? | What's coming in the Security Insights FSO? | Additional resources

Why Sensitive Data Masking for logs?
Masking sensitive data in logs is crucial for ensuring the protection and privacy of sensitive information. If exposed, personally identifiable information (PII), financial details, and healthcare records pose significant risks. By masking this data in logs, organizations can prevent unauthorized access, comply with data protection regulations, mitigate insider threats, reduce the attack surface for potential breaches, and enable effective auditing and investigation without compromising sensitive information.

How does the Cisco FSO Platform power Sensitive Data Masking for logs?
The Sensitive Data Masking capability for logs serves as the first proof point of the Cisco FSO Platform's tenant-specific configuration, establishing a solid foundation for rapid development and expansion. This capability leverages the robustness of the Platform by using the Logs and Events Processing service to obfuscate the log data and store the tenant-specific masking configurations in the Orion configuration store. By harnessing the Dashbase store, the Platform ensures that masked log records are securely stored, eliminating the need for raw message storage and ensuring adherence to the security standards within organizations.

Screenshots: Masking Rules list view; Masking Expressions list view; Log Explorer view displaying a masked Social Security Number (SSN).

What's coming in the Security Insights FSO?
The Cisco FSO platform will offer the essential capability for manually configuring masking expressions and rules, and upcoming phases will introduce the Security Insights module providing automated sensitive data masking. First unveiled at Cisco Live 2023, the Security Insights FSO platform module will launch in Q1 FY24, delivering data security (sensitive data leakage) insights prioritized with Business Risk Observability (BRO). The data security capability within this module will also expand beyond logs to encompass events and traces. This evolution will significantly bolster the functionality and effectiveness of sensitive data masking, empowering organizations with more comprehensive and automated protection for their sensitive information.

Additional resources
Mask Sensitive Data in the Documentation
A Demonstration of Sensitive Data Masking on Logs in the Knowledge Base
Cisco introduces full-stack observability enhancement: Business Risk Observability in the Blog
A demonstration of Sensitive Data Masking on Logs

Available as of v23.6, Cisco AppDynamics' Sensitive Data Masking capability for logs is a crucial element in ensuring the protection and privacy of sensitive information. By masking personally identifiable information (PII), financial details, healthcare records, and more in logs, organizations can prevent unauthorized access, comply with data protection regulations, mitigate insider threats, reduce the attack surface for potential breaches, and enable effective auditing and investigation without compromising sensitive information. Sensitive Data Masking for Logs is part of Data Security on Cloud Native Application Observability, powered by the Cisco FSO Platform.

IN THIS ARTICLE: Demo video | Demo Chapters + Notes | Additional resources

Demo | Sensitive Data Masking on Logs

Demo Chapters and Notes

Getting to the Masking Expressions tab (00:00:30)
Click the Configure option on the left-hand side panel and look for the new tab called Security. Under Security, click Data Security. You will land on a page with a Masking Rules tab and, next to it, the Masking Expressions tab.

Understanding the Masking Expressions list (00:00:58)
Under the Masking Expressions tab, you will see a list of previously created masking expressions. To determine which are provided out of the box and which have been created by a user, refer to the Type column: "Default" indicates out-of-the-box masking expressions, and "Custom" refers to those created by a user.

Overview of the Masking Expression fields (00:01:30)
The first field is the expression name, here "Custom SSN". The second field is the regex for matching data. The third field is Data Sensitivity, referring to the sensitivity of the data, which can vary from organization to organization, ranging from low, medium, and high to critical.
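As an illustration of the regex field (a hypothetical pattern for this walkthrough, not necessarily what the out-of-the-box Default expression uses), a custom SSN masking expression could match hyphenated US Social Security Numbers with:

```regex
\b\d{3}-\d{2}-\d{4}\b
```

This matches strings like 123-45-6789; once a rule using the expression is enabled, anything the pattern matches in scoped logs is replaced by the configured masking character.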
Using masking expressions once they've been created (00:02:47)
To use these masking expressions, go to the Masking Rules tab. You'll see a list of the masking rules that have already been created. To determine whether the rules are enabled, look at the Monitoring Status column. Here, you can see that mask_social_security_number is enabled while Mask_Credit_Card_Rule is currently disabled.

Rule field overview (00:03:10)
With the Mask SSN rule as an example: The first field in a rule is its name, here Mask Social Security Number. The second field shows the scope of the rule, referring to the set of logs to which the masking expressions should be applied. In this scenario, you will need to provide this value in the log format field within your log collector's YAML file; here, k8s:ad_ecommerce_appdcloud_demo4_logs. The third field is where you select the masking expression that should be applied to this set of logs; here, custom_ssn, which we previously created in the Masking Expressions list. You can also create a new masking expression right here, with the same fields we saw earlier when going through the Masking Expressions tab. As you select masking expressions, they appear as labels right below the Masking Expressions tab. The fourth field is how you want to mask your data: you can select either the X character or a custom string. We will select the X character in this scenario, which means that any masked data will be replaced with the character "x".

Saving and enabling the rule (00:05:02)
Let's save this rule. Once a rule has been saved, it is disabled by default. In this scenario, we edited a rule that had already been enabled, so the monitoring status for this rule is already enabled.
Once a masking rule has been enabled, any data ingested from the scope specified in the masking rule will be scanned for that pattern, and matching data will be masked.

View the masked data in the Log Explorer (00:05:41)
Let's look at how the masked data appears. In the Log Explorer, search for SSN in the search bar. You can then see log records that had an SSN entry, now masked with the character "x". Within the log store, the data is saved in this masked format; we do not save any raw messages, to comply with security standards.

Additional resources
Mask Sensitive Data in the Documentation
Announcing Sensitive Data Masking for logs in News & Announcements
So we rebuilt our SHs by completely blowing them away and starting with a fresh 9.1.0.1 install. Then, just for kicks, before making an SH cluster I installed Splunk Security Essentials on one of the SHs, and the app worked wonderfully. But when I made that SH part of a cluster, it gave errors. I am attaching snippets of both so you can see. Keep in mind that all that changed was that I put the SH into a cluster, and then I got the errors.
Hello, support didn't provide a way to see the current value of squash_threshold in our clustered environment. They suggest increasing it from 2000 to 3000, for instance. How do you set this value? Thanks for your help.
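For reference (please verify against the limits.conf spec for your Splunk version), squash_threshold is a setting in limits.conf under the [metrics] stanza. In an indexer cluster you would normally put it in an app pushed from the cluster manager rather than editing each peer by hand. A sketch:

```conf
# limits.conf, e.g. in master-apps/_cluster/local/ on the cluster manager,
# then distributed to the peers with a configuration bundle push
[metrics]
squash_threshold = 3000

# To see the current effective value on any instance:
#   $SPLUNK_HOME/bin/splunk btool limits list metrics --debug | grep squash_threshold
```

The btool output also shows which file the effective value comes from, which answers the "current state" part of the question.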
Hi There, I have just checked the Cloud Monitoring Console after receiving an email that noted some apps were ready to be upgraded to Python 3. I am using Splunk Cloud and saw the following information about my universal forwarders. I have attached a screenshot, but the date doesn't appear to make sense and the newer version is showing as being outdated. Any help would be appreciated, Jamie
Hi team, is this feature available in a cloud instance (free tier, trial version): sharing a Studio dashboard URL with an anonymous user, such that the URL can be opened in a browser and on mobile devices? Thanks.
AppDynamics monitors every execution of a business transaction within an application that has been instrumented, either using our agents or through OpenTelemetry. Both Business Transaction and Process Snapshots capture the details necessary for gaining a deeper understanding of method call performance...answering questions like, what line of code is taking the longest to run?  Documentation: Transaction Snapshot Diagnostic Sessions
Hi, We know how to change WebUI SSL certificate to a custom one. How about the certificate used for other ports (like JobManager)? Is it possible?
In the last month, the Splunk Threat Research Team (STRT) has had two releases of new security content via the Enterprise Security Content Update (ESCU) app (v4.6.0 and v4.7.0). With these releases, there are 8 new detections, 16 updated detections, and 7 new analytic stories now available in Splunk Enterprise Security via the ESCU application update process.

Content highlights include:
- New searches that focus on potential malicious activities related to suspicious Windows registry modification and malicious command-line behavior, including potential exploitation attempts against Citrix ADC
- A new analytic story for the detection and investigation of unusual activities related to BlackByte ransomware
- Detections for CVE-2023-36884, an unpatched zero-day vulnerability affecting Windows and Microsoft Office products, and CVE-2023-3519, a vulnerability in NetScaler (formerly Citrix) Application Delivery Controller (ADC) and NetScaler Gateway
- A new analytic story to detect task scheduling activities related to MITRE ATT&CK technique T1053
- New searches to detect activities related to Amadey, a type of malware that primarily operates as a banking Trojan
- Detections for potential exploitation attempts against VMware vRealize Network Insight that align with the characteristics of CVE-2023-20887

New Analytic Stories:
- BlackByte Ransomware
- CVE-2023-36884 Office and Windows HTML RCE Vulnerability
- Citrix Netscaler ADC CVE-2023-3519
- Scheduled Tasks
- Amadey
- Graceful Wipe Out Attack
- VMware Aria Operations vRealize / CVE-2023-20887

New Detections:
- Windows Modify Registry EnableLinkedConnections
- Windows Modify Registry LongPathsEnabled
- Windows Modify Registry Risk Behavior
- Windows Post Exploitation Risk Behavior
- Windows Common Abused Cmd Shell Risk Behavior
- Citrix ADC Exploitation CVE-2023-3519
- Windows PowerShell ScheduleTask
- Windows Files and Dirs Access Rights Modification Via Icacls

Updated Detections:
- O365 Add App Role Assignment Grant User
- MSHTML Module Load in Office Product
- Office Document Spawned Child Process To Download
- Office Product Spawn CMD Process
- Office Product Spawning BITSAdmin
- Office Product Spawning CertUtil
- Office Product Spawning MSHTA
- Office Product Spawning Rundll32 with no DLL
- Office Product Spawning Windows Script Host
- ICACLS Grant Command
- Registry Keys Used For Persistence
- PowerShell 4104 Hunting
- Detect Baron Samedit CVE-2021-3156 Segfault
- Detect Baron Samedit CVE-2021-3156
- Windows System Shutdown CommandLine
- VMWare Aria Operations Exploit Attempt

The team has also published the following blogs:
- Amadey Threat Analysis and Detections
- I am the Snake Now: Analysis of Snake Malware

For all our tools and security content, please visit research.splunk.com.

— The Splunk Threat Research Team
Dear team, I have a Splunk lookup with two fields, username and location. The lookup is populated every time the location is US. However, the location keeps changing. So I would like to write a query which first checks whether the username exists in the lookup and, if it does, matches the location in the event field against the lookup field. If the location from the event field doesn't match the lookup field, the query should remove that username from the lookup. Any ideas or suggestions would be appreciated.
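A sketch of one way to do this. It assumes the lookup is named users.csv with fields username and location, and that the events carry the same field names (the index name and time range are placeholders to adjust): start from the lookup, pull each user's latest event location with a subsearch, and write back only the rows that still match, keeping users with no recent events:

```spl
| inputlookup users.csv
| join type=left username
    [ search index=<your_index> earliest=-24h
      | stats latest(location) as event_location by username ]
| where isnull(event_location) OR location = event_location
| fields username location
| outputlookup users.csv
```

Scheduled as a saved search, this prunes mismatched usernames on each run. Note that outputlookup overwrites the file, so test against a copy of the lookup first.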
I have the Microsoft Azure App for Splunk 2.0.1, and I have data for `azure-consumption` (sourcetype=azure:billing) via the Splunk Add-on for Microsoft Azure 4.0.3. The Billing dashboard queries filter on properties.subscriptionId, but this is not in my data; I have subscriptionGuid and subscriptionName. We have recently moved to an MCA single billing model, and it looks like this caused the issue, as the legacy dashboards stopped working at that time. It appears the type Microsoft.Consumption/usageDetails has completely changed; I notice that the 'kind' has changed from 'legacy' to 'modern'. Has anyone else encountered this and found a solution? Are the dashboards supposed to work for my data? Thanks
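While waiting for an app update that understands the "modern" usageDetails schema, a hedged workaround is to clone the affected dashboard panels and normalize the field names yourself. The field names below come from the description above and may need adjusting to your events:

```spl
sourcetype=azure:billing
| eval subscriptionId=coalesce('properties.subscriptionId', subscriptionGuid)
| stats count by subscriptionId, subscriptionName
```

The single quotes around 'properties.subscriptionId' are required in eval because the field name contains a dot; coalesce then falls back to subscriptionGuid for the new MCA-shaped events, so one panel can serve both schemas.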
I had an issue with the KV store ("KV Store initialization failed"). To get past it, I followed the steps below, and afterwards the "KV Store initialization failed" error no longer appears. But all the entries under the KV store lookups have vanished. Please help me with how to restore the entries; if a restore is not possible, what other options are there?
Steps I followed:
- Stop Splunk
- Rename the current mongo folder to old
- Start Splunk; a new mongo folder is created with all the components
Hello, we have 1 master server (receiver/indexer) and 50 slave servers; all are Linux servers. Now we need to install the universal forwarder on all 50 Linux machines. Is there any way to automate this installation process without any manual work? It would be really helpful to get a solution. Thanks, Ragav
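Common approaches are a configuration-management tool (Ansible, Puppet, Salt) or a simple ssh loop plus a Splunk deployment server for ongoing config. As a minimal sketch, the script below only generates the per-host commands (the hostnames, tarball name, deployment-server address, and credentials are all placeholders for your environment); review the generated file, then run the commands over ssh or fold them into your tool of choice:

```shell
#!/bin/sh
# Sketch: generate the UF install commands for each host instead of running them.
# HOSTS, TARBALL, and DEPLOY_SERVER are placeholders -- adjust for your site.
HOSTS="host01 host02 host03"                      # ...list all 50 hosts here
TARBALL="splunkforwarder-9.x-Linux-x86_64.tgz"    # downloaded UF package
DEPLOY_SERVER="deploy.example.com:8089"           # deployment server for central config

for h in $HOSTS; do
  {
    echo "scp $TARBALL $h:/opt/"
    echo "ssh $h 'tar xzf /opt/$TARBALL -C /opt \
      && /opt/splunkforwarder/bin/splunk start --accept-license --answer-yes --no-prompt \
      && /opt/splunkforwarder/bin/splunk set deploy-poll $DEPLOY_SERVER -auth admin:changeme'"
  } | tee -a uf_install_commands.txt
done
```

Once each forwarder phones home to the deployment server, outputs.conf and inputs.conf can be pushed centrally as deployment apps, so the per-host step only has to be done once.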
Hi, I'm looking to change the color of a field based on its value in a monitoring context (e.g. failed vs. successful). The table cell color is what I am looking to change. Any idea how to do this would be great.
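In a classic Simple XML dashboard (not Dashboard Studio), a table cell can be colored from its value with a <format> element on the table. A minimal sketch; the field name status, its values, and the search are stand-ins for your own:

```xml
<table>
  <search>
    <query>index=_internal | stats count by status</query>
  </search>
  <format type="color" field="status">
    <colorPalette type="map">{"failed": #DC4E41, "successful": #53A051}</colorPalette>
  </format>
</table>
```

The map palette colors exact value matches; for numeric thresholds there is also a ranges-style colorPalette. Dashboard Studio exposes an equivalent through the table's column formatting options in the configuration panel.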
Hello, I have installed the Splunk Enterprise free version on my PC, and I have installed the Splunk App for Lookup File Editing, but unfortunately it doesn't work. When I try to upload a .csv file I get the following error: "File is binary or file encoding is not supported, only utf-8 encoded files are supported". I tried to change the permissions on the app's folder on Windows, but that did not resolve the problem. I tested with a very simple CSV, and in the result the columns test3, test, test2 were divided. I tried saving the .csv in every format. Thanks for the support!
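That error usually means the file is not actually UTF-8: Excel's "Unicode Text" and some regional "CSV" saves produce UTF-16 or ANSI, which the lookup editor rejects as binary. A hedged fix is to re-encode before uploading, e.g. with iconv on Linux/macOS (on Windows, Notepad's "Save As" with Encoding set to UTF-8 does the same). The sketch below fabricates a UTF-16 file to stand in for the problematic export:

```shell
# Fabricate a UTF-16 CSV as a stand-in for the problematic Excel export...
printf 'test,test2,test3\nred,blue,green\n' | iconv -f UTF-8 -t UTF-16 > sample_utf16.csv

# ...then re-encode it to UTF-8 so the lookup editor accepts it.
iconv -f UTF-16 -t UTF-8 sample_utf16.csv > sample_utf8.csv
cat sample_utf8.csv
```

Checking the real file first with `file yourfile.csv` will tell you which encoding you are starting from.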
Hey, we're in the process of upgrading our Splunk single instances from 8.2.5 to 9.1.0.1 due to EOL. We have two Splunk instances: production, which is under the Enterprise license, and test, which is under the Free license. We first started upgrading our test Splunk instance with the Free license, and we instantly saw these errors:

07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" [modular_input:349] [execute] [834704] Modular input: Splunk Assist exit with exception: Traceback (most recent call last):
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/etc/apps/splunk_assist/bin/assist/modular_input.py", line 342, in execute
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" self.do_run(input_definition["inputs"])
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py", line 66, in do_run
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" if not should_run(self.logger, self.session_key):
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py", line 27, in should_run
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" sh = is_search_head(log, session_key)
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/etc/apps/splunk_assist/bin/assist/serverinfo.py", line 153, in is_search_head
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" cluster_mode = get_cluster_mode(log, session_key)
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/etc/apps/splunk_assist/bin/assist/serverinfo.py", line 257, in get_cluster_mode
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" raiseAllErrors=True
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" File "/opt/splunk/lib/python3.7/site-packages/splunk/rest/__init__.py", line 646, in simpleRequest
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" raise splunk.LicenseRestriction
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" splunk.LicenseRestriction: [HTTP 402] Current license does not allow the requested action
07-31-2023 07:01:44.412 +0000 ERROR ExecProcessor [670742 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py" .

This Splunk Assist issue seems to come from running Splunk on the Free license. I tried to disable the splunk_assist app, but it wouldn't let me:

Cannot disable app: splunk_assist

As a result of the previous errors, we are seeing UI errors as well:

Unable to load common tasks. Refresh the page to try again.

Any idea on how to proceed?
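When the UI refuses to disable splunk_assist, a workaround often used (hedged: verify on your version, and back up the app directory first) is to disable the app on disk with a local app.conf override and then restart Splunk:

```conf
# $SPLUNK_HOME/etc/apps/splunk_assist/local/app.conf
[install]
state = disabled

# then: $SPLUNK_HOME/bin/splunk restart
```

After the restart, the Assist modular inputs should stop launching, which removes the repeating LicenseRestriction tracebacks; if UI errors persist, check splunkd.log for any remaining splunk_assist scripted inputs.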
Hello, my Splunk courses are not working. After clicking on a course, it just keeps loading and I am not getting access to the course. Please let me know.