Can multiple wildcards be used in a serverclass.conf whitelist file?  whitelist.from_pathname = /lookup/host.txt   Examples: M*WEB* *WBS*
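For reference, a minimal sketch of how such a whitelist file is typically wired up. The server class name "webservers" is a placeholder; patterns in the referenced file generally follow the same wildcard rules as inline whitelist.&lt;n&gt; entries, so multiple asterisks per line should match accordingly:

```ini
# serverclass.conf -- sketch; "webservers" is a placeholder class name
[serverClass:webservers]
whitelist.from_pathname = /lookup/host.txt

# /lookup/host.txt -- one pattern per line, each line may contain
# multiple wildcards:
#   M*WEB*
#   *WBS*
```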
I have the Trellis below; is there a way to change the color for each trellis panel? My code, from a Classic Dashboard:    search Cu $t_c$ En $t_e$ | timechart span=1h avg(Value) as AvgValue_Secs by Category     I want something like this:
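One option worth trying in a Classic (Simple XML) dashboard is charting.fieldColors, which maps series names to colors and is generally honored when the chart is split into a trellis. A sketch, where the category names and hex colors are placeholders to be replaced with the actual values of the Category field:

```xml
<chart>
  <search>
    <query>search Cu $t_c$ En $t_e$ | timechart span=1h avg(Value) as AvgValue_Secs by Category</query>
  </search>
  <option name="charting.chart">line</option>
  <option name="trellis.enabled">1</option>
  <option name="trellis.splitBy">Category</option>
  <!-- "CategoryA"/"CategoryB" are placeholder series names -->
  <option name="charting.fieldColors">{"CategoryA": "#DC4E41", "CategoryB": "#53A051"}</option>
</chart>
```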
Hello to all my dear friends. We have an SH cluster with 5 search heads and Enterprise Security (ES). When I want to add a new threat list as a URL, I have to go to this address: ES APP\Configure\Data Enrichment\Threat Intelligence Management. But after clicking on this page, an "Oops" message is displayed. Can anyone help? Is the Input.local method the right method? Special thanks to Splunk
Hello to all my dear friends. In the past, I was able to import the logs of malware detected by McAfee into Splunk using Splunk DB Connect. Now my question is: can I get a log of access to the central management console of McAfee? Also, in which table are the logs related to USB connections stored, and how can I receive them in Splunk?
Hi, I have a 'complex' (for me at least) question.  What I want to achieve is the following: 1)  index=abc msg="*firewall off*" |table _time,hostname,msg >this will give me, for example: hostname = machine1 msg = "the firewall has been turned off" >> I want to be triggered if someone turns off the firewall. Now, the actual issue I have is the following:  A few seconds before this event, I might get a "system update event" that updates the firewall (agent update), which is OK, and I do NOT want this event. I would need to combine both queries into 1 alert.   2)  index=abc hostname=machine1 NOT msg="*system updated*" I want to see the result of 1, but only if it was not preceded by 2. I hope this makes sense.
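One common way to express "event A unless preceded by event B within N seconds" in a single search is to pull both message types, sort them, and use streamstats to look back at the previous event per host. A sketch, assuming a 60-second suppression window (tune the window and the like() patterns to the real log text):

```
index=abc (msg="*firewall off*" OR msg="*system updated*")
| sort 0 _time
| streamstats current=f window=1 last(_time) as prev_time last(msg) as prev_msg by hostname
| where like(msg, "%firewall off%")
| eval gap = _time - prev_time
| where isnull(prev_msg) OR NOT like(prev_msg, "%system updated%") OR gap > 60
| table _time, hostname, msg
```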
Hi All, I am having an issue creating an alias simply going from DestinationPort to dest_port for Sysmon EventID 3. I have tested:   index=my_index source=Sysmon | eval destinationPort=dest_port   I have seen in the Splunk TA for Sysmon that there is FIELDALIAS-dest_port=DestinationPort AS dest_port but I still cannot convert DestinationPort to dest_port at search time. Any suggestions, please? There are no other apps contradicting the precedence. Thank you!
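For reference, a field alias normally lives in props.conf under the sourcetype the events actually carry (the stanza name below is the usual Sysmon sourcetype, but it is an assumption; verify yours with | stats count by sourcetype), and the alias must be shared with the searching app:

```ini
# props.conf -- sketch; check the stanza matches your events' sourcetype
[XmlWinEventLog:Microsoft-Windows-Sysmon/Operational]
FIELDALIAS-dest_port = DestinationPort AS dest_port
```

A quick sanity check without touching .conf files is | rename DestinationPort as dest_port in a search; if that works but the alias does not, the stanza is probably not matching the sourcetype, or the alias's permissions are app-private.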
Hello,  We are implementing Splunk in our environment, and right now I import our vulnerability scan into Splunk every 7 days. My task is to filter the host and the CVE number and get an output showing which host/CVE is new in the newest scan ("New"), which was in the old scan but is not in the new scan ("Finished"), and which is in both scans ("Unchanged").  The problem is I do not have any information in the log data that says a host is finished or anything like that. I have only 4 fields: CVE, extracted_Host, Risk level (Critical, High and Medium), and _time of course. This is my attempt: index=vulnerability_scan Risk=Critical earliest=-7d latest=now | stats values(CVE) as CVE_7d by extracted_Host | appendcols [ search index=vulnerability_scan Risk=Critical earliest=now -7d latest=now | stats values(CVE) as CVE_now by extracted_Host ] | eval Status=case(isnull(CVE_7d) AND isnotnull(CVE_now), "New", isnotnull(CVE_7d) AND isnull(CVE_now), "Finished", isnotnull(CVE_7d) AND isnotnull(CVE_now), "Not Changed") | table extracted_Host, Status The problem with this is I only get the output "Finished", but most findings are also in the old scan, meaning they are "Unchanged". It would also be fine for me to split out the 3 outputs; then I would build a dashboard with the 3 pieces of information. I don't know if Splunk is the best tool to compare 2 timestamps like this? The time range is 7 days every time; maybe it will be shorter in the future, but right now it's 7 days.  Thanks for the help
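A sketch of an alternative approach: search both scan windows in one pass, tag each event as belonging to the old or new scan, and classify per host/CVE pair. The 14-day window and the "-7d" boundary are assumptions based on the weekly import described above:

```
index=vulnerability_scan Risk=Critical earliest=-14d latest=now
| eval scan=if(_time >= relative_time(now(), "-7d"), "new", "old")
| stats values(scan) as scans by extracted_Host, CVE
| eval Status=case(scans=="new", "New",
                   scans=="old", "Finished",
                   true(), "Unchanged")
| table extracted_Host, CVE, Status
```

A pair seen only in the new window gets scans="new", only in the old window scans="old", and in both windows a multivalue field that falls through to "Unchanged".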
I need to configure a service in Splunk ITSI; while creating a KPI I am facing an issue. I gave a search string, but when it generates a search I get an error in the result: Error in 'SearchParser': The search specifies a macro 'aggregate_raw_into_entity' that cannot be found. Reasons include: the macro name is misspelled, you do not have "read" permission for the macro, or the macro has not been shared with this application. Click Settings, Advanced search, Search Macros to view macro information.   Is there any way to modify the generated search?
Is RHEL 8.2 compatible with the Splunk agent? Is there any existing server running RHEL 8.2 with the Splunk agent? Or new costing? Please advise. Thank you in advance.
Hi In the example below, I clearly understand that the "hello world" will be updated in a Splunk event { "time": 1426279439, // epoch time "host": "localhost", "source": "random-data-generator", "sourcetype": "my_sample_data", "index": "main", "event": "Hello world!" } curl -H "Authorization: Splunk 12345678-1234-1234-1234-1234567890AB" https://localhost:8088/services/collector/event -d '{"event":"hello world"}' Now imagine that my json file contains many items like below { "time": 1426279439, // epoch time "host": "localhost", "source": "random-data-generator", "sourcetype": "my_sample_data", "index": "main", "event": "Hello world!" } { "time": 1426279538, // epoch time "host": "localhost", "source": "random-data-generator", "sourcetype": "my_sample_data", "index": "main", "event": "Hello everybody!" } Should the curl command look like this? curl -H "Authorization: Splunk 12345678-1234-1234-1234-1234567890AB" https://localhost:8088/services/collector/event -d '{"event":}'  Last question: instead of using a command prompt to send the json logs to Splunk, is it possible to use a script to do that? Or something else? Does anybody have good examples of that? thanks
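For what it's worth, the HEC event endpoint accepts several JSON event objects concatenated back to back in a single request body, so a script can build one batched payload instead of one curl call per event. A minimal Python sketch that builds such a payload; the metadata values are taken from the example above, and the token/endpoint remain placeholders:

```python
import json

def build_hec_batch(events):
    """Concatenate one HEC envelope per event into a single POST body.
    HEC's /services/collector/event endpoint accepts back-to-back
    JSON objects in one request."""
    return "".join(
        json.dumps({
            "time": e["time"],
            "host": "localhost",
            "source": "random-data-generator",
            "sourcetype": "my_sample_data",
            "index": "main",
            "event": e["event"],
        })
        for e in events
    )

payload = build_hec_batch([
    {"time": 1426279439, "event": "Hello world!"},
    {"time": 1426279538, "event": "Hello everybody!"},
])
print(payload)
```

The resulting payload can then be sent with the same curl call as above (curl ... -d "$payload"), or posted directly from the script with urllib or requests if you prefer to skip the shell entirely.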
My current Splunk infra setup is clustered for search heads and indexers, and we are using a deployer and cluster master to manage configs for the respective SHs and IDXs. For example, can I manually place an updated config on SH1 and then run a rolling restart so the members sync/replicate with each other? This is in the event the deployer is down. But eventually, once the deployer is up, we will place the updated config on the deployer, so that when we run a sync it will not affect/remove the file from the SH cluster. Will there be any issues in this scenario?
So after opening a case with Splunk tech support because we were unable to upgrade our Windows 2019 servers in place from Splunk version 9.0.0 to 9.1.1, we were instructed to back up the etc directory, then uninstall Splunk, do a new install of 9.1.1, and copy the old etc directory back over. Well, we did just that, except we also put it on new/different hardware, and now we can't log in to Splunk Web: we get the login screen, it takes our credentials, and then we get the three dots of death... Any help/advice is tremendously appreciated
    OCTOBER 2023  See More, Act Faster, and Simplify Investigations with Splunk Enterprise Security 7.2 The latest release of Splunk Enterprise Security 7.2 introduces capabilities that deliver an improved workflow experience for simplified investigations; enhanced visibility and reduced manual workload; and customized investigation workflows for faster decision making. Learn more about the release in our blog, our Product News & Announcement Post and join the Tech Talk to watch the Splunk team walk through the new capabilities.  New to Enterprise Security? Check out The Beginner’s Guide to Security Monitoring for Enterprises. Splunk Mission Control 3.0 Release We’re happy to announce the release of Mission Control 3.0 which includes several new and exciting features made available to Splunk Enterprise Security Cloud users. Read the Splunk Community post to learn more about the new features available with this release. Security Content from the Splunk Threat Research Team The Splunk Threat Research Team released 8 new detections and 1 new analytic story in the last month. Read the Product News & Announcements post to learn more. The team also published the following blogs: Sharing is Not Caring: Hunting for Network Share Discovery Defending the Gates: Understanding and Detecting Ave Maria (Warzone) RAT Mockbin and the Art of Deception: Tracing Adversaries, Going Headless and Mocking APIs New blogs to help you make the most of Splunk Security Revisiting the Big Picture: Macro-level ATT&CK Updates for 2023 Threat Hunting for Dictionary-DGA with PEAK Deep learning in security: text-based phishing email detection with BERT model The PEAK Threat Hunting Framework  The PEAK Threat Hunting Framework takes the experience of top threat hunters and translates their insights to help you gain the most value from threat hunting across your entire security operations. 
Download your copy of “The PEAK Threat Hunting Framework” to discover more about the framework, including new hunt types and processes, defined deliverables, actionable metrics, and prioritized detection types.  See PEAK in action during our Model-Assisted Threat Hunting Powered by PEAK and Splunk AI webinar.  Unveiling the Complete Great Resilience Quest! We are excited to announce the release of the final two levels - “Proactive Response” & “Optimized Experiences” of the Great Resilience Quest! You can now fully explore your path to resilience and learn more about implementing security use cases in this interactive experience.    Platform Updates The Business Case for Unifying Security and Observability  As businesses and government organizations become more digital, more and more systems become mission-critical.  Given the potential business impact, executives and board members should accept these as business issues, ensuring system security and resilience must be addressed as part of business planning, risk management, and operations. Dive into the research from the Enterprise Strategy Group and Splunk to learn more. Introducing Federated Search for Amazon S3 for Splunk Cloud Platform Splunk is pleased to announce the general availability of Federated Search for Amazon S3, a new capability that allows you to search data from Amazon S3 buckets directly from Splunk Cloud Platform without the need to ingest it.  Enterprises Realize Benefits from Migrating to Cloud with Splunk Hear from other customers, leaders and practitioners who chose migrating to Splunk Cloud Platform as a better way to drive business value, efficiency and scale. Syslog in Splunk Edge Processor Supercharges Security Operations with Palo Alto Firewall Log Reduction Splunk Edge Processor now supports syslog-based ingestion protocols, making it well-equipped to wrangle complex and superfluous data. 
Users can deploy Edge Processor as an end-to-end solution for handling syslog feeds such as PAN logs, including the functionality to act as a syslog receiver, process and transform logs and route the data to supported destination(s). Go beyond the buzz and start harnessing the power of ML and AI. Learn about the different AI/ML features across Splunk and leverage the recommended apps and use cases. Check out the new AI and ML tab on the Essentials Board to kickstart your journey. IDC Report: Enterprises Report Benefits of Migrating to Splunk Cloud Platform As enterprises’ rapid expansion to the cloud continues, IT leaders are continuously looking for ways to focus more on driving business value, efficiency and scale. Moving deployments to the cloud delivered as a software-as-a-service (SaaS) offers a win-win for current Splunk customers. In this analyst report, IDC examines the drivers and benefits that drove these enterprises to migrate deployments from on-premises to Splunk Cloud Platform delivered as a service. Customers documented for IDC the various business outcomes and benefits after migrating to the cloud and shared their journeys. Tech Talks, Office Hours and Lantern Tech Talks What’s New In Splunk SOAR? Join the Splunk SOAR team as they share more on the latest and greatest updates in Splunk SOAR. Register Now > Nov 8 - What’s New in Splunk SOAR 6.2 Nov 15 - Advance Your App Development with the Visual Studio Code Extension Streamline Data Ingestion with Deployment Server Essentials Learn the essential knowledge required for ingesting and managing any variety of data sources in Splunk, regardless of its origin or scale. Consider this your “Deployment Server: 101” essentials crash-course. Tune in here > Optimizing Customer Experiences with Splunk's Digital Experience Monitoring Discover Splunk's approach to digital experience monitoring. 
Splunk experts discuss the different elements of Splunk's Digital Experience Monitoring (DEM) portfolio and how it can help you optimize your customer experience. Read the Blog >   Community Office Hours Join our upcoming Community Office Hour sessions, where you can ask questions and get guidance! Security: Enterprise Security - Wed, Oct 25 at 1pm PT/4pm ET (Register here) Security: Risk-Based Alerting (RBA) - Wed, Nov 8  (Register here)  Security: SOAR - Wed, Nov 29  (Register here) Splunk Search - Wed, Dec 13  (Register here)   Splunk Lantern  This month’s Lantern blog highlights two sets of articles that illustrate how you can effectively use multiple parts of the Splunk product suite to solve some of your most crucial observability problems. These articles show the synergies between Splunk products and features, showcasing how they work together to enhance your outcomes beyond each product’s individual parts.  Education Corner Introducing Free eLearning for SOAR Administrators Developing Playbooks for Splunk Mission Control is targeted to help SOAR administrators gain the skills needed to harness the full potential of Splunk Mission Control. In this eLearning, SOAR playbook developers will learn how to use the SOAR Visual Playbook Editor (VPE) to create, test, and deploy playbooks for Mission Control. Admins will also learn how Mission Control and SOAR communicate, how playbooks operate within the Mission Control environment, and how to effectively utilize playbooks to interact with Mission Control incidents and response plans. This is your chance to take your SOAR expertise to “great new heights.”    Splunk Training Units Give You a Pass to Class At Splunk, we are committed to ensuring that our learning is accessible to everyone, everywhere. And, our customers understand the serious need for and value of a skilled workforce, which is why they add paid-for training into their Splunk software contracts. 
If you’re looking to take advantage of our eLearning with Labs or Instructor-led training courses, check in with your Customer Organization Manager who helps allocate training units (TUs) and tracks usage. Read our FAQ for more information about enrolling in STEP and accessing TUs for your courses.
    OCTOBER 2023      New Option to Get Your Data In - AWS PrivateLink Now Enabled for Observability Cloud We’ve enabled AWS PrivateLink for Splunk Observability Cloud. Now you can experience the AWS gold standard for securely sending your data directly to the Observability ingest endpoint from your AWS environment. Splunk Observability Cloud’s PrivateLink is enabled on ALL AWS realms and is limited to connectors in the same AWS region. Learn more.     More log integrations between Splunk Platform and Observability Cloud We’re making it easier for Observability Cloud users to seamlessly bring their log data into Splunk Cloud or Enterprise Platform. With our new “Open in Splunk platform” button available in Log Observer UI, engineers wishing to analyze their logs in more detail can easily transpile the logs from Observability Cloud to Splunk Cloud and take advantage of best-in-class logging capabilities. Log Observer Connect users will be able to swiftly jump from in-context troubleshooting to deep-dive log investigations in one unified platform. Learn more.     Latest Enhancement to Splunk App for Content Packs 2.0: Saved Searches Deactivated by Default Up until now, installing Splunk App for Content Packs on top of ITSI or IT Essentials Work would result in up to 300+ ‘saved searches’ being activated by default. This often led to ITSI performance issues and higher SVC compute for things customers did not use. To improve this experience and give users better control, in Splunk App for Content Packs 2.0 we’ve introduced a significant change in behavior where all the ‘saved searches’ are deactivated by default. Users can then enable only the saved searches they need. Learn more.     Splunk Observability Delivers Up to 243% ROI  Splunk commissioned Forrester Consulting to conduct a Total Economic Impact™ (TEI) study to examine the potential return on investment (ROI) enterprises can realize by deploying Splunk Observability. 
A decreased number of system outages, improved developer productivity, reduced MTTR, labor cost savings and enhanced employee experience are just some of the business benefits enabled by Splunk Observability.  Understand the Total Economic Impact of Your Splunk Observability Investment. Read the Splunk Observability Total Economic Impact Report™.     Unveiling the Complete Great Resilience Quest! We are excited to announce the release of the final two levels of the Great Resilience Quest! The Quest is an interactive experience that provides guidance and resources for real-world use cases at each stage of the resilience journey. Learn more about how you can use Splunk Observability Cloud to meet your key business needs. Join the quest and become eligible for grand prizes!        Platform Updates The Business Case for Unifying Security and Observability  As businesses and government organizations become more digital, more and more systems become mission-critical.  Given the potential business impact, executives and board members should accept these as business issues, ensuring system security and resilience must be addressed as part of business planning, risk management, and operations. Dive into the research from the Enterprise Strategy Group and Splunk to learn more. Introducing Federated Search for Amazon S3 for Splunk Cloud Platform, now GA Splunk is pleased to announce the general availability of Federated Search for Amazon S3, a new capability that allows you to search data from Amazon S3 buckets directly from Splunk Cloud Platform without the need to ingest it.  Enterprises Realize Benefits from Migrating to Cloud with Splunk Hear from other customers, leaders and practitioners who chose migrating to Splunk Cloud Platform as a better way to drive business value, efficiency and scale. Go beyond the buzz and start harnessing the power of ML and AI. Learn about the different AI/ML features across Splunk and leverage the recommended apps and use cases. 
Check out the new AI and ML tab on the Essentials Board to kickstart your journey. IDC Report: Enterprises Report Benefits of Migrating to Splunk Cloud Platform As enterprises’ rapid expansion to the cloud continues, IT leaders are continuously looking for ways to focus more on driving business value, efficiency and scale. Moving deployments to the cloud delivered as a software-as-a-service (SaaS) offers a win-win for current Splunk customers. In this analyst report, IDC examines the drivers and benefits that drove these enterprises to migrate deployments from on-premises to Splunk Cloud Platform delivered as a service. Customers documented for IDC the various business outcomes and benefits after migrating to the cloud and shared their journeys.     Tech Talks, Office Hours and Lantern Tech Talks OpenTelemetry: What’s Next. Logs, Profiles, and More  Tuesday, November 14 | 11AM PT / 2PM ET Register Starting with Observability: OpenTelemetry Best Practices Watch On-Demand Streamline Data Ingestion with Deployment Server Essentials Learn the essential knowledge required for ingesting and managing any variety of data sources in Splunk, regardless of its origin or scale. Consider this your “Deployment Server: 101” essentials crash-course. Tune in here > Optimizing Customer Experiences with Splunk's Digital Experience Monitoring Discover Splunk's approach to digital experience monitoring. Splunk experts discuss the different elements of Splunk's Digital Experience Monitoring (DEM) portfolio and how it can help you optimize your customer experience. Read the Blog >   Community Office Hours Join our upcoming Community Office Hour sessions, where you can ask questions and get guidance! 
Security: Enterprise Security - Wed, Oct 25 at 1pm PT/4pm ET (Register here) Security: Risk-Based Alerting (RBA) - Wed, Nov 8  (Register here)  Security: SOAR - Wed, Nov 29  (Register here) Splunk Search - Wed, Dec 13  (Register here)   Splunk Lantern  This month’s Lantern blog highlights two sets of articles that illustrate how you can effectively use multiple parts of the Splunk product suite to solve some of your most crucial observability problems. These articles show the synergies between Splunk products and features, showcasing how they work together to enhance your outcomes beyond each product’s individual parts.    Education Corner More Visibility with Splunk O11y Education Courses  Maximize your organization’s O11y investment with new, free Splunk Education self-paced courses. Start with Optimizing Metrics Usage with Splunk Metrics Pipeline to learn how best to optimize your data intake into Splunk Observability Cloud using Splunk Metrics Pipeline Management. Move onto Introduction to Log Observer Connect to learn how to discover trends in log data and use the product for root cause analysis. Or, use those training units to take our instructor-led training, Using SignalFlow in Splunk Observability Cloud, which covers how to use the SignalFlow analytics language in Splunk Observability Cloud.   Splunk Training Units Are Your Ticket to Paid Training At Splunk, we are committed to ensuring that our learning is accessible to everyone, everywhere. And, our customers understand the serious need for and value of a skilled workforce, which is why they add paid-for training into their Splunk software contracts. If you’re looking to take advantage of our eLearning with Labs or Instructor-led training courses, check in with your Customer Organization Managers who help allocate training units (TUs) and track usage. Read our FAQ for more information about enrolling in STEP and accessing TUs for your courses.    Until Next Month, Happy Splunking
OCTOBER 2023  GovSummit 2023 Registration is now open for Splunk’s largest, free annual event for government and agency leaders and decision-makers. We’re looking forward to bringing together government IT and security professionals for an industry-leading event. Hear firsthand from government and industry experts about how agencies are adapting and thriving — and building digital resilience to improve their cyber strategy. Event Details: Ronald Reagan Building and International Trade Center, Washington, D.C. Wednesday, December 14, 2023 | 7:30 am - 4:30 pm EST Workshops: Splunk offers free virtual workshops. Join the top technical experts at Splunk for hands-on learning on Splunk SOAR, Insider Threat, and Enterprise Security.  Times vary and there is no cost to attend.   11/2 Enterprise Security, 1:00 - 4:00 ET.  Registration Page 11/9 Investigating with Splunk, 1:00 - 5:00 ET, Registration Page 11/16 Security Lunch & Learn, 1:00 - 4:00 ET, Registration Page 11/30 Splunk 4 Rookies, 1:00 - 4:00 ET, Registration Page Booth and Table Top Events: Be sure to stop by and see Splunk to learn about the latest trends at these industry events: 11/2 FL IT Leadership Forum, Tallahassee, FL 11/9 Kubecon, Chicago, IL 11/7 TechNet IndoPac, Honolulu 11/13 AL Digital Govt Summit, Montgomery, AL 11/13 Alamo Ace, San Antonio, TX 11/15 NLC City Summit 2023, Atlanta, GA 11/16 CyberTalks, Washington, DC 11/27 AWS re:Invent, Las Vegas, NV New Content: M-21-31 mandate:  Software supply chain attacks are increasingly complex and damaging, underscoring the importance of increased government visibility throughout a cybersecurity incident. In this blog, Splunk discusses the M-21-31 mandate and how Splunk plays an integral part in protecting the nation's most critical assets.
A Guide to Embracing a Zero Trust Security Model in Government Ebook   Explore the directive pivot to a zero trust architecture, discover five key issues in the shift to hybrid/multi-cloud and why the DoD’s “never trust, always verify" mindset is critical to achieving cyber resiliency so they can become more informed and prepare for the next cyberattack.   Platform Updates   The Business Case for Unifying Security and Observability  As businesses and government organizations become more digital, more and more systems become mission-critical. Therefore, application performance and security incidents can have a direct impact on the top and bottom line, as an hour of system downtime can equate to the loss of thousands or even millions of dollars. Given the potential business impact, executives and board members should accept these as business issues, ensuring system security and resilience must be addressed as part of business planning, risk management, and operations. Dive into the research from the Enterprise Strategy Group and Splunk to learn more.    Enterprises Realize Benefits from Migrating to Cloud with Splunk Hear from other customers, leaders and practitioners who chose migrating to Splunk Cloud Platform as a better way to drive business value, efficiency and scale.   IDC Report: Enterprises Report Benefits of Migrating to Splunk Cloud Platform As enterprises’ rapid expansion to the cloud continues, IT leaders are continuously looking for ways to focus more on driving business value, efficiency and scale. Moving deployments to the cloud delivered as a software-as-a-service (SaaS) offers a win-win for current Splunk customers. In this analyst report, IDC examines the drivers and benefits that drove these enterprises to migrate deployments from on-premises to Splunk Cloud Platform delivered as a service. Customers documented for IDC the various business outcomes and benefits after migrating to the cloud and shared their journeys. 
HSBC: Accelerated time to value and increased scalability by >300% Pacific Dental Services: Increased operational efficiencies by more than 40% GAF: Realized annual cost savings of 20%   Go beyond the buzz and start harnessing the power of ML and AI. Learn about the different AI/ML features across Splunk and leverage the recommended apps and use cases. Check out the new AI and ML tab on the Essentials Board to kickstart your journey.   Tech Talks, Office Hours and Lantern Tech Talks Streamline Data Ingestion with Deployment Server Essentials Learn the essential knowledge required for ingesting and managing any variety of data sources in Splunk, regardless of its origin or scale. Consider this your “Deployment Server: 101” essentials crash-course. Tune in here > Optimizing Customer Experiences with Splunk's Digital Experience Monitoring Discover Splunk's approach to digital experience monitoring. Splunk experts discuss the different elements of Splunk's Digital Experience Monitoring (DEM) portfolio and how it can help you optimize your customer experience. Read the Blog > Community Office Hours Join our upcoming Community Office Hour sessions, where you can ask questions and get guidance on all things Enterprise Security, RBA, SOAR, and Splunk Search! Security: Enterprise Security - Wed, Oct 25 at 1pm PT/4pm ET (Register here) Security: Risk-Based Alerting (RBA) - Wed, Nov 8 at 1pm PT/4pm ET (Register here)  Security: SOAR - Wed, Nov 29 at 1pm PT/4pm ET (Register here) Splunk Search - Wed, Dec 13 at 1pm PT/4pm ET (Register here)   Splunk Lantern  This month’s Lantern blog highlights two sets of articles that illustrate how you can effectively use multiple parts of the Splunk product suite to solve some of your most crucial observability problems. These articles show the synergies between Splunk products and features, showcasing how they work together to enhance your outcomes beyond each product’s individual parts.  Education Corner New Course Alert! 
Troubleshooting Splunk Enterprise 9.1 Are you ready to tackle the mysteries of Splunk Enterprise 9.1? Then you’re ready for our latest course, Troubleshooting Splunk Enterprise 9.1. This 9-hour, paid instructor-led course designed for Splunk administrators covers topics and techniques for troubleshooting a standard Splunk distributed deployment using the tools available with Splunk Enterprise. This hands-on, lab-oriented class will help you gain troubleshooting experience before attending more advanced courses. Debug a distributed deployment today!   Free Splunk Training for University Students, Faculty, and IT Staff Bridging the growing cyber skills gap is critical to securing our digital world, which is why we continue to grow and scale our offerings within the Splunk Academic Alliance Program to better serve students, faculty, and staff within non-profit universities, colleges, and schools. Currently, we offer 21 free, self-paced courses with hands-on labs as part of our commitment to training the next generation of cyber experts. Find out more about the Splunk Academic Alliance Program, including a one-year, 10GB license for Splunk Enterprise and Foundational training all at no cost.     Until next month, Happy Splunking  
I would like to allowlist a URL from my dashboards so that no more redirection warnings pop up.  Per the documentation, I can do this by editing web-features.conf on my SHs. What would be the proper way to push this as a bundle? I tried creating and modifying web-features.conf in an app context on the SHC deployer (../shcluster/apps/myapp/default/web-features.conf) but I still got the pop-up (yes, I restarted the SHs). After using "apply shcluster-bundle", I used btool AND show config to verify the config changes appeared on the SHs. No dice. If I modify web-features.conf directly on the SHs (../etc/system/local/web-features.conf), it works perfectly. Thank you! My edited web-features.conf below: [feature:dashboards_csp] dashboards_trusted_domain.domain1 = *myurl.com
How to convert the below JSON to a table?

{
  "Group10": {
    "owner": "Abishek Kasetty",
    "fail": 2,
    "total": 12,
    "agile_team": "Punchout_ReRun",
    "test": "",
    "pass": 6,
    "report": "",
    "executed_on": "Mon Oct 23 03:10:48 EDT 2023",
    "skip": 0,
    "si_no": "10"
  },
  "Group09": {
    "owner": "Lavanya Kavuru",
    "fail": 45,
    "total": 190,
    "agile_team": "Hawks_ReRun",
    "test": "",
    "pass": 42,
    "report": "",
    "executed_on": "Sun Oct 22 02:57:43 EDT 2023",
    "skip": 0,
    "si_no": "09"
  }
}

Expected output:

agile_team      pass    fail
Hawks_ReRun     42      45
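One SPL pattern for turning an object keyed by group name into rows uses the JSON eval functions available in Splunk 8.1 and later; a sketch, where the index and sourcetype are placeholders for wherever these events land:

```
index=my_index sourcetype=my_json
| eval keys=json_array_to_mv(json_keys(_raw))
| mvexpand keys
| eval row=json_extract(_raw, keys)
| spath input=row
| table agile_team, pass, fail
```

json_keys() lists the top-level keys (Group10, Group09), mvexpand makes one event per key, json_extract() pulls out each sub-object, and spath extracts its fields into columns.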
Hi Team, We are observing a discrepancy in calculation when the millisecond part of the timestamp is less than 100 ms. Example: Response time: “2023-10-23 14:46:14.84” Request time: “2023-10-23 14:46:13.948”   The “Response time – Request time” value should be “136ms”, but in Splunk it shows as “890ms”.   While calculating, Splunk treats the inbound value as “2023-10-23 14:46:14.840” (840 ms) instead of 84 ms, because the fraction has only 2 digits. So, is there any possibility to resolve this discrepancy at the Splunk query level or .conf level?    Regards, Siva.
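The underlying problem is that the producer prints milliseconds without zero-padding: 84 ms comes out as “.84”, which any standard parser reads as 840 ms. One fix is to left-pad the fractional field to 3 digits before parsing; a Python sketch of the idea (in SPL, the same left-pad can be done with replace() on the string before strptime()):

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    """Parse 'YYYY-mm-dd HH:MM:SS.frac' where the millisecond field may
    have been printed without zero padding: '.84' means 84 ms, so the
    fraction is left-padded to 3 digits ('84' -> '084') before parsing."""
    base, frac = ts.rsplit(".", 1)
    return datetime.strptime(f"{base}.{frac.zfill(3)}", "%Y-%m-%d %H:%M:%S.%f")

delta = parse_ts("2023-10-23 14:46:14.84") - parse_ts("2023-10-23 14:46:13.948")
print(round(delta.total_seconds() * 1000))  # 136
```

Note this only makes sense if the source truly emits unpadded milliseconds; if “.84” actually means 0.84 seconds, Splunk's 890 ms result is correct and nothing needs fixing.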
Hi All,   By default I get the top 10 columns and the "head" command retrieves 10 results. May I know if we can control the number of columns retrieved? index= <Splunk query>| timechart span=15m count by dest_domain usenull=f useother=f | head 15 e.g. _time | column1 | ... | column15
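The column count actually comes from timechart, not head: timechart's limit option (default 10) controls how many split-by series are kept as columns, while head limits rows. A sketch, with the index/query placeholders kept from the question:

```
index=my_index <Splunk query>
| timechart span=15m limit=15 useother=f usenull=f count by dest_domain
| head 15
```

Setting limit=0 keeps all series if you want every dest_domain as a column.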