Hi Splunk team. I wonder which version of Cyber Vision is supported by the API release v2.0 for Splunk Enterprise?
Please forgive me, I am new to Splunk. I'm trying to create a dashboard that visualizes successful/failed logins. I don't have anyone I work with who is knowledgeable or experienced enough to help, so I started using ChatGPT to help develop these searches. After I got the base setup from ChatGPT, I tried to fill in the sourcetypes. But now I'm getting this error: Error in 'EvalCommand': The expression is malformed. Please let me know what I need to do to fix this. Ask away please; it'll only help me get better.

```
index=ActiveDirectory OR index=WindowsLogs OR index=WinEventLog
    (
        (sourcetype=WinEventLog (EventCode=4624 OR EventCode=4625))   # Windows logon events
        OR (sourcetype=ActiveDirectory "Logon" OR "Failed logon")     # Active Directory logon events (adjust keywords if needed)
    )
| eval LogonType=case(
    EventCode=4624, "Successful Windows Login",
    EventCode=4625, "Failed Windows Login",
    searchmatch("Logon"), "Successful AD Login",
    searchmatch("Failed logon"), "Failed AD Login")
| eval user=coalesce(Account_Name, user)   # Combine Account_Name and user fields
| eval src_ip=coalesce(src_ip, host)       # Unify source IP or host
| stats count by LogonType, user, src_ip
| sort - count
```
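One likely culprit, offered as a hedged guess rather than a confirmed diagnosis: SPL does not support `#` inline comments, so the parser treats the `# ...` annotations as part of the search and eval expressions, which can produce exactly this "malformed expression" error. Also, `case()` returns the first matching branch, and since `searchmatch("Logon")` also matches events containing "Failed logon", the failed-login test should come first. A sketch with both issues addressed (field names like `Account_Name` are assumptions carried over from the original query):

```
index=ActiveDirectory OR index=WindowsLogs OR index=WinEventLog
    (
        (sourcetype=WinEventLog (EventCode=4624 OR EventCode=4625))
        OR (sourcetype=ActiveDirectory ("Logon" OR "Failed logon"))
    )
| eval LogonType=case(
    EventCode=4624, "Successful Windows Login",
    EventCode=4625, "Failed Windows Login",
    searchmatch("Failed logon"), "Failed AD Login",
    searchmatch("Logon"), "Successful AD Login")
| eval user=coalesce(Account_Name, user)
| eval src_ip=coalesce(src_ip, host)
| stats count by LogonType, user, src_ip
| sort - count
```

If you do want comments in SPL, recent Splunk versions support wrapping them in triple backticks inside the search string instead of `#`.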
I am investigating missing data in Splunk over a 3-4 hour period, where gaps were observed in the _internal index as well as in performance metrics like network and CPU data, but I still can't find the potential root cause. Please help me understand what more I need to investigate to find the root cause of the data gap in Splunk. Symptoms:
- Gap in the _internal index data
- Gap visible in the network performance data
- Gap in the CPU performance data
I want to add a sound alert to my Dashboard Studio dashboard. Is that even possible?
Our Splunk add-on app was created with Python modules (like cffi, cryptography, and PyJWT) placed under the app's /bin/lib folder, and the add-on works as expected. When we try to upgrade Splunk Enterprise from 8.2.3 to 9.3, our add-on fails to load the Python modules and throws the error 'No module named '_cffi_backend''. Note: we are running on Python 3.7 and have updated the Splunk Python SDK to the latest, 2.0.2.
How do you get a saved search to ignore a specific automatic lookup? The reason for wanting to do this is that the lookup being used is very large and the enrichment is not needed for this particular search. Using something like `| fields - FieldA FieldB` did not speed up the search (where FieldA and FieldB are the fields matched on in the automatic lookup). When the automatic lookup's permissions are restricted to just one app, the saved search runs very fast, but I do not believe keeping it like that is an option. Ideally there would be a setting just for this one saved search so that it would not know the automatic lookup exists. Thanks in advance for any suggestions.
With the app export functionality, app developers and cloud admins can now export app files and associated user changes made in the cloud. This feature makes it easier to reuse application components, troubleshoot configuration issues, and understand and support existing applications in the Splunk Cloud Platform. Watch this Tech Talk to learn:
- How to keep local copies and exported snapshots of your apps and associated app data
- How to simplify the development, management, and debugging of Splunk Cloud apps
- Recommended tactics with ACS's APIs, CLI, and Terraform

Watch the full Tech Talk here:
We are excited to introduce our enhanced UI that brings together AppDynamics and Splunk Observability. This is the first step toward building an integrated Splunk Observability Cloud and AppDynamics-powered observability experience. The changes will be available for all users in mid-November.

Profile: Easily accessible user preferences and actions
We consolidated user profiles, preferences, and application-level actions such as logging out, switching organizations, and changing color themes into a single profile menu. It is now intuitively located in the top right corner for fast and easy navigation.

Top Bar Navigation: Enhancements to notify users about important product updates
New updates to the top bar include repositioning the "What's New" and "Help and Support" functionality so users are notified about important product updates and have easy access to support resources. Previously, this functionality was less accessible and hidden in the Settings menu.

Left Navigation Panel: Seamless navigation across products and capabilities
We updated the look and feel of the left navigation panel and added labels to the icons to make in-product navigation easier and more accessible. We continue to support expanded and collapsed versions of the left navigation panel to address user preferences.

Home Page: Consistent home page experience across Splunk Observability and AppDynamics
A new layout drives a more cohesive experience across Splunk Observability and AppDynamics. We will further enhance the Splunk Observability home page with personalized content to meet users' troubleshooting needs.

If you have any feedback or suggestions, or would like to be a design partner for this, please reach out to your Splunk representative. Keep an eye out for these changes, which will be implemented automatically in mid-November! As usual, if you have any questions, don't hesitate to get in touch with your Splunk representative.
Welcome to another episode of "Splunk Smartness," where we explore how Splunk Education can revolutionize your approach to problem-solving and learning Splunk. I'm your host, Callie Skokos, and today we're joined by Patrick Tatro, Security Advisor and Splunk Administrator from Wisconsin. Patrick, it's great to have you here!

Patrick Tatro: Thanks, Callie! Excited to share my experiences and hopefully inspire a few folks along the way.

Callie: Patrick, I hear you have quite the interesting life outside of work, managing a forest and practicing bow hunting. Can you tell us a bit about that?

Patrick: Absolutely! I manage about 100 acres of forest here in Wisconsin, focusing on keeping the ecosystem healthy by targeting invasive species like buckthorn. It's challenging but rewarding, especially when I get to use the wood in my sauna. I like to compare it to coding or using Splunk; it's all about managing resources creatively and effectively.

Callie: I love that comparison! Now, speaking of Splunk, how did you initially get into using it?

Patrick: My journey with Splunk started back in 2019 when I was supporting a customer with their vulnerability scanning architecture. I found the existing tools weren't up to the task, so I immersed myself in learning and understanding Splunk. I started with free online training courses, attended an app developer bootcamp at .conf, and just dedicated myself to learning, much like how I learned bow hunting. I've used all of Splunk's education resources, from online training to instructor-led courses and reading books, to go from beginner to advanced levels.

Callie: Wow, so the training helped you learn which of the Splunk products?

Patrick: I use Splunk Cloud and Splunk Enterprise Security mostly. These tools allow me to manage and analyze data efficiently, which is critical in both my corporate role and my consulting work for K-12 education.

Callie: It's so great that you've found incredible value from Splunk.
How do you incorporate Splunk into your work with schools and private companies?

Patrick: In K-12, I help schools analyze their data to better assess their needs and guide them on security. For smaller companies, I offer a group membership to a private consulting group, which includes one-on-one sessions. Splunk's APIs help automate much of the onboarding and management process, making everything more streamlined.

Callie Skokos: That's incredibly innovative. You've mentioned the powerful insights Splunk offers. Can you expand on that?

Patrick: One of the biggest insights is confidence in data handling. With Splunk, as long as you can get the data into the system, the only limit is your creativity. For example, in cybersecurity, instead of "asking" the tool to identify problems, which often misses the root cause, Splunk allows me to integrate data from various sources into a single dashboard, providing a comprehensive view and deeper context to security events.

Callie: It sounds like Splunk has really transformed how you approach security. Have you taken any more Splunk training recently?

Patrick Tatro: Yes, I attended the Data Science Bootcamp at .conf24. It was a fantastic introduction to machine learning and really highlighted the opportunities for applying the technology and concepts within Splunk. I also participated in the Boss of the SOC competition and learned a lot from team collaboration there.

Callie Skokos: Incredible. So, I imagine you have a few Splunk certifications?

Patrick Tatro: I do: I'm certified as a Core User, Power User, Splunk Cloud Admin, Splunk ES Admin, and Cyber Defense Analyst. These certifications have been crucial in building a robust understanding of what Splunk can do.

Callie Skokos: That's great. Do you engage with other Splunk users to share best practices and get new tips and tricks?

Patrick Tatro: Yes, I'm very active in the Splunk Community. I use the Slack channels and find the resource-sharing and Q&A immensely helpful.
It's a community-driven platform, and the support from other users, especially the Splunk Trust members, consistently boosts my confidence in using Splunk effectively.

Callie Skokos: As you look to the future, where else do you think Splunk can take you?

Patrick Tatro: I'm really interested in either becoming a full-time Splunk Consultant or an App Developer, or possibly both. The goal is to continue leveraging Splunk to create solutions that make data actionable and insightful, not just for myself but for clients across various industries.

Callie Skokos: Patrick, it's been fantastic hearing about your journey and how Splunk has played a role in it. Thanks so much for joining us today.

Patrick Tatro: Thank you, Callie. I appreciate the opportunity to share my story!

_______________________________________________

That's it for today's episode of "Splunk Smartness." Thank you all for tuning in. We'll be back next time with more insights on how Splunk can enhance your tech skills and career, and help you make your organization more resilient. Until then, stay smart and keep Splunking!
Hi, can you please help me create a multi-line chart with the below data? Data in the format below is fetched in Splunk, and I need to create a multi-line chart from it:
- X axis: Time
- Y axis: column1
- Count1, Count2, and Count3 should be the 3 lines in the multi-line chart

The last command in the Splunk query that fetches the data in table form is:

| table column1 column2 Time Count1 Count2 Count3

With this data, can we create a multi-line chart in Splunk?
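A hedged sketch of one way this usually works: in a Splunk line chart, the first column of the final table drives the X axis and each remaining numeric column becomes its own series. So, assuming Time holds the timestamp and the other columns aren't needed in the chart, reordering and trimming the table should yield three lines:

```
...your base search...
| table Time Count1 Count2 Count3
```

Then switch the visualization to Line Chart. If column1 or column2 need to split the series instead, `chart`/`xyseries` would be the usual alternatives, depending on the data shape.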
I would like to update my universal forwarders to send data to 2 separate endpoints for 2 separate Splunk environments. How can I do this using my Deployment Server? I already have an app that I will use for the UF update.
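A minimal sketch of the common approach, assuming all data should be cloned to both environments and that the deployment app carries an outputs.conf (hostnames, ports, and group names below are placeholders): listing two target groups in defaultGroup makes the forwarder clone events to both.

```
# outputs.conf inside the deployment app pushed by the Deployment Server
[tcpout]
defaultGroup = env1_indexers, env2_indexers

[tcpout:env1_indexers]
server = idx1.env1.example.com:9997, idx2.env1.example.com:9997

[tcpout:env2_indexers]
server = idx1.env2.example.com:9997, idx2.env2.example.com:9997
```

If only some inputs should go to one environment, the usual technique is setting _TCP_ROUTING on the individual inputs.conf stanzas to point them at a specific tcpout group.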
Hi, before asking I did try to search, but I was not able to locate a thread covering this kind of datetime value, so I had to start this new thread. I have datetime values in string format like:

Thu 10 Oct 2024 08:48:12:574 EDT

Sometimes the value may be null; that's how it is. What I need to do is derive separate columns from it:
- Day name, like Thursday
- Day of month, like 10
- Month, like Oct
- Year, like 2024
- Week number, like 2 or 3
- Time part as a separate column, like 08:48:12 (not worried about EDT)
- The time components separated again: 08 as Hour, 48 as Min, 12 as Sec (not worried about ms)

I'm still looking for threads covering this kind of value, but again, sorry, this is a basic one that just needs more searching.
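A hedged sketch, assuming the string sits in a field called raw (rename to match your data): stripping the trailing zone first sidesteps strptime's inconsistent %Z handling, and strftime then derives each column. Null inputs simply produce null outputs, which matches the "sometimes null" case. Since "week like 2 or 3" could mean week of month, both readings are shown:

```
| eval ts=strptime(replace(raw, "\s+\w+$", ""), "%a %d %b %Y %H:%M:%S:%3N")
| eval DayName    = strftime(ts, "%A"),
       DayOfMonth = strftime(ts, "%d"),
       Month      = strftime(ts, "%b"),
       Year       = strftime(ts, "%Y"),
       WeekOfYear = strftime(ts, "%V"),
       WeekOfMonth= ceiling(tonumber(strftime(ts, "%d")) / 7),
       TimePart   = strftime(ts, "%H:%M:%S"),
       Hour       = strftime(ts, "%H"),
       Min        = strftime(ts, "%M"),
       Sec        = strftime(ts, "%S")
```

For the sample value above, this would give DayName=Thursday, Month=Oct, Year=2024, TimePart=08:48:12, and so on.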
We are excited to announce several updates for Edge Processor aimed at hardening overall product resiliency and directly addressing some of the top feedback we've received from our customers, including support for additional data sources! You can always check out your Data Management home page for the latest Edge Processor release notes. Here's what's new:

Data export queuing resiliency (learn more)
Filled exporter queues can now back-pressure data to upstream clients to prevent data loss, which closely mimics the behavior of splunkd's ingestion pipelines. Along with this change, we've also removed the single-threading bottleneck which previously limited overall throughput.

Edge Processor receiver acknowledgement from HEC sources (learn more)
You can now use the new Edge Processor receiver ACK feature to confirm whether the Edge Processor successfully received data sent by HEC data sources, thereby bolstering reliability and reducing the risk of data loss. This also unlocks data sources that require a form of acknowledgement, such as Data Firehose (see below).

AWS Data Firehose support (release notes)
Users can now directly ingest logs from AWS Data Firehose into Edge Processor. With Data Firehose's integration across 20+ AWS services, you can now easily stream data from sources like Amazon CloudWatch, SNS, AWS WAF, Network Firewall, IoT, and more. See this Lantern article for a step-by-step guide!

Customize timezone in event for syslog data (release notes)
Admins now have the ability to flexibly denote specific time zones on RFC 3164 syslog data that does not have a time zone set. This applies to pipelines bound for the destinations 'AWS S3' and 'Splunk platform using the services/collector/event HEC endpoint'.

Optimized Edge Processor restart behavior (release notes)
This feature reduces the number of restart loops for misconfiguration or error states in Edge Processor.
Previously, these types of restart loops could impact other services and result in degraded resource performance, but no more! Feedback is always welcome at ideas.splunk.com or in the Data Management Slack channel (request access here). Head to the Data Management resource hub for more resources. Enjoy!

Splunk Data Management Team
I am having trouble getting a .json file into Splunk through the backend to help support a customized dashboard. Is there a particular step I need to follow to get it in through the deployer?
I need a query that lists the URLs a particular host has reached out to within a particular time window, e.g., in the last 24 hours. Please help.
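A hedged sketch of the usual shape of such a search, assuming you have web proxy or firewall data; the index, sourcetype, and field names (src, url) are placeholders that depend entirely on your data source and any CIM mapping:

```
index=<your_proxy_index> sourcetype=<your_proxy_sourcetype> src="10.1.2.3" earliest=-24h
| stats count by url
| sort - count
```

Running `| fieldsummary` against a few sample events first is a quick way to discover what the actual source and URL fields are called in your data.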
I have onboarded data from a system that scatters actual events over many logging events. Successful and failed logins in particular cause me some headache.

Successful login:

```
<timestamp> Connection 'id123' from '192.168.1.100' has logged onto the server
<timestamp> User 'johndoe' logged on (Connection id='id123')
[ Time passes until John eventually decides to log off again ]
<timestamp> Connection 'id123' from has logged off the server
```

Failed login:

```
<timestamp> Connection 'id123' from '192.168.1.100' has logged onto the server
<timestamp> Connection 'id123' from has logged off the server
```

Of course, I can fiddle around with transaction or even stats or whatever to list successful and failed logins or create an alert for them. However, that is absolutely not elegant. What is the best practice for getting this data nicely streamlined with eventtypes and tags?
Where can I download the installer for Splunk Enterprise 9.2.1?  
I am trying to verify the username by editing the code, but I do not know where to submit the code. I checked the documentation, but it only advises on how to edit the code; it does not mention where to submit it.