I'm working with a table of conversation data. All conversations start out as a bot chat and can be escalated to a human agent; the ConversationId remains persistent through the escalation. Each ConversationEntry is a message, inbound or outbound, in a MessagingSession, and ConversationId ties each inbound/outbound entry to its parent MessagingSession. All MessagingSessions I'm looking at will have an EventType=ChatbotEstablished; not all will have an EventType=BotEscalated. I can't figure out how to calculate the percentage of conversations that had an escalation. Below is my query and its stats output. I'm trying to figure out how to get BotEscalated/ChatbotEstablished.

index=sfdc sourcetype=sfdc:conversationentry EntryType IN ("ChatbotEstablished", "BotEscalated")
| stats count(ConversationId) as EntryCount by EntryType

EntryType           EntryCount
BotEscalated        3
ChatbotEstablished  10
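One way to get the ratio is to count each entry type in a single stats call and divide. A minimal sketch, keeping the base search from the post (the established, escalated, and escalation_pct field names are mine):

index=sfdc sourcetype=sfdc:conversationentry EntryType IN ("ChatbotEstablished", "BotEscalated")
| stats count(eval(EntryType="ChatbotEstablished")) as established, count(eval(EntryType="BotEscalated")) as escalated
| eval escalation_pct = round(100 * escalated / established, 2)

If the same EntryType can be logged more than once per conversation, counting distinct ConversationId values per type (for example with dc(eval(if(EntryType="BotEscalated", ConversationId, null())))) counts conversations instead of entries.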
I am having an issue with Splunk version 9.0.4.1: it is not giving me the correct amount of license usage for my Splunk instance. All the data appears as required; however, the license usage is not being reported, showing us as having unlimited usage.
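As a cross-check on what the license manager is actually metering, a sketch against the internal license usage log (run on the license manager; _internal retention applies):

index=_internal source=*license_usage.log* type=Usage
| eval GB = b / 1024 / 1024 / 1024
| timechart span=1d sum(GB) as daily_license_GB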
All, I am having this issue with my Splunk environment. I keep getting "ingestion_latency_gap_multiplier has exceeded configured value" health warnings. It says it is an issue with my indexers. Any information would help; I am running version 9.0.4.1.
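To confirm whether events really are arriving late, a quick sketch comparing index time to event time (scope the base search to your busiest indexes rather than index=* if volume is a concern):

index=* earliest=-4h
| eval latency_sec = _indextime - _time
| stats avg(latency_sec) as avg_latency max(latency_sec) as max_latency by index, sourcetype
| sort - max_latency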
September 2023 Edition

Hayyy Splunk Education Enthusiasts and the Eternally Curious! We’re back with another edition of indexEducation, the newsletter that takes an untraditional twist on what’s new with Splunk Education. We hope the updates about our courses, certification, and technical training will feed your obsession to learn, grow, and advance your careers. Let’s get started with an index for maximum performance readability: Training You Gotta Take | Things You Needa Know | Places You’ll Wanna Go

Training You Gotta Take

Cybersecurity | Cyber Skills Education is a Hot Topic
There’s big news on the cybersecurity front! On July 31, 2023, the Biden Administration announced its National Cyber Workforce and Education Strategy to tackle the shortage of cybersecurity pros in the U.S. by boosting cyber skills education. At Splunk, we've had our eyes on the growing skills gap for a while, which is why we continue to offer more and more free cybersecurity and skills training – available and accessible anywhere, anytime – with free self-paced training for every Splunk security product. In fact, cybersecurity courses are just part of our curriculum of almost 50 free self-paced courses – including our newest, “The Cybersecurity Landscape” and “Security Operations and the Defense Analyst.” Plus, an entire catalog of self-paced training with labs and instructor-led courses. Stay cyber-savvy, friends.
Gotta Get the Skills | Cyber Skills are Hot

Observability | O11y, O11y Oxen Free
Did you know that observability – in technical terms – is abbreviated as O11y and is defined as the ability to measure the internal states of a system by examining its outputs? Organizations use observability tools to improve the performance of their distributed IT systems – solutions like those from Splunk designed to help organizations gain insight into and understanding of their applications and infrastructure. If you’re new to Splunk, find out how it works by taking our free O11y training, including our newest courses: Introduction to Log Observer Connect and Optimizing Metrics Usage with Splunk Metrics Pipeline Management.
Gotta Learn to O11y | Be Cool at School

Things You Needa Know

What is Splunk | Out with the Old, In with the New
Unlike that old, well-worn Splunk t-shirt, some things really do need to be retired. This holds true for the free Splunk Education “What is Splunk” eLearning, which is being incorporated into an updated “Intro to Splunk” eLearning and will no longer be available after December 31, 2023. Have no fear! No knowledge will be lost – it’s simply an exercise in ‘combine and delete’ – and it is no longer a prerequisite for many of the Splunk Education Learning Paths. Oh, and while we have you, in addition to our curriculum of free coursework, we also offer self-paced learning with labs and virtual instructor-led courses taught by really cool humans.
Needa Know What’s New | Get Trained Anywhere, Anytime

ALPs | Splunk Education Spans the Globe
Have you ever wondered how you can access Splunk Education Training and Certification in your own region, in your own language, with local support? Well, wonder no more! The Splunk Authorized Learning Partner (ALP) program is an extension of Splunk Education – offering you access to the quality of education you've come to expect from us. ALPs offer courses that dive into Cloud, Security, and Observability for administrators, architects, and users – in your language, time zone, and location.
“Parles-tu Français?”
Needa Know About Local Training | ALPs Across the Globe

Places You’ll Wanna Go

To Splunk University | Experience it with Tan Jia Le
Did you know that Splunk University is brought to you each year by Splunk Education – and happens the weekend before our annual user conference? Well, if you’ve ever been curious and wondered what all the buzz is about, then have we got a story for you! Find out how a college student – who is also a Splunk user – won a conference pass to Splunk University and is taking his learning to a whole new level. Read this and more on the Splunk Community blog.
Wanna Go to Splunk University | Experience it with Jia Le

To the Badge Store | Follow the Path
Let the experts guide you in your quest to acquire your next Splunk Certification badge with curated learning paths. A hot ticket these days is the Splunk Core Certified User Learning Path and Exam Prep. With this path, you can build a strong foundation of basic Splunk platform skills – things like searching, using fields and lookups, and creating alerts and basic statistical reports and dashboards. This entry-level certification exam prep path is a great place to start if you support Splunk Enterprise or Splunk Cloud platforms. You can find the learning path in the STEP learning platform…just a click and login away.
Wanna Get a Badge | Get Exam Ready

Find Your Way | Learning Bits and Breadcrumbs
Go on a Quest | Resilience Can Be Learned
Go to STEP | Get Upskilled
Go Watch Tech Talks | Deep-Dives for Technical Practitioners
Go Discuss Stuff | Join the Community
Go Social | LinkedIn for News
Go Share | Subscribe to the Newsletter

Thanks for sharing a few minutes of your day with us – whether you’re looking to grow your mind, career, or spirit, you can bet your sweet SaaS, we got you. If you think of anything else we may have missed, please reach out to us at indexEducation@splunk.com.

Answer to Index This: Dimensions
Hi everyone,

I've seen a few posts on here and elsewhere that seem to detail the same issue I'm having, but none of the solutions do the trick for me. Any help is appreciated.

The goal is to flag users whose search engine queries (field name searched_for) contain words stored in a lookup table. Because those words could occur anywhere in the search query, wildcard matching is needed.

I have a lookup table called keywords.csv. It contains two columns:

keyword,classification
splunk,test classification

The first use of the lookup works as it should, showing only events with a keyword match anywhere in searched_for:

| search [| inputlookup keywords.csv | eval searched_for="*".keyword."*" | fields searched_for | format]

The next step is to enrich the remaining events with the classification, and then filter out all events without a classification, as such:

| lookup keywords.csv keyword AS searched_for OUTPUT classification
| search classification=*

The problem is that the above SPL only enriches events in which the keyword exactly matches searched_for. If I search in Google for "splunk", the events are enriched; if I search for "word splunk word", the event is not enriched. Is there a way around this without using | lookup? Or am I doing something wrong here? I'm out of ideas. I've tried:

- Prepending and appending * to the keyword in the lookup table (*splunk*)
- Adding a lookup definition with match type WILDCARD(searched_for)
- Thinking maybe the issue was due to searched_for being an evaluated field, so I changed the match type and SPL to the field "url", which comes straight from the logs and contains the search query string. Still no enrichment.
- Deleting and re-creating the lookup, definition, and match type.
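For what it's worth, wildcard lookups normally need the wildcards stored in the lookup table itself, with the match type declared on the lookup's own field rather than the event field. A sketch, assuming a lookup definition named keywords_wildcard (hypothetical name) built over keywords.csv with "Match type" set to WILDCARD(keyword), and the CSV holding the wildcarded values:

keyword,classification
*splunk*,test classification

| lookup keywords_wildcard keyword AS searched_for OUTPUT classification
| search classification=*

The key difference from what was tried above: WILDCARD() names the lookup-table field (keyword), because that is the side that carries the * characters.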
How do I automatically update event timestamps in an email template to a specific local time zone, accounting for daylight saving time? Automatically update event timestamps of the events in an email template to align with a specific time zone's daylight saving time changes.

In this article...
SaaS controllers and their time zones
Why the need for an automatic script?
Considerations and limitations
How does the code snippet work?
How do I use the time change code snippet?
Code snippet

SaaS controllers and their time zones
SaaS customers rely on the time zone configured in the Controller to determine the offset between their local time zone and the one used by the Controller. Without automation, you need to manually modify the timestamp value in each template (email or HTTP) and keep it updated according to whether daylight saving time is active or not.

Why the need for an automatic script?
Using an automatic script rather than relying on a manual process to update the timestamp value eliminates potential update gaps around the twice-yearly daylight saving time changes. Otherwise, customers need to manually update the timestamp value whenever daylight saving time starts or ends, which is error-prone and has to be done twice a year.

Considerations and limitations
While this solution implements automated updates for US daylight saving time, you can use it as a template for daylight saving schedules in any other country.
In the US:
Daylight saving time begins on the second Sunday of March at 2:00 a.m.
Daylight saving time ends on the first Sunday of November at 2:00 a.m.
The code currently works for full-hour offsets, but it can be updated to work with partial-hour offsets.
This implementation assumes that the SaaS controller is using the UTC time zone.

How does the code snippet work?
The code subtracts the daylight saving (summer) offset by default, and one additional hour while standard time is in effect.
The additional standard-time hour is applied throughout December, January, and February.
During November, the first step is to find the date of the first Sunday of the month. If the current day is after the first Sunday, or is the first Sunday at or after 2:00 a.m., the additional hour is applied.
During March, the first step is to find the second Sunday of the month. If the current day is before the second Sunday, or is the second Sunday before 2:00 a.m., the additional hour is applied.
Then, the time zone acronym is changed from UTC to the local acronym.

How do I use the time change code snippet?
Define a Custom Templating Variable named localTZ, which will store the customer's local time zone acronym (for example: ET for Eastern Time).
Define a Custom Templating Variable named offset, which stores the time difference between the SaaS controller's time zone and the customer's time zone while daylight saving time is active (for example, 4 for Eastern Time against a UTC controller); the code subtracts one additional hour during standard time.
In the template, use the code defined in the section: Code snippet
NOTE | The code uses $latestEvent.eventTime as the initial time value.
NOTE | The indexes of the months and weekdays are zero-based, so January is index 0 and December is index 11, and the week starts on Sunday with 0 and ends on Saturday with 6.
Example of the variable declaration in the Custom Templating Variables

Code snippet
NOTE | If required, delete all comments.
## Variable declaration
#set($date = $latestEvent.eventTime)
#set($month = $date.getMonth())
#set($day = $date.getDate())
#set($weekDay = $date.getDay())

## By default, the daylight saving (summer) offset is applied to the hours variable
#set($newHours = ${date.getHours()} - $offset)
#set($daylightSavingOffset = $offset + 1)

## In December, January, and February (month indexes 11, 0, and 1), the extra standard-time hour is applied
#if($month == 11 || $month == 0 || $month == 1)
     #set($newHours = ${date.getHours()} - $daylightSavingOffset)
#end

## In November (month index 10), find the first Sunday of the month to verify whether the extra hour needs to be applied
#if($month == 10)
     #set($tempDate = $latestEvent.eventTime)
     #set($firstSunday = $weekDay)

     ## Loop through the first days of the month to find the first Sunday
     #foreach($tempDay in [1..8])
          $tempDate.setDate($tempDay)
          #set($tempWeekDay = $tempDate.getDay())

          ## If the day being checked is a Sunday, store it and break the loop
          #if($tempWeekDay == 0)
               #set($firstSunday = $tempDay)
               #break
          #end
     #end

     ## If the current day is after the first Sunday of November, the extra hour is applied
     #if($day > $firstSunday)
          #set($newHours = ${date.getHours()} - $daylightSavingOffset)
     ## If the current day is the first Sunday of November, the extra hour is only applied once the local time reaches 2:00 a.m.
     #elseif($day == $firstSunday && $newHours >= 2)
          #set($newHours = ${date.getHours()} - $daylightSavingOffset)
     #end
#end

## In March (month index 2), find the second Sunday of the month to verify whether the extra hour needs to be applied
#if($month == 2)
     #set($tempDate = $latestEvent.eventTime)
     #set($secondSunday = $weekDay)

     ## Variable used to count the Sundays seen so far
     #set($counter = 0)

     ## Loop through the first 15 days of the month to find the second Sunday
     #foreach($tempDay in [1..15])
          $tempDate.setDate($tempDay)
          #set($tempWeekDay = $tempDate.getDay())

          ## If the day being checked is a Sunday, increment the counter
          #if($tempWeekDay == 0)
               #set($counter = $counter + 1)

               ## When two Sundays have been seen, the second Sunday has been found: store the day and stop the loop
               #if($counter == 2)
                    #set($secondSunday = $tempDay)
                    #break
               #end
          #end
     #end

     ## If the current day is before the second Sunday of March, the extra hour is applied
     #if($day < $secondSunday)
          #set($newHours = ${date.getHours()} - $daylightSavingOffset)
     ## If the current day is the second Sunday of March, the extra hour is only applied while the local time is before 2:00 a.m.
     #elseif($day == $secondSunday && $newHours < 2)
          #set($newHours = ${date.getHours()} - $daylightSavingOffset)
     #end
#end

## Generate the new date using the adjusted hours
$date.setHours(${newHours})
#set($dateString = ${date.toString()})

## Change the time zone acronym
#set($replacedDateString = ${dateString.replace('UTC', $localTZ)})
Hello, I need help filtering fields out of an event, to reduce the size of the log before indexing it in Splunk. I was reviewing the documentation, and using ingest actions it is possible to exclude events based on regular expressions; however, I do not need to exclude whole events, only specific fields.
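One common way to drop individual fields before indexing is a SEDCMD in props.conf, applied at parse time on the indexer or heavy forwarder, which rewrites _raw. A minimal sketch, where the sourcetype name and the debug_payload field are hypothetical placeholders:

# props.conf -- sourcetype and field name are placeholders
[my:json:events]
# Remove the "debug_payload" key and its value from _raw before indexing
SEDCMD-drop_debug_payload = s/"debug_payload":\s*"[^"]*",?//g

An INGEST_EVAL transform in transforms.conf is the more flexible alternative if the rewrite needs real logic rather than a single substitution.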
How do I extract the node name from the different GC source locations? I have the three sample source locations below, and I am looking for a rex that can extract the node names "node02", "node03", and "web39". My rex command is not working.

source=E:\total\int\ts1\Ddoss\node\node02\data\gc.log
source=E:\total\int\ts1\Ddoss\swxx\node03\data\gc.log
source=E:\total\int\ts1\Ddoss\web\web39\data\gc.log
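Since the node name is always the path segment immediately before \data\gc.log, one sketch (backslashes in rex are doubled once for the search-string layer and once for the regex, hence the quadruple backslashes):

| rex field=source "\\\\(?<node>[^\\\\]+)\\\\data\\\\gc\.log"

An escaping-free alternative is | eval node=mvindex(split(source, "\\"), -3), which takes the third path segment from the end.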
Hi, I have an issue with our HEC service in our Splunk standalone installation (9.0.6). It simply does not complete the TCP connection, for some unknown reason. The local firewall is off. Ping works, but TCP does not complete the connection.

Everything else works normally. I can connect to Splunk and search data, and universal forwarders report as usual (no deployment errors)... only HEC does not work as it should.

HEC global settings

In Wireshark, the TCP retransmissions can be seen, but I can't find the root cause.

Any idea what could be happening? Many thanks.
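Two quick checks that may help isolate this, sketched with placeholder host, port, and token (8088 is only the default HEC port):

# On the Splunk host: is anything listening on the HEC port?
netstat -an | grep 8088

# From a client: send a test event to the standard HEC endpoint
curl -k "https://splunk.example.com:8088/services/collector/event" \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"event": "HEC connectivity test"}'

If nothing is listening, the HEC input (or its SSL settings) is the problem rather than the network; if the port answers locally but remote connections show retransmissions, something between the hosts is dropping the traffic.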
Hello, FYI: we had "The TCP output processor has paused the data flow" messages, with extreme indexer slowness, after OS and kernel updates on Linux Red Hat 8.8.
I need help capturing variables in the ModSecurity log. I can't write regular expressions well; is there an add-on that can make it easier?
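If the events are the usual ModSecurity bracketed key/value pairs (for example [id "950001"] [msg "..."]), a hedged rex sketch that pulls all of them into parallel multivalue fields may be a starting point; adjust the pattern to your actual log format:

| rex max_match=0 field=_raw "\[(?<ms_key>\w+)\s+\"(?<ms_value>[^\"]*)\"\]"

The ms_key and ms_value field names are mine; the pairs can be recombined afterwards with mvzip if needed.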
In my search results, I am getting IP and user details. I want to filter my search results if the same IP has been used by any user "*@xyz.com" in the last 30 days.
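Assuming "filter" means exclude, and with placeholder index and field names (ip, user), a subsearch sketch: the inner search collects the IPs used by *@xyz.com over the last 30 days, and NOT drops them from the outer results. Remove the NOT to keep only those IPs instead.

index=my_index
| search NOT [ search index=my_index user="*@xyz.com" earliest=-30d@d | dedup ip | fields ip ]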
The large logs look like the examples below. They are not regular JSON data, so I need to use rex to get fields. A logs have name and uid; B and C logs have uid and oid. The dashboard accepts a name input, which allows multiple names separated by commas. I then use the names to find the uids, pull the related uid and oid data from the B logs, and exclude matches from the C logs. So, I don't know how to:

1. In a search statement, substitute the value of the users input as a keyword.
2. Combine the field values with commas for use in a search ... in (...) clause.

Thanks.

-- For example:

A logs:
... x1 ...uid=123...
... y2 ...uid=456...
... z3 ...uid=789...

B logs:
.... oid=989 ...uid=123 ...
.... oid=566 ...uid=456 ...
.... oid=486 ...uid=789 ...

C logs:
...cancel_order... oid=989 ...uid=123 ...
...cancel_order... oid=566 ...uid=456 ...
...cancel_order... oid=486 ...uid=789 ...

The dashboard has a text input box, "users", and the user can enter multiple names separated by commas, so the value of users will be something like "x1,z3". I want to put that value into a search statement such as:

| makeresults
| eval users="x1,z3"
| eval names=replace(users, ",", " OR ")    => expected result: x1 OR z3

| search source="alog" $names$     => substitute the names value into the keyword search
| rex "name=(?<name>\S+)"
| rex "uid=(?<uid>\d+)"
| table name,uid
| join type=left max=0 uid
    [ search source="blog"
    | rex "uid=(?<uid>\d+)"
    | rex "oid=(?<oid>\d+)"
    | search uid in (uids)    => uids combines the uid values with commas, e.g. (123,456,789)
    | table uid,oid ]
| join type=left max=0 oid
    [ search source="clog" cancel_order
    | rex "uid=(?<uid>\d+)"
    | rex "oid=(?<oid>\d+)"
    | search uid in (uids)    => uids combines the uid values with commas, e.g. (123,456,789)
    | table uid,oid,status ]
| where isnull(status)
| stats count(oid) by name
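A sketch of one way to handle both points, assuming the text input sets a token named $users$. For the keyword substitution, a subsearch that returns a field literally named search is inserted verbatim into the outer search, which matches the replace(users, ",", " OR ") idea:

source="alog" [| makeresults
| eval search=replace("$users$", ",", " OR ")
| fields search]
| rex "name=(?<name>\S+)"
| rex "uid=(?<uid>\d+)"

For the uid list, there is usually no need to build a comma-separated string by hand: a subsearch ending in | fields uid | format expands to ( ( uid=123 ) OR ( uid=789 ) ) automatically, so it can replace the search uid in (uids) steps inside the joins.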
"The new Office 365 message trace logs have a delay throttle of 24 hours. I believe I understand the reasons behind this decision. Real-time information is important for SOC (Security Operations Cent... See more...
"The new Office 365 message trace logs have a delay throttle of 24 hours. I believe I understand the reasons behind this decision. Real-time information is important for SOC (Security Operations Center), and having a 24-hour gap in real-time data is a critical issue. One potential solution is to implement two Office 365 add-ons: one configured with the recommended settings and the other with the minimum possible delay time. Does this proposal make sense to anyone, and are there any associated risks?" Thank you for the help. 
Hi, I have created a classic dashboard based on a saved search, because my saved search is used as an asset management search which contains a lot of fields. For now I need to create 3 text box inputs and 1 drilldown. Below is the search that I use to match my tokens with the search:

| savedsearch "test 1"
| search hostname=$hostname$, ip=$ip$, ID=$id$, location=$location$

However, the search above doesn't work with the input fields. Also, I might need to add more input fields in the future. Please assist me with this. Thank you.
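Two likely culprits, sketched below: the search command expects space-separated terms (the commas can end up as part of the literal values), and an input with no default leaves its token unset, which breaks the whole search. Assuming each text input is given a default value of * so an empty box matches everything:

| savedsearch "test 1"
| search hostname="$hostname$" ip="$ip$" ID="$id$" location="$location$"

Quoting the tokens keeps values with spaces intact, and the same pattern extends to any inputs added later.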
We are currently ingesting ServiceNow logs through the Splunk Add-on for ServiceNow TA. However, the logs aren't being parsed properly, as they are in a raw log format, which makes it increasingly difficult to build any kind of dashboard, etc. Does anyone have any knowledge of or experience in changing ServiceNow logs from a raw format to a structured format? Any help would be greatly appreciated.
Hello Team, I have 2 drilldown input filters: filter1 and filter2. Filter2 is based on the token from filter1. When I click submit, I should pass the tokens from filter1 and filter2 to a new dashboard.
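In classic Simple XML, the usual way is a drilldown link that carries both tokens as form.* URL parameters into the target dashboard. A minimal sketch; the app name, dashboard ID, and token names are placeholders for your own:

<drilldown>
  <link target="_blank">/app/search/target_dashboard?form.filter1=$form.filter1$&amp;form.filter2=$form.filter2$</link>
</drilldown>

The target dashboard needs inputs whose tokens are named filter1 and filter2 so the form.* parameters pre-populate them.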
My server runs Windows Server 2016 and Splunk 7. Now I want to upgrade it to Splunk 9 and Windows Server 2019. What should the upgrade flow be so that I don't lose any of my old Splunk 7 data?
Dear all, I'm trying to integrate the WSO2 API Manager, but I cannot see any BTs. Agent status is up. I followed this link: https://medium.com/@raj10x/monitor-wso2-products-using-appdynamics-8faf72e83a7 for custom POJO rules, but it didn't work. Has anyone succeeded with this platform?
I would like to build the Splunk Attack Range and perform a series of attacks on my Splunk server using AWS. Do I need to create an image of my server to do that? Is that even possible? How can I test my existing infrastructure using this tool, instead of using the Splunk server that the tool creates automatically? I have already read these docs:

https://attack-range.readthedocs.io/en/latest/Attack_Range_AWS.html
https://github.com/splunk/attack_range
https://www.splunk.com/en_us/blog/security/attack-range-v3-0.html