All Topics


I have a dashboard with a statistics table and I want to add color to the font alone in the statistics table. There is no condition to be given; I have to give color to the font for all rows in the table. How do I do it?
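Assuming a Simple XML dashboard, one common unconditional approach is a CSS override carried in a hidden HTML panel; the table id (statsTable), the token name, the hex color, and the placeholder search here are all assumptions for illustration, not the asker's actual dashboard:

```xml
<dashboard>
  <row>
    <panel>
      <!-- hidden HTML panel that only carries the CSS override -->
      <html depends="$alwaysHideCSS$">
        <style>
          /* color the font of every cell in the table below, all rows */
          #statsTable table tbody td {
            color: #d93f3c !important;
          }
        </style>
      </html>
      <table id="statsTable">
        <search>
          <query>index=_internal | stats count by sourcetype</query>
        </search>
      </table>
    </panel>
  </row>
</dashboard>
```

Because the rule targets the table id rather than a field value, it applies to every row with no condition needed.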
We are trying to find out why a server error appears on the search head even though we don't see any errors in the logs and found no high CPU usage. We are running v7.3.5.
I do not see any option to close/cancel a Splunk learning course once access has expired. It always shows as in progress with the comment "Your access to this course has expired. Access ended December XX" and goes no further; there are no options to cancel or close!
Hi all, I am trying to integrate MS SQL audit log data with a UF instead of DB Connect. What is the best and recommended way to do it so that all fields are mapped? At the moment it is integrated with the UF using the "Splunk Add-on for Microsoft SQL Server". With that, the MS SQL events can be identified by SourceName=MSSQLSERVER or SourceName=MSSQL*. However, it does not work properly, as most of the fields are not extracted and mapped. For example, the user is also not translated: User=NOT_TRANSLATED.
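If the SQL Server audit events land in the Windows Application event log, a UF-side input along these lines can scope collection to the MSSQL source. This is a sketch: the index name, sourcetype, and the exact whitelist regex are assumptions to adapt to your environment:

```ini
# inputs.conf on the UF (sketch; assumes audit events go to the
# Windows Application event log)
[WinEventLog://Application]
disabled = 0
# keep only events whose SourceName starts with MSSQL
whitelist = SourceName=%^MSSQL%
index = mssql
sourcetype = mssql:audit
```

Note that the add-on's field extractions are tied to the sourcetypes it documents, so a sourcetype name the add-on does not recognize is a common reason fields stay unextracted.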
Hi Splunkers, I'm struggling with setting up an appropriate line breaker for data from a log file. The example is below. I tried using the event-breaking policy set to "every line", but it doesn't work well, as the last line consists of 3 events. I would like to break lines based on [abcdef.abcs][info][gc], but I'm not entirely sure whether it's possible. Could you please take a look?

[883722.688s][info][gc] GC(40135) Pause Init Mark (process weakrefs) 1653.109ms
[883734.774s][info][gc] GC(40135) Concurrent marking (process weakrefs) 12086.056ms
[883736.181s][info][gc] GC(40135) Concurrent precleaning 1406.445ms
[883738.907s][info][gc] GC(40135) Pause Final Mark (process weakrefs) 2724.588ms
[883738.908s][info][gc] GC(40135) Concurrent cleanup 72424M->72273M(153600M) 0.229ms
[883739.217s][info][gc] GC(40135) Concurrent evacuation 308.624ms
[883739.217s][info][gc] GC(40135) Pause Init Update Refs 0.137ms
[883742.192s][info][gc] GC(40135) Concurrent update references 2975.050ms
[883742.195s][info][gc] GC(40135) Pause Final Update Refs 1.175ms
[883742.196s][info][gc] GC(40135) Concurrent cleanup 80318M->62137M(153600M) 0.204ms
[883742.197s][info][gc] Trigger: Allocated since last cycle (15943M) is larger than allocation threshold (15360M)
[883742.224s][info][gc] GC(40136) Concurrent reset 26.618ms
[883743.575s][info][gc] GC(40136) Pause Init Mark 1349.467ms
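A line breaker keyed on the [<seconds>s][info][gc] prefix is possible in props.conf on the indexer or heavy forwarder. A sketch (the sourcetype name is an assumption, and since the bracketed values are seconds since JVM start rather than wall-clock timestamps, index time is used):

```ini
# props.conf (sketch)
[gc:log]
SHOULD_LINEMERGE = false
# break before each "[<float>s][info][gc]" prefix via lookahead
LINE_BREAKER = ([\r\n]+)(?=\[\d+\.\d+s\]\[info\]\[gc\])
# the [883722.688s] values are JVM-relative, not parseable timestamps
DATETIME_CONFIG = CURRENT
```

The capture group consumes the newline(s) while the lookahead leaves the prefix attached to the next event, so each [..s][info][gc] line becomes its own event.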
Hi everyone, is there an official document for the necessary API permissions? Or does anyone know what these permissions are? Thank you.
Hi, I'm unable to install an app on my Splunk Cloud Platform evaluation! Is this a limitation? That is, can't I test/evaluate applications on a Splunk Cloud Platform evaluation? Thanks for clarifying.
Hi, my system is Linux. I am trying to monitor 3 users in an index: the last time they logged in, their IP address, etc. There are over 180 users. How do I get the search to show just the three users I want, e.g. James, Peter, and John? Thanks.
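A stats-based sketch, assuming an index named linux_auth and extracted fields user and src_ip (rename these to match your actual data):

```
index=linux_auth user IN ("James", "Peter", "John")
| stats latest(_time) AS last_login latest(src_ip) AS last_ip BY user
| eval last_login = strftime(last_login, "%Y-%m-%d %H:%M:%S")
```

The IN clause filters the search down to just the three users before stats runs, so the other 180+ never enter the results.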
Hi, I'm following this article in an attempt to ingest Teams data into Splunk and I need some help with testing the webhook - can someone confirm what the webhook URL is?

curl WEBHOOK_ADDRESS -d '{"value": "test"}'

Also, looking at the documentation for the Teams Add-on for Splunk, it states that "the Teams Webhook is not available for Splunk Cloud installations" - has anyone found an alternative solution for Cloud deployments? We use Splunk in a hybrid (cloud + on-prem) environment. Many thanks.
Hi, I'm trying to extract some JSON values into tables for a dashboard. The log line that I'm using is something like the below:

username=myUser notificationPreferences= [class NotificationPreferences { category=cat1, categoryDescription=category1 receiveEmailNotifications=false receiveSmsNotifications=false }, class NotificationPreferences { category=cat2 categoryDescription=category2 receiveEmailNotifications=false receiveSmsNotifications=true }]

As you can see, it's just a standard toString on a Java class that the developers are outputting. What I want is a table of users and categories, with each category having the associated details, e.g.:

User     | Category  | Email | SMS
myUser1  | Category1 | false | false
myUser1  | Category2 | false | true
myUser2  | Category1 | true  | true

I started by trying to tidy up the JSON:

| rex field=notificationPreferences mode=sed "s/\[class NotificationPreferences/prefs:[ /g"
| rex field=notificationPreferences mode=sed "s/, class NotificationPreferences/, /g"

Which makes the notificationPreferences field a bit better:

username=myUser notificationPreferences= prefs:[ { category=cat1, categoryDescription=category1 receiveEmailNotifications=false receiveSmsNotifications=false },{ category=cat2 categoryDescription=category2 receiveEmailNotifications=false receiveSmsNotifications=true }]

But from here I'm struggling with what I need to do in terms of spath and extractions to get both categories to work. I only ever seem to get the first category to appear in my results. Any help would be great. Thanks.
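One way to get every category rather than just the first is to skip spath and pull the repeating groups out with rex max_match=0, then zip and expand them. A sketch, assuming the whitespace in the real events roughly matches the sample above:

```
| rex field=notificationPreferences max_match=0 "category=(?<cat>[^,\s]+),?\s+categoryDescription=(?<desc>\S+)\s+receiveEmailNotifications=(?<email>\w+)\s+receiveSmsNotifications=(?<sms>\w+)"
| eval pref = mvzip(mvzip(mvzip(cat, desc), email), sms)
| mvexpand pref
| eval pref = split(pref, ",")
| eval Category = mvindex(pref, 1), Email = mvindex(pref, 2), SMS = mvindex(pref, 3)
| table username Category Email SMS
```

max_match=0 makes each captured field multivalue (one value per NotificationPreferences block), mvzip glues the parallel values together so mvexpand produces one row per category, and split/mvindex pull the pieces back apart.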
Dears, I have installed the Splunk App for Linux and its add-on in my Splunk Enterprise (paid license) instance. I have installed the Splunk forwarder on all hosts and added cpu, vmstat, and df to inputs.conf on the remote servers. Now I want to create a dashboard for live monitoring of the mentioned Linux metrics, plus alerts for them. I need help doing that; if you have any good documents, please share.
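For reference, the scripted inputs of the Splunk Add-on for Unix and Linux are typically enabled in the add-on's local/inputs.conf on each forwarder; a sketch (the os index and the intervals are assumptions to adjust):

```ini
# inputs.conf on each UF, inside the Unix/Linux add-on's local/ dir (sketch)
[script://./bin/cpu.sh]
disabled = 0
interval = 30
index = os

[script://./bin/vmstat.sh]
disabled = 0
interval = 60
index = os

[script://./bin/df.sh]
disabled = 0
interval = 300
index = os
```

With data flowing into one index per the above, dashboard panels and alerts can then be built on searches scoped to that index and the cpu/vmstat/df sourcetypes.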
Hi Team, we have a field called Status with values Start and Success, and OrderId is another field. When an OrderId has Status=Start and there is no Status=Success within 10 minutes, it should be considered a failure. May I know how to write a condition for this?
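One approach is to pair the two statuses per OrderId with stats, then flag orders where Success never arrived or arrived more than 10 minutes (600 s) after Start. A sketch; the index name is an assumption:

```
index=orders Status IN ("Start", "Success")
| stats min(eval(if(Status="Start", _time, null()))) AS start_time
        min(eval(if(Status="Success", _time, null()))) AS success_time
        BY OrderId
| eval result = if(isnull(success_time) OR (success_time - start_time) > 600, "failure", "success")
| where result="failure"
```

For alerting, schedule this over a window longer than 10 minutes (e.g. the last 30 minutes) so a Start near the edge of the window has had its full 10 minutes to succeed.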
In this release, we provide three new capabilities to help security teams detect suspicious behavior in real time, quickly discover the scope of an incident to respond accurately, and improve security workflow efficiency using embedded frameworks.

We've introduced cloud-based streaming analytics*, which integrates with Splunk's risk-based alerting (RBA) framework to deliver enhanced analytics for improved situational awareness and response time to suspicious behavior. This feature brings scalable real-time streaming analytics for a broader range of advanced security detections and focuses on addressing common use cases including insider threat, credential access and compromise, lateral movement, and living-off-the-land attacks.

*Initial availability to eligible US-East Splunk Cloud customers only

Splunk Enterprise Security 7.1 also brings a new visualization feature, Threat Topology, which provides a comprehensive view into security incidents, enabling analysts to quickly determine their scope and initiate an investigation faster.

Last but not least, with our new MITRE ATT&CK® visualization, security analysts can quickly build situational awareness around an incident in the context of the MITRE ATT&CK Matrix. Analysts can leverage and visualize MITRE ATT&CK annotations in Splunk Enterprise Security risk events and get a comprehensive picture of how an asset or identity has been impacted by various tactics and techniques.

So why wait? Upgrade today to Splunk Enterprise Security 7.1!
I have the following data that I'm trying to timechart the differences between:

2023-02-16T16:14:04: Data Processing Phase -1 completed
2023-02-16T14:01:00: Data Processing Phase -1 starting
2023-02-16T14:01:00: Data Collection Phase 3 (Final Collection Phase) completed
2023-02-16T11:34:10: Data Collection Phase 2 starting
2023-02-16T11:34:10: Data Collection Phase 1 completed
2023-02-16T11:34:10: Data Collection Phase 3 (Final Collection Phase) starting
2023-02-16T11:34:10: Data Collection Phase 2 completed
2023-02-16T09:01:36: Data Collection Phase 1 starting

I've sliced up the data using the following SPL, but that will only give me a look at the time differences over the selected timeline. I can't figure out how to slice this data up so that I'm able to timechart the differences over multiple runs of the Data Collection Phases.

| stats first(_time) as End, last(_time) as Start by Phase, PhaseIdentifier
| eval RunTime = round((End - Start) / 60, 0)
| eval Start=strftime(Start, "%c")
| eval End=strftime(End, "%c")
| rename RunTime AS "RunTime (Minutes)"

I'm used to working more with metrics and logs that spit out runtimes, so this has been vexing me for entirely too long...
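One way to measure each run rather than the whole selected time range is to pair each "starting" event with its matching "completed" event using transaction, then timechart the resulting durations. A sketch, assuming the phase name can be rexed out of the raw message and the index name is a placeholder:

```
index=app_logs ("starting" OR "completed")
| rex "(?<Phase>Data (?:Collection|Processing) Phase\s+\S+)"
| transaction Phase startswith="starting" endswith="completed"
| timechart span=1h avg(eval(duration/60)) BY Phase
```

transaction emits one event per run with a duration field, so repeated runs of the same phase each become their own data point on the timechart. For high volumes, a streamstats-based pairing avoids transaction's memory limits, but the transaction form is the simplest starting point.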
Hi, I need some help crafting a search query that could get counts by a regex and display the counts in a table.

The log msg we have is "Successfully submitted: admin-mobile" or "Successfully submitted: admin". I'd like to count the number of messages containing "admin-mobile" and "admin" respectively and show them in a table.

I know that I can get one count with `| search "Successfully submitted: admin-mobile" | stats count` and it will show in a table. The question is how to get the other count. Thanks.

The result I'd like to have is like below, in a table format:

submissionType    count
admin-mobile      999
admin             888
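A sketch that derives the type from the message with rex and counts by it in a single search (the index scoping is an assumption):

```
index=app "Successfully submitted:"
| rex "Successfully submitted: (?<submissionType>[\w-]+)"
| stats count BY submissionType
```

Because the character class includes the hyphen, "admin-mobile" is captured whole, while plain "admin" messages capture just "admin", giving both rows in one table.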
There is a bug in the job "Share" button: it only works for admins! Analysts have mentioned that within the search head, they are unable to share jobs with one another. This means that while anyone is investigating alerts, reviewing things, etc., they must share the actual SPL query, and whoever they are working with has to re-run that search. I know that the default "admin" role allows a user to view the private jobs of others, but of course we don't want to give the "admin" role to all of the analysts. I reviewed all of the Splunk "capabilities" and could not find one related to this permission. I am going to open a case, but I am looking for a quicker workaround.
Very strange scenario. I'll use a rex statement to retrieve data and it works perfectly. If I copy and paste the rex command that Splunk used (copied from the Job Inspector), it does not work; I receive an error.

An actual snippet of raw data that I've used as an example in my erex statement (the data in bold is what went into my example):

"usbProtocol":1,"deviceName":"Canon Digital Camera","vendorName":"Canon Inc.",

And the Job Inspector spat out the following:

| rex "(?i)\"deviceName\\\":\\\"(?P<Device>[^\\]+)"

And the data looked perfect, like so: Canon Digital Camera

But if I use that rex statement spat out by the Job Inspector in my search, Splunk says nay nay. The error received was "Error in 'rex' command: Encountered the following error while compiling the regex '(?i)"deviceName\":\"(?P<Device>[^\]+)': Regex: missing terminating ] for character class."

I reached out to a coworker who provided:

| rex ".*deviceName(?<Model>.*?),"

And it works to a degree, but includes characters that I'd rather not see in my data. An actual example of what is spat out:

\":\"Canon Digital Camera\"

Just also mentioning this in case it matters: where there is no data available/null within the "deviceName" raw data, it will show like this:

\":\"\"

I'd really appreciate some guidance with my regex code. I've been delving into this lately and have used many training materials, but can't seem to figure this one out!
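For what it's worth, the Job Inspector shows a quoting-escaped form that is not valid when pasted back into the search bar. A rex written directly against the raw "deviceName":"…" pair could look like this (a sketch; [^\"]* rather than + so the empty/null deviceName case still matches):

```
| rex "\"deviceName\":\"(?<Device>[^\"]*)\""
```

Inside a double-quoted SPL string, \" is a literal quote, so the regex the engine sees is "deviceName":"([^"]*)", which captures everything up to the closing quote and nothing else.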
Hi, dashboard help is needed. I am attaching pictures. Please provide SPL. I need to dynamically connect an item selected from the "Year OA Motives" drop-down to the tabs/buttons below it (SPL needed). Clicking each tab should open a table containing the corresponding data in a panel below the tabs (SPL needed). Regards, Selina.
I have a dashboard already populated with network data. I am trying to create a search dropdown or search bar so I can search for a specific IP and have the dashboard show all the data for that IP.
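In Simple XML this is usually done with a text input that sets a token which the panel searches reference; a sketch, where the token name, index, and field names (src_ip, dest_ip) are assumptions to adapt:

```xml
<fieldset submitButton="false">
  <input type="text" token="ip_tok" searchWhenChanged="true">
    <label>Search IP</label>
    <default>*</default>
  </input>
</fieldset>
<row>
  <panel>
    <table>
      <search>
        <query>index=network (src_ip="$ip_tok$" OR dest_ip="$ip_tok$")
| table _time src_ip dest_ip action</query>
      </search>
    </table>
  </panel>
</row>
```

With the default of *, the panel shows everything until an IP is typed; each panel that should filter just needs "$ip_tok$" added to its base search.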
There are a few ways to set up outputs.conf when you are configuring forwarders: you can either use a CLI command to add a forward-server (an indexer), or you can edit the outputs.conf file directly. We made our outputs.conf according to a tutorial we saw, but we are having issues getting data in. So the question is: what is broken about our outputs.conf file? (Also, as a side note, originally the .102 address wasn't in the file and neither was the default-autolb group.) Thanks.
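Without seeing the actual file it's hard to say what's broken, but a minimal working shape for auto load balancing across two indexers looks like this (the IP addresses, port 9997, and the group name are assumptions matching the hints in the question):

```ini
# outputs.conf on the forwarder (sketch)
[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = 192.168.1.101:9997, 192.168.1.102:9997
```

The CLI equivalent is `./splunk add forward-server 192.168.1.101:9997` (and again for the .102 host). Also confirm each indexer is actually listening on 9997 (Settings > Forwarding and receiving, or `./splunk enable listen 9997` on the indexer); a missing receiver is the most common reason data never arrives.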