All Topics


Hi, we are looking for migration guidance from Exabeam to Splunk. Is there a way to migrate data from the Exabeam Data Lake to Splunk? Also, is there any documentation or guidance available for Exabeam customers migrating to Splunk? Please let me know. Thanks, Guru
Good morning. Does anyone currently use Splunk, or an app in Splunk, to monitor folder size? We have been asked to set up new fileshare folders for various teams, and since our storage resources are nearly exhausted, we'd like to monitor each user's folder size. The ideal scenario would be a size threshold on each folder: when a folder nears capacity, an alert would trigger and the IT team would take action. Kind regards, Paula
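A minimal SPL sketch of such an alert, assuming folder sizes are already being collected into Splunk by some input (the index, sourcetype, field names, and threshold lookup below are all hypothetical):

```spl
index=folder_metrics sourcetype=folder_size
| stats latest(size_mb) as size_mb by folder_path
| lookup folder_thresholds folder_path OUTPUT threshold_mb
| where size_mb > 0.9 * threshold_mb
```

Saved as a scheduled alert, this would fire whenever any folder exceeds 90% of its configured threshold. The size data itself would typically come from a scripted input or a monitoring add-on on the fileshare host.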
Referring to a previous question (Solved: How to insert hyperlink to the values of a column ... - Splunk Community): how can I add two different URLs for two different columns in the table, such that the respective hyperlink opens only when the value in that column is clicked?

    "eventHandlers": [
        {
            "type": "drilldown.customUrl",
            "options": {
                "url": "$row.firstLink.value$",
                "newTab": true
            }
        },
        {
            "type": "drilldown.customUrl",
            "options": {
                "url": "$row.secondLink.value$",
                "newTab": true
            }
        }
    ]
Hello, I have a dashboard with many panels, and in each panel I am using the geostats command to show the results of that panel's search on a world map. I want to add a zoom feature. Let me explain: say I am on panel 1 and I have zoomed in on America to see which areas the results are coming from. What I want is that if I switch to a different panel, it should also be zoomed in on America. Is that possible?
Hi all, I want to filter out null values. In my data, the field ImpCon has null values, and I want to exclude those from the table. I am trying the query below, but it still shows the null values.

| eval ImpCon=mvmap(ImpConReqID, if(match(ImpConReqID, ".+"), "ImpConReqID: ".ImpConReqID, null()))
| eval orcaleid=mvfilter(isnotnull(oracle))
| eval OracleResponse=mvjoin(orcaleid, " ")
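One way to drop the null rows, sketched under the assumption that ImpCon is the field being displayed in the table (field names are taken from the question):

```spl
| eval ImpCon=mvmap(ImpConReqID, if(match(ImpConReqID, ".+"), "ImpConReqID: ".ImpConReqID, null()))
| where isnotnull(ImpCon) AND ImpCon!=""
```

mvmap returns null when every element maps to null, so a `where isnotnull(...)` placed after the eval removes those rows before they reach the table.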
Specifically speaking, the dataSources section discussed here: https://docs.splunk.com/Documentation/Splunk/9.2.1/DashStudio/dashDef#The_dataSources_section

Hypothetically, I have two tables, each stored in an individual data source stanza:

Table 1 = ds.search stanza 1
Table 2 = ds.search stanza 2

The goal is to append the tables together, and then use the "stats join" method to merge them. Ideally this merge could be done as a ds.chain type stanza with two extend options, but that does not appear to be allowed. Here's the documentation for data source options: https://docs.splunk.com/Documentation/Splunk/9.2.1/DashStudio/dsOpt

The document seems to be missing options like "extend", so I'm hoping someone knows whether there are additional hidden options. I am trying to avoid [] subsearches because of the 50,000-row limit, so the following append command is not desired:

<base search> | append [search ....]

Does anyone with mastery of JSON hacks know whether appending two data source stanzas together is possible? Thank you.
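If the underlying goal is to combine two searches without the subsearch row cap, one possible workaround inside a single data source is multisearch, whose branches are streamed rather than limited like append subsearches (the index names and join key below are placeholders):

```spl
| multisearch
    [ search index=idx_a sourcetype=st_a | eval src="a" ]
    [ search index=idx_b sourcetype=st_b | eval src="b" ]
| stats values(*) as * by join_key
```

Note that multisearch requires each branch to be purely streaming (eval qualifies); non-streaming commands such as stats must come after the merge.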
Hi, I have installed Splunk on my Ubuntu desktop. I logged in once; however, on the second login attempt it said it was unable to connect.
Dive into the deep end of data by earning a Splunk Certification at .conf24. We're enticing you again this year with an onsite testing center and budget-friendly exam prices. Elevate your Splunk savvy and turbocharge your salary with a Splunk industry-recognized certification — from platform to security and observability.

Massive Markdown on Exam Fees
Exams onsite at .conf24 are just $25 USD – a nice cool 80% off the usual rate. But you've gotta be at .conf24 in Vegas to grab this hot deal.

Exclusive Onsite Beta Exam Excitement
If you haven't already heard, we're dropping a brand-new FREE beta Splunk certification exam at .conf24. Jetting to Vegas for Splunk .conf24 or Splunk University? This is your golden ticket to be among the first to earn the fresh Splunk Certified Cybersecurity Defense Engineer (CDE) certification. If you're wavering on whether to register, perhaps snagging this elite cert will be the nudge you need.

Beta Certification Deets and Prerequisite
The CDE will allow you to show off your prowess in wielding security tools via Splunk Enterprise and Enterprise Security. But, just a heads up: you must hold the Splunk Certified Cybersecurity Defense Analyst (CDA) certification in order to sit for the CDE exam.

CDA Prerequisite Is Now on the House
We're so pumped about the new CDE certification that we're giving the CDA certification – your gateway to CDE – away for FREE, but only onsite at .conf24. Don't let a little prerequisite hassle keep you from our latest and greatest Splunk certification.

Certification Testing Center Timetable
Toscana 3608 -- The Venetian
- Tuesday: Noon to 5:00 PM
- Wednesday: 10:30 AM to 6:00 PM
- Thursday: 8:00 AM to 6:00 PM
- Friday: 8:00 AM to Noon

Already on the .conf24 list? Lock in your exam slot ASAP to snag the time that suits you best! Your .conf24 discount will kick in at checkout. First-time Splunk cert challenger? Get started by creating your Splunk.com account.
Then link up your accounts using this registration form (needs a splunk.com login via STEP).
Hello. I am interested in data that occurs Tuesday night from 8 PM until 6 AM. The caveat is that I need two separate time periods to compare: one runs from the 2nd Tuesday of each month until the 3rd Thursday; the other is any other day in the month. So far I have:

| eval day_of_week=strftime(_time, "%A")
| eval week_of_month=strftime(_time, "%U")
| eval day_of_month=strftime(_time, "%d")
| eval start_target_period=if(day_of_week=="Tuesday" AND week_of_month>1 AND week_of_month<4, "true", "false")
| eval end_target_period=if(day_of_week=="Thursday" AND week_of_month>2 AND week_of_month<4, "true", "false")
| eval hour=strftime(_time, "%H")
| eval time_bucket=case(
    (start_target_period="true" AND hour>="20") OR (end_target_period="true" AND hour<="06"), "Target Period",
    (hour>="20" OR hour<="06"), "Other Period")

My issue is that my "week of month" field is reflecting the week of the year. Any help would be greatly appreciated.

EDIT: I placed this in the wrong location, all apologies.
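strftime's %U is indeed week-of-year, and there is no week-of-month format specifier. One common workaround is to derive the occurrence of the weekday within the month from the day of the month (a sketch, reusing the field names from the question):

```spl
| eval day_of_month=tonumber(strftime(_time, "%d"))
| eval week_of_month=ceiling(day_of_month / 7)
```

With this definition, week_of_month counts the nth occurrence of that weekday in the month: a Tuesday with week_of_month=2 falls on days 8-14, i.e. the 2nd Tuesday, which matches the comparisons in the search above.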
Hello, we recently replaced rsyslog with syslog-ng on our syslog server. We are collecting network device logs - each source is logged to its own <IPaddress>.log file, and a universal forwarder pushes them to the indexer. Inputs and outputs are OK, the data is flowing, and the sourcetype is standard syslog. Everything is working as expected... except for some sources.

I spotted this because the log volume has dropped since the migration. For those sources, I do not have all of the events in Splunk. I can see the file on the syslog server; let's say there are 5 events per minute. The events are the same - for example, "XY port is down" - but not identical: the timestamp in the header and the timestamp in the event's message are different (the events are still the same length). So in the log file there are 5 events/min, but in Splunk I can see only one event per 5 minutes. The rest are missing... Splunk randomly picks ~10% of the events from the log file (all the extractions are OK for those; there are no special characters or anything unusual in the "dropped" events).

I feel it is because of similar events - Splunk thinks they are duplicates - but on the other hand that cannot be, because they are different. Any advice? Should I try adding some CRC salt, or try changing the sourcetype? BR, Norbert
I am new to Splunk Mission Control and have been assigned to demo the Splunk Cloud platform with the following features:

Incident Management: simplifies the detection, prioritization, and response process.
Investigative Capabilities: integrates diverse data sources for thorough investigations.
Automated Workflows: reduces repetitive tasks through automation.
Collaboration Tools: facilitates communication and information sharing within the SOC team.

Details: provide examples of automated workflows specific to common SOC scenarios. Can somebody provide me with links to how-to videos and documentation to set up my demo? Thank you.
The .NET agent status is at 100% after deleting the .NET and Machine agents. All the servers were rebooted and checked for AppDynamics-related services and folders; they were all removed. Could this be related to old data still reflecting on the AppDynamics controller?
I use AppDynamics to send a daily report on slow or failed transactions. While the email digest report is helpful, is there a way to include more detailed information about the data collectors (name and value) in it? Is this something done using custom email templates?
index=abcd "API : access : * : process : Payload:"
| rex "\[INFO \] \[.+\] \[(?<ID>.+)\] \:"
| rex " access : (?<Event>.+) : process"
| stats count as Total by Event
| join type=inner ID
    [| search index=abcd "API" AND ("Couldn't save")
     | rex "\[ERROR\] \[API\] \[(?<ID>.+)\] \:"
     | dedup ID
     | stats count as Failed ]
| eval Success=Total-Failed
| stats values(Total), values(Success), values(Failed) by Event

Event    values(Total)   values(Success)   values(Failed)
Event1   76303           76280             23
Event2   4491            4468              23
Event3   27140           27117             23
Event4   118305          118282            23
Event5   318810          318787            23
Event6   9501            9478              23

I am trying to join two different searches (the index is common) on the ID field and then group the results by the Event field, but the Failed column shows the same value for every event.
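One likely cause: the outer `stats count as Total by Event` drops the ID field before the join, and the subsearch's final `stats count as Failed` collapses everything into a single row, so the one Failed total fans out to every Event. A subsearch-free sketch that keeps the per-Event breakdown (the rex patterns are adapted from the question; the combined log layout is an assumption):

```spl
index=abcd ("API : access : * : process : Payload:" OR "Couldn't save")
| rex "\[(?:INFO |ERROR)\] \[[^\]]+\] \[(?<ID>[^\]]+)\] \:"
| rex " access : (?<Event>.+) : process"
| eval is_error=if(searchmatch("Couldn't save"), 1, 0)
| stats values(Event) as Event, max(is_error) as is_error by ID
| stats count as Total, sum(is_error) as Failed by Event
| eval Success=Total-Failed
```

The first stats rolls each transaction up to one row carrying both its Event (from the payload log) and its error flag (from the error log), so the second stats can count failures per Event instead of globally.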
Register here. This thread is for the Office Hours session on Splunk IT Service Intelligence (ITSI) on Wed, July 31, 2024 at 1pm PT / 4pm ET. This is your opportunity to ask questions related to Splunk ITSI, including:

Analyzing IT service health
Reducing alert noise
Isolating and prioritizing actionable events
Building executive dashboards to visualize the health of the business
Anything else you'd like to learn!

Please submit your questions at registration. You can also head to the #office-hours user Slack channel to ask questions (request access here). Pre-submitted questions will be prioritized. After that, we will open the floor up to live Q&A with meeting participants. Look forward to connecting!
Is there a playbook for this kind of thing? A "user password policy enforcement" playbook.
Hello. I'm using the trial and following the instructions for sending to APM with a manually instrumented Python app, as seen below:

apiVersion: apps/v1
kind: Deployment
spec:
  selector:
    matchLabels:
      app: your-application
  template:
    spec:
      containers:
        - name: myapp
          env:
            - name: SPLUNK_OTEL_AGENT
              valueFrom:
                fieldRef:
                  fieldPath: status.hostIP
            - name: OTEL_EXPORTER_OTLP_ENDPOINT
              value: "http://$(SPLUNK_OTEL_AGENT):4317"
            - name: OTEL_SERVICE_NAME
              value: "blah"
            - name: OTEL_RESOURCE_ATTRIBUTES
              value: "service.version=1"

If I'm using the Splunk distribution of the OTel Collector, how can I get the DNS name for `OTEL_EXPORTER_OTLP_ENDPOINT` without having to use `status.hostIP`?
Hello, can the below Windows event log channel be ingested into Splunk, and is it covered by any add-on? Microsoft\Windows\Privacy-Auditing\Operational EventLog. Thanks.
Team, I have 3 logs. I need to fetch Transaction_id, Event, and Total_Count from LOG1. After that, I need to join the 3 logs to get successes and failures: a successful transaction will have only LOG2, while a failed transaction will have both LOG2 and LOG3. Finally, I need the data in a timechart (span=1h) with columns: _time, Event, Total_Count, Successful, Error.

LOG1 = 2024-05-29 12:35:49.288 [INFO ] [Transaction_id] : servicename : access : Event : process : Payload:
LOG2 = 2024-05-29 12:11:09.226 [INFO ] [Transaction_id] : application_name : report : servicename (Async) : DB save for SubscribersSettingsAudit record completed in responseTime=2 ms
LOG3 = 2024-05-24 11:25:36.307 [ERROR] [Transaction_id] : application_name : regular : servicename (Async) : Couldn't save the SubscribersSettings record in DB
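One possible shape for this, assuming all three log types live in one index and the fields can be extracted as in the samples (the index name and rex patterns are placeholders based on the sample lines):

```spl
index=app_logs ("Payload:" OR "DB save for" OR "Couldn't save")
| rex "\[(?:INFO |ERROR)\] \[(?<Transaction_id>[^\]]+)\]"
| rex " access : (?<Event>[^:]+) : process"
| eval is_error=if(searchmatch("Couldn't save"), 1, 0)
| stats values(Event) as Event, max(is_error) as is_error, min(_time) as _time by Transaction_id
| eval status=if(is_error=1, "Error", "Successful")
| timechart span=1h count as Total_Count, count(eval(status="Successful")) as Successful, count(eval(status="Error")) as Error
```

The stats collapses each transaction to one row carrying its Event (from LOG1) and error flag (from LOG3). Note that timechart only supports a split-by field with a single aggregation, so splitting the counts by Event as well would need a different layout (e.g. `timechart ... count by Event` per status, or a chart over bin+stats).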
Hi, I was wondering how to correlate data across different sources. For example:

Source A contains: User ID = 123
Source B contains: User ID = 123, User email = user@user

I want to find the user related to User ID 123 (which comes up in my search) by getting the user email from Source B. My search runs against Source A, since there are some fields I need from there.
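A common pattern for this is to search both sources in one query and merge events on the shared key with stats, rather than using join (the source and field names below are placeholders for the real ones):

```spl
(source="sourceA") OR (source="sourceB")
| stats values(User_email) as User_email, values(field_from_A) as field_from_A by User_ID
| search User_ID=123
```

Each row then combines the email contributed by Source B events with the fields contributed by Source A events for the same User_ID. If Source B is small and fairly static, an alternative is to export it as a lookup table and enrich the Source A search with the lookup command.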