All Topics



Kindly help me with a new SPL. I am getting results for the existing SPL below. I tried applying a new condition to the existing SPL: EventID=4662 Properties=*EncryptedDSRMPasswordHistory*. But I am getting unwanted results for EventID 4662. So I want the existing SPL result to be compared against the new condition below, and the result filtered out if the Properties value contains "msLAPS-Password".

New condition:

index=winsec_prod EventID=4662 Properties=*EncryptedDSRMPasswordHistory*

Existing SPL:

index=winsec_prod 4794 OR (4657 AND DSRMAdminLogonBehavior) OR ((4104 OR 4103) AND DsrmAdminLogonBehavior)
| search ((EventCode=4794) OR (EventCode=4657 ObjectName="*HKLM\System\CurrentControlSet\Control\Lsa\DSRMAdminLogonBehavior*") OR (EventCode IN (4104,4103) ScriptBlockText="*DsrmAdminLogonBehavior*"))
| eval username=coalesce(src_user,user,user_id), Computer=coalesce(Computer,ComputerName)
| stats values(dest) values(Object_Name) values(ScriptBlockText) by _time, index, sourcetype, EventCode, Computer, username
| rename values(*) as *
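One hedged way to sketch the exclusion (assuming Properties is an extracted field on the 4662 events and the msLAPS value appears as a substring of it):

```spl
index=winsec_prod EventID=4662 Properties="*EncryptedDSRMPasswordHistory*" NOT Properties="*msLAPS-Password*"
```

If the 4662 condition is instead appended to the existing SPL, the same NOT clause can be added to the base search before the first pipe.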
Hi, I have a main dashboard, "MFA Compliance Rate", as shown in the screenshot below. I have enabled the drilldown feature in the "MFA Compliance Rate Per Country" panel of the main dashboard. The drilldown dashboard is named "Country_Compliance" and lives in the same Splunk app. I am able to pass the country value from the main dashboard to the drilldown dashboard; the screenshot below shows the on-click config. I also want to pass a dropdown field value from the main dashboard to the drilldown dashboard. For example: I want to pass the "Business-Unit" dropdown value to the drilldown dashboard along with the "country" value after clicking a particular country bar in the "MFA Compliance Rate Per Country" panel of the main dashboard. Please help me out with how to pass the dropdown value to the drilldown dashboard. Thanks, Abhineet Kumar
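A hedged Simple XML sketch of the usual pattern (the app name, dashboard path, and the business_unit token name are assumptions to adapt to your dashboards):

```xml
<drilldown>
  <link target="_blank">
    /app/my_app/country_compliance?form.country=$click.value$&amp;form.business_unit=$form.business_unit$
  </link>
</drilldown>
```

Here $click.value$ carries the clicked country bar, and $form.business_unit$ carries the current dropdown selection; the target dashboard needs inputs whose token names match (country and business_unit in this sketch).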
Hi, I have an alert query that uses mstats, and I want this query to not trigger the alert during public holidays (from 9 AM to 5 PM). I have created a lookup, holidays.csv, with columns "Date" and "Description". How can I use this lookup together with the existing mstats command to check the date and time against the lookup file, and, if the time falls in that range on a listed date, not trigger the alert (or skip the search entirely)? Thanks in advance. Lookup file:
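A hedged sketch of one pattern (the mstats line is a placeholder for the real query, and it assumes the lookup's Date column is formatted %Y-%m-%d):

```spl
| mstats avg(_value) AS value WHERE index=my_metrics metric_name="cpu.percent" span=5m
| eval Date=strftime(_time, "%Y-%m-%d"), hour=tonumber(strftime(_time, "%H"))
| lookup holidays.csv Date OUTPUT Description
| where NOT (isnotnull(Description) AND hour >= 9 AND hour < 17)
```

Rows that fall between 9 AM and 5 PM on a date present in holidays.csv are dropped before the alert condition is evaluated, so the alert never fires for them.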
I have a search result with 2 columns: country_name and bytes of data transferred. How can I create a map visualization from this that shows how many bytes were transferred to each country? Thanks
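A hedged sketch using the built-in geo_countries geospatial lookup with a choropleth map (assuming the country_name values match the lookup's country feature names, e.g. "United States"; the field names are taken from the question):

```spl
... | stats sum(bytes) AS total_bytes BY country_name
| geom geo_countries featureIdField=country_name
```

With the geom output in place, selecting the Choropleth Map visualization should shade each country by total_bytes.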
index="jenkins_console" source="*-deploy/*" NOT (source="*/gremlin-fault-injection-deploy/*" OR source="*pipe-test*" OR source="*java-validation-*") ("Approved by" OR "*Finished:*")
| fields source
| stats count(eval(match(_raw, "Approved by"))) as count_approved, count(eval(match(_raw, ".*Finished:*."))) as count_finish by source
| where count_approved > 0 AND count_finish > 0
| stats dc(source) as Total
| appendcols
    [ search(index="jenkins_console" source="*-deploy/*" NOT (source="*/gremlin-fault-injection-deploy/*" OR source="*pipe-test*" OR source="*java-validation-*") ("Finished: UNSTABLE" OR "Finished: SUCCESS" OR "Approved by" OR "Automatic merge*" OR "pushed branch tip is behind its remote" OR "WARNING: E2E tests did not pass"))
    | fields source host
    | stats count(eval(match(_raw, "Approved by"))) as count_approved, count(eval(match(_raw, "Finished: SUCCESS"))) as count_success, count(eval(match(_raw, "Finished: UNSTABLE"))) as count_unstable, count(eval(match(_raw, "Automatic merge.*failed*."))) as count_merge_fail, count(eval(match(_raw, "WARNING: E2E tests did not pass"))) as count_e2e_failure, count(eval(match(_raw, "pushed branch tip"))) as count_branch_fail by source, host
    | where count_approved > 0 AND (count_success > 0 OR (count_unstable > 0 AND (count_merge_fail > 0 OR count_branch_fail > 0 OR count_e2e_failure > 0)))
    | stats dc(source) as success ]
| stats avg(success) as S, avg(Total) as T
| eval percentage=( S / T * 100)
| fields percentage, success, Total
The o365 add-on has been running fine. The token expired on the Azure side, so I generated a new one. Updating the Splunk add-on gives me the error "Only letters, numbers and underscores are supported." and highlights the Tenant Subdomain and Tenant Data Center fields (see attachment). I can't complete the update without values in these fields. Not sure what to do here.
Hi, I'm struggling to confirm in the docs whether this is permitted or not. I'm working on a TA for Netgear Wi-Fi; the log format is not brilliant to work with, but I want to extract the SSID (Wi-Fi network name). There are two formats of log containing this, so I have written EXTRACT-ssid and EXTRACT-wifi_join_leave_ssid:

Wi-Fi/default/props.conf  EVAL-src_mac = bssid
Wi-Fi/default/props.conf  EXTRACT-bssid = \"bssid\"\:\"(?<bssid>\w+\-\w+\w+\-\w+\-\w+\-\w+\-\w+)"
Wi-Fi/default/props.conf  EXTRACT-ssid = \"ssid\"\:\"(?<ssid>.*?)"
Wi-FI/default/props.conf  EXTRACT-wifi_join_leave_ssid = (disconnected\sfrom\s|connected\sto\s)(?<ssid>.+?)(?: with an RSSI|}$)

Both these extractions appear to work just fine at search time, which really surprised me; I was obsessing over trying to combine a long regex with an OR. I've obviously referred to https://docs.splunk.com/Documentation/Splunk/9.1.1/Admin/Propsconf, which makes it clear that the CLASS must be unique (no problem), but the capture group name gets no mention.
Recently we upgraded Splunk from version 8.0.5 to 9.1.1. After the upgrade we get the message "Failed to load source for Wordcloud visualization" when we select the Wordcloud visualization. We use the most recent version, 1.11 (Wordcloud Custom Visualization | Splunkbase), which should be compatible with version 9.1 according to Splunkbase. Does anyone have the same issue or know a solution for this?
Hello, good day. I am very new to Splunk. My team and I want to work on a mini project using Splunk Cloud with the topic "Splunk Enterprise: An organization's go-to in detecting cyberthreats". How/where can I easily get datasets/logs that I can use in Splunk for monitoring and analysis? And what is the best way for us to go about this topic?
Hi, we have a Splunk Cloud instance, and a few of our systems don't have an out-of-the-box add-on, so we decided to try to get data via API. However, our instance doesn't have any API data inputs, nor can we find any way to create an input of our own. We tried to install the Add-on Builder app, but the installation fails every time. Is there any way to create our own add-on, or a way to get Splunk to pull data via API?
We have a few instances hosted in AWS that are extremely underutilized (single-digit average CPU% for the past 3 months). The AWS Compute Optimizer has recommended the following changes to the instances:

Current Instance Type | Recommended Instance Type
c4.4xlarge | r6i.xlarge
c4.8xlarge | r6i.2xlarge and r6i.xlarge
c5.2xlarge | r6i.large, r6i.xlarge, t3.medium, t3.small
c5.4xlarge | r6i.2xlarge
c5.9xlarge | r6i.4xlarge
c5.xlarge | r6i.large
t3.medium | t3.large
t3.micro | t3.medium

We noticed that most of the recommendations are about replacing 'compute-optimized' instances with new-generation 'memory-optimized' instances. This also reduces the CPU core count.

Question: can we consider and replace the instances based on these recommendations?
This Splunk search is not showing any results:

index=os OR index=linux sourcetype=vmstat OR source=iostat [| input lookup SEI-build_server_lookup.csv where platform=eid_rhel6 AND where NOT (role-code-sonar) | fields host | format ]
| rex field=host (?<host>\w+)?\..+"
| timechart avg(avgWaitMillis)
| eval cores=4
| eval loadAvg1mipercore=loadAvg1mi/cores
| stats avg(loadAvg1mipercore) as load by host

Please help me correct my search.
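A hedged guess at a corrected version (inputlookup is one word; the rex expression needs quoting; the role field name and where-clause syntax need checking against the actual lookup; and the timechart avg(avgWaitMillis) step was dropped because timechart discards the loadAvg1mi field the later evals depend on):

```spl
index=os OR index=linux sourcetype=vmstat OR source=iostat
    [| inputlookup SEI-build_server_lookup.csv where platform=eid_rhel6 AND NOT role=code-sonar
    | fields host
    | format ]
| rex field=host "(?<host>\w+)(?:\..+)?"
| eval cores=4
| eval loadAvg1mipercore=loadAvg1mi/cores
| stats avg(loadAvg1mipercore) AS load BY host
```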
Hi, I am looking to parse nested JSON events; basically, I need to break them into multiple events. I am trying something like this, but it just duplicates the same record across multiple lines.

| spath path=list.entry{}.fields output=items
| mvexpand items

I am looking to get all key/value pairs as a single event under "fields".

Sample records:

{
  "total": 64,
  "list": {
    "entry": [
      {
        "recordId": 7,
        "created": 1682416024092,
        "id": "e70dbd86-53cf-4782-aa84-cf28cde16c86",
        "fields": {
          "NumDevRes001": 11111,
          "NumBARes001": 3,
          "lastUpdated": 1695960000000,
          "engStartDate": 1538452800000,
          "RelSupport001": 0,
          "UnitTest001": 0,
          "Engaged": 1,
          "ProdGroup001": 1,
          "QEResSGP001": 0.5,
          "QEResTOR001": 1,
          "QEResLoc001": 3,
          "SITBugs001": 31,
          "QEResIND001": 5,
          "QEResLoc003": 3,
          "QEResLoc002": 3,
          "Project": "Registration Employee Directory Services",
          "AutoTestCount001": 1657,
          "AppKey001": "ABC"
        },
        "ownedBy": "TEST1"
      },
      {
        "recordId": 8,
        "createdBy": "TEST2",
        "created": 1682416747947,
        "id": "91e88ae6-0b64-48fc-b8ed-4fcfa399aa3e",
        "fields": {
          "NumDevRes001": 22222,
          "NumBARes001": 3,
          "lastUpdated": 1695960000000,
          "engStartDate": 1538452800000,
          "RelSupport001": 0,
          "UnitTest001": 0,
          "Engaged": 1,
          "ProdGroup001": 1,
          "QEResSGP001": 0.5,
          "QEResTOR001": 1,
          "QEResLoc001": 3,
          "SITBugs001": 31,
          "QEResIND001": 5,
          "QEResLoc003": 3,
          "QEResLoc002": 3,
          "Project": "Registration Employee Directory Services",
          "AutoTestCount001": 1657,
          "AppKey001": "ABC"
        },
        "ownedBy": "TEST2"
      }
    ]
  }
}
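A hedged sketch of one common pattern for this: expand at the entry level first, then re-parse the expanded fragment so each event gets only its own fields (paths taken from the sample above):

```spl
| spath path=list.entry{} output=entry
| mvexpand entry
| spath input=entry path=fields output=fields_json
| spath input=fields_json
| fields - entry, fields_json
```

Expanding list.entry{}.fields directly tends to flatten all entries' values into shared multivalue fields, which is why the records appear duplicated; expanding the whole entry object first keeps each record intact.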
Hello, I'm trying to calculate the ratio of two fields, but I'm getting wrong results. If I calculate each of them separately I get the right results, but together something is wrong.

index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" AND "Request.actionUrl"!="*staging*"
| eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE",match(UserAgent, ".*android.*"), "Android FE",1=1, "Web FE")
| dedup UserAgent, _time
| stats count as AttemptsFE by UserAgent _time
| appendcols
    [search index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" AND "Request.actionUrl"!="*staging*" "Request.status" IN (201, 207) NOT "Request.data.twoFactor.otp.expiresInMs"="*"
    | eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE",match(UserAgent, ".*android.*"), "Android FE",1=1, "Web FE")
    | dedup UserAgent, _time
    | streamstats count as SuccessFE by UserAgent _time]
| eval SuccessRatioFE = round((SuccessFE/AttemptsFE)*100, 2)
| eval SuccessRatioFE = (SuccessFE/AttemptsFE)*100
| timechart bins=100 avg(SuccessRatioFE) as SuccessRatioFE BY UserAgent
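Since appendcols aligns rows by position rather than by UserAgent/_time, a single search that flags successes and computes the ratio in one pass may be more reliable. A hedged sketch of that approach (the eval-expression aggregation inside timechart and the exact field names would need verifying against your data):

```spl
index=clientlogs sourcetype=clientlogs Categories="*networkLog*" "Request.url"="*v3/auth*" Request.url!=*twofactor* "Request.actionUrl"!="*dev*" "Request.actionUrl"!="*staging*"
| eval UserAgent = case(match(UserAgent, ".*ios.*"), "iOS FE", match(UserAgent, ".*android.*"), "Android FE", 1=1, "Web FE")
| dedup UserAgent, _time
| eval success = if(match('Request.status', "^(201|207)$") AND isnull('Request.data.twoFactor.otp.expiresInMs'), 1, 0)
| timechart bins=100 eval(round(sum(success)/count(success)*100, 2)) AS SuccessRatioFE BY UserAgent
```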
I'm planning to set up an integration between Splunk and the ESET endpoint security cloud platform, but I'm facing the following issue: the syslog-ng server is receiving unreadable/encrypted logs from ESET endpoint security, so the logs appear on the HF server like this:

^A^B  ^L 7 ^] ^W  ^^  ^Y  ^X # ^W (^D^C^E^C^F^C^H^G^H^H^H ^H 2

I think I need to decrypt the logs when they are received by syslog-ng, because Splunk can't handle any decryption process. I need help with how to decrypt the logs in syslog-ng.
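If the sender is speaking TLS and the syslog-ng source is a plain tcp()/network() listener, the TLS handshake bytes show up as exactly this kind of binary garbage, so the usual fix is to terminate TLS at syslog-ng rather than "decrypt" afterwards. A hedged syslog-ng config sketch (port number, certificate paths, and the peer-verify level are assumptions; ESET would then be pointed at this TLS port):

```conf
source s_eset_tls {
    network(
        ip("0.0.0.0")
        port(6514)
        transport("tls")
        tls(
            key-file("/etc/syslog-ng/tls/server.key")
            cert-file("/etc/syslog-ng/tls/server.crt")
            peer-verify(optional-untrusted)
        )
    );
};
```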
Trying to find anomalies for events. I have multiple services and multiple customers, and an error "bucket" that is capturing events for failures, exceeded, notified, etc. I'm looking for a way to identify when there are anomalies or outliers for each of the service/customer combinations. I have combined (eval) service, customer, and the error, and I am just counting the number of error events generated by each service/customer. So, for example, services svcA, svcB, svcC and customers custA, custB, custC would give:

svcA-custA-failures 10
svcA-custA-exceeded 5
svcA-custA-notified 25
svcB-custA-failures 11
svcB-custA-exceeded 9
svcB-custA-notified 33
svcB-custB-failures 3
svcA-custB-exceeded 7
svcA-custB-notified 22
svcA-custC-exceeded 8
svcA-custC-failures 3
svcA-custC-notified 267
svcC-custC-exceeded 1
svcC-custC-failures 4
svcC-custB-notified 145
svcC-custA-notified 17

Something along the lines of this:

| eval Svc-Cust-Evnt=Svc."-".Cust."-".Evnt
| stats sum(error) by Svc-Cust-Evnt
| rename sum(error) as count
| sort -count
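A hedged sketch of one way to flag outliers on those counts (the 1-hour span and the 2-sigma threshold are arbitrary choices to tune; underscores are used in the combined field name because hyphens would need quoting in SPL):

```spl
... | eval Svc_Cust_Evnt=Svc."-".Cust."-".Evnt
| bin _time span=1h
| stats count AS errors BY _time, Svc_Cust_Evnt
| eventstats avg(errors) AS avg_errors, stdev(errors) AS sd_errors BY Svc_Cust_Evnt
| where errors > avg_errors + 2 * sd_errors
```

Each combination gets its own baseline, so a count that is normal for svcA-custC-notified (hundreds) does not mask a spike in a combination that usually sees single digits.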
I have a search and subsearch that is working as required, but there is a field in the subsearch that I want to display in the final table output that is not a field to be searched on.

index=aruba sourcetype="aruba:stm" "*Denylist add*" OR "*Denylist del*"
| eval stuff=split(message," ")
| eval mac=mvindex(stuff,4)
| eval mac=substr(mac,1,17)
| eval denyListAction=mvindex(stuff,3)
| eval denyListAction= replace (denyListAction,":","")
| eval reason=mvindex(stuff,5,6)
| search mac="*:*"
    [ search index=main host=thestor Username="*adgunn*"
    | dedup Client_Mac
    | eval Client_Mac = "*" . replace(Client_Mac,"-",":") . "*"
    | rename Client_Mac AS mac
    | fields mac ]
| dedup mac,denyListAction,reason
| table _time,mac,denyListAction,reason

What I want is for the value held in the field Username to be included in the table command of the outer search. How do I pass it from the subsearch to be used in the table command and not used as part of the search? Thanks.
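One hedged way to bring Username through is to replace the subsearch filter with a join on the normalized MAC (this assumes the MAC values match exactly, including case, once the dashes are converted to colons; if case differs, lower() on both sides would be needed):

```spl
index=aruba sourcetype="aruba:stm" "*Denylist add*" OR "*Denylist del*"
| eval stuff=split(message," ")
| eval mac=substr(mvindex(stuff,4),1,17)
| eval denyListAction=replace(mvindex(stuff,3),":","")
| eval reason=mvindex(stuff,5,6)
| search mac="*:*"
| join type=inner mac
    [ search index=main host=thestor Username="*adgunn*"
    | dedup Client_Mac
    | eval mac=replace(Client_Mac,"-",":")
    | fields mac Username ]
| dedup mac, denyListAction, reason
| table _time, mac, denyListAction, reason, Username
```

The join keeps only denylist events whose MAC appears in the inner search (matching the original subsearch filter) while also attaching that event's Username for the table.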
Hi, I have a Simple XML dashboard. I want to be able to move the Export-to-PDF button (more of an HTML button) to the bottom of the dashboard in order to print the whole dashboard. Any easy way of doing this? Thank you, everyone!
Hi Splunkers. Assumptions: the HF we want to deploy should sit inside a DMZ network, the license master is outside the DMZ, and all necessary ports will be opened as required. Now the question: can the License Master to HF communication be one-way only (information flows only from LM to HF, not both ways; in other words, no HF-to-LM flow), or does LM-to-HF require two-way communication by default? Please advise, thanks.
Can anyone provide a link to Splunk Mission Control API documentation?   Thank you