All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Starter content for the ask-question category for the upcoming app. CISOs currently have difficulty demonstrating the value of the organization's security program to other executives as well as board members. Additionally, meaningful performance indicators do not exist to inform executive leadership and the board of the resilience of the organization's security posture, nor do indicators exist to provide focus on the "day-to-day" tasks that are a priority. Does Splunk have a Security Metrics app covering network-, endpoint-, and vulnerability-focused dashboards with the following?

1. Mean time to detect
2. Mean time to investigate
3. Mean time to respond
4. Trend of the mean time to detect/investigate/respond
5. % of incidents detected within the organization's SLAs
6. % of incidents investigated within the organization's SLAs
7. % of incidents responded to within the organization's SLAs
8. Trend of #5, 6, and 7
9. ES notables, last 24 hrs
10. Risk notables, last 24 hrs
11. Trend of #9 and 10
12. ES notables abandoned, last 24 hours
13. Risk notables abandoned, last 24 hrs
14. Trend of #12 and 13

#splunk_security_metrics
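For reference, even without a dedicated app, a rough detection-lag baseline can be pulled from the notable index. A minimal sketch, assuming a default Enterprise Security setup where notables live in index=notable; this approximates the lag from event time to notable creation per correlation search, not a full MTTD/MTTR, which would also need incident-review status data:

index=notable
| eval detect_lag = _indextime - _time
| stats avg(detect_lag) AS avg_secs perc90(detect_lag) AS p90_secs BY search_name
| eval avg_mins = round(avg_secs/60, 1)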
How do I grab all of the versions of Splunk EXCEPT the top 1? Basically the opposite of:

index=winconfig sourcetype="WMIC:InstalledProduct" Name="*UniversalForwarder*"
| top limit=1 Version
| table Version

It would be nice if there were a top limit=-1 option. Or, how do I negate a subsearch?

index=winconfig sourcetype="WMIC:InstalledProduct" Name="*UniversalForwarder*"
    [search index=winconfig sourcetype="WMIC:InstalledProduct" Name="*UniversalForwarder*" | top limit=1 Version | table Version]
| dedup host, Version
| table host Name Version

I want to search for all computers with other versions of Splunk.
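One common pattern, sketched here untested against this data, is to prefix the subsearch with NOT so that its result is excluded instead of matched:

index=winconfig sourcetype="WMIC:InstalledProduct" Name="*UniversalForwarder*"
    NOT [search index=winconfig sourcetype="WMIC:InstalledProduct" Name="*UniversalForwarder*"
        | top limit=1 Version
        | fields Version]
| dedup host, Version
| table host Name Version

The subsearch renders its rows as a Version="..." filter, so NOT in front of it keeps every host whose Version differs from the top one.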
VERY new to Splunk. I have a query that scans a vulnerability report for critical vulnerabilities:

index=vulnerability severity=critical
| eval first_found=replace(first_found, "T\S+", "")
| eval first_found_epoch=strptime(first_found, "%Y-%m-%d")
| eval last_found=replace(last_found, "T\S+", "")
| eval last_found_epoch=strptime(last_found, "%Y-%m-%d")
| eval last_found_65_days=relative_time(last_found_epoch, "-65d@d")
| fieldformat last_found_65_days_convert=strftime(last_found_65_days, "%Y-%m-%d")
| where first_found_epoch>last_found_65_days
| sort -first_found
| dedup cve
| rename severity AS Severity, first_found AS "First Found", last_found AS "Last Found", asset_fqdn AS Host, ipv4 AS IP, cve AS CVE, output AS Description
| streamstats count as "Row #"
| table Severity, "First Found", "Last Found", Host, IP, CVE, Description, Reason

Which gives me output similar to this:

critical 2023-10-11 2023-11-20 host1.example.com 192.168.101.12 CVE-2021-0123 blah blah blah
critical 2023-03-25 2023-11-20 host2.example.com 192.168.101.25 CVE-2022-0219 blah blah blah
critical 2023-06-23 2023-11-20 host3.example.com 192.168.101.102 CVE-2023-0489 blah blah blah
critical 2023-08-05 2023-11-20 host4.example.com 192.168.101.145 CVE-2023-0456 blah blah blah

I also have a .csv lookup file where I keep extra information on certain hosts:

ScanHost                 ScanIP          target-CVE      Reason
host2.example.com        192.168.101.25  CVE-2022-0219   CVE can not be mitigated

What I'm trying to do is take the Host from the search and, if it matches a ScanHost in the CSV, fill in the Reason field from the .csv.
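If the CSV is uploaded as a lookup table file, the lookup command can do this join directly; it needs to run before the final table so Reason is populated. A minimal sketch in which the file name scan_reasons.csv is an assumption:

...
| rename severity AS Severity, first_found AS "First Found", last_found AS "Last Found", asset_fqdn AS Host, ipv4 AS IP, cve AS CVE, output AS Description
| lookup scan_reasons.csv ScanHost AS Host OUTPUT Reason
| table Severity, "First Found", "Last Found", Host, IP, CVE, Description, Reason

If a host can appear with several CVEs, matching on "target-CVE" AS CVE as a second lookup field keeps the Reason row-specific.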
Hello Members, I would like to import/show data in a Splunk dashboard. This data is the result of a MySQL query run by PHP to create an HTML page with the results in an HTML table. Most likely the easiest way to do this would be to write the data to a CSV file and use a Splunk forwarder to send the data to Splunk. The data needs to be checked once a day. I was wondering if there is a way to build a dashboard from the data via the Splunk REST API, or to import/forward the HTML page that gets created from the MySQL query. The query is run on a remote server. I looked at splunk-sdk-python but its implementation is not user friendly; it requires Docker, which I cannot get running for some reason. I am open to any and all suggestions. Thanks, eholz
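For the CSV route, a monitor stanza on the forwarder is usually all that is needed. A minimal inputs.conf sketch, where the path, index name, and sourcetype are assumptions:

[monitor:///var/www/exports/mysql_results.csv]
index = mysql_data
sourcetype = csv
disabled = 0

With the pretrained csv sourcetype, Splunk takes field names from the header row; since the file is regenerated daily, writing a new timestamped file each day avoids the forwarder mistaking the rewrite for already-seen content.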
Hello, is it possible to configure Splunk to receive a webhook with some information added to it, and if it is, can you give me a link to a tutorial?
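The usual mechanism for inbound HTTP payloads is the HTTP Event Collector (HEC), which listens on port 8088 by default; most webhook senders can post JSON straight to its /services/collector endpoint with the token in an Authorization header. A minimal inputs.conf sketch of the token stanza, where the token name, index, and sourcetype are assumptions (tokens are normally created through Settings > Data Inputs, which generates the token value):

[http://webhook_token]
disabled = 0
index = webhook_data
sourcetype = _json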
We have recently set up SAML authentication on our Splunk search head, which will be accessed by our vendor using SSO authentication. I wanted to enquire whether LDAP authentication can also be enabled, local to my team? Also, what if SAML authentication or group mapping on our IdP (Azure AD) breaks at some point and we are not able to get into Splunk? Is there, or can we enable, a local admin login on the Splunk search head that will be managed by our Splunk admin?
When performing a query that creates a summary report, the associated search.log file shows:

ResultsCollationProcessor - writing remote_event_providers.csv to disk.

Then, two hours later, it reports:

StatsBuffer::read: Row is too large for StatsBuffer, resizing buffer. row_size=77060 needed_space=11536 free_space=153653063

This is soon followed by lots of roughly minute-by-minute messages of the form:

SummaryIndexProcessor - Using tmp_file=/opt/splunk/..../RMD....tmp

What might be happening in that two-hour window? We are running Splunk Enterprise 9.1.1 under Linux. @koronb_splunk @C_Mooney
I would like to find a way to list the dependencies between dashboards and indexes. I'm using the following query to get the list of all the dashboards using the index "oracle", which is an event index:

| rest splunk_server="local" "/servicesNS/-/-/data/ui/views"
| search "eai:data"="*index=oracle*"
| eval Type="Dashboards"
| table Type title eai:acl.app author eai:acl.perms.read

This query works fine, but not for a metrics index. Am I missing something?
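One thing worth checking: this is a plain string match on the dashboard XML, and metrics panels usually reference the index inside an mstats clause, sometimes with quotes around the name, which the pattern above would miss. A hedged sketch that broadens the match:

| rest splunk_server="local" "/servicesNS/-/-/data/ui/views"
| search "eai:data"="*index=oracle*" OR "eai:data"="*index=\"oracle\"*"
| eval Type="Dashboards"
| table Type title eai:acl.app author eai:acl.perms.read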
Hey Splunkers, I wanted to get a list of all the lookup files on my SH and their file sizes, along with other data. I can't get the size from the REST API. I'd appreciate any and all answers. Below are the searches I've been trying to use:

| rest /servicesNS/-/-/data/lookup-table-files
| rename "eai:acl.app" as app, "eai:acl.owner" as owner, "eai:acl.sharing" as sharing, "eai:data" as path
| table title owner sharing app
| foreach title
    [| inputlookup <<FIELD>>
     | foreach * [| eval b_<<FIELD>>=len(<<FIELD>>) + 1]
     | addtotals b_* fieldname=b
     | stats sum(b) as b
     | eval mb=b/1000/1000, gb=mb/1000
     | fields mb]

foreach does not allow non-streaming commands, so this does not work. Using a direct eval like below:

| rest /servicesNS/-/-/data/lookup-table-files
| rename "eai:acl.app" as app, "eai:acl.owner" as owner, "eai:acl.sharing" as sharing, "eai:data" as path
| table title owner sharing app
| eval size= [| inputlookup 'title'
    | foreach * [| eval b_<<FIELD>>=len(<<FIELD>>) + 1]
    | addtotals b_* fieldname=b
    | stats sum(b) as b
    | eval mb=b/1000/1000, gb=mb/1000
    | fields mb]

This also does not work, since the inner search cannot see the outer values. I have been trying to work with subsearches, foreach, and the map command but couldn't get anywhere. Thanks in advance, folks.
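Since map does per-row token substitution, it can run one inputlookup per file where foreach cannot. A sketch along those lines; it sums character counts rather than true bytes on disk, and maxsearches must cover the number of lookup files:

| rest splunk_server=local /servicesNS/-/-/data/lookup-table-files
| fields title
| map maxsearches=500 search="| inputlookup $title$
    | foreach * [| eval b_<<FIELD>>=len(<<FIELD>>) + 1]
    | addtotals b_* fieldname=b
    | stats sum(b) as bytes
    | eval lookup=\"$title$\", mb=round(bytes/1000/1000, 3)"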
Hi, the BeyondTrust log fields are not getting extracted. I tried both index-time and search-time field extraction. Below are the sample logs:

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "106",Event Type: "0",User: "VPN-OTSA-EDMS-HANU",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200680",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "106",CreateDate: "12/5/2023 1:11:34 PM",UserName: "VPN-OTSA-EDMS-HANU",IPAddress: "192.168.251.35",Mapped Credential: "Primary",Mapped Credential Id: "2",Mapped Credential Description: "OFID-PS-Usersync",Mapped Credential Platform: "ActiveDirectory",Mapped Credential Domain/Server: "opecfund.org",Authenticate Credential Id: "2",Authenticate Credential UserName: "opecfund.org\OFID-PS-Usersync@opecfund.org",Authenticate Credential Description: "OFID-PS-Usersync",Authenticate Credential Platform: "ActiveDirectory",Domain Name: "opecfund.org",SAM Account Name: "VPN-OTSA-EDMS-HANU",Group: "opecfund.org\OFID-BTPRAPS-Vendor",Authentication Type: "Active Directory via API"

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "38",Event Type: "0",User: "SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200678",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "38",CreateDate: "12/5/2023 1:11:23 PM",UserName: "SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Authentication Type: "API"

Dec 5 13:12:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API Authentication Rule Failure",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "-1",Event Type: "0",User: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200677",ActionType: "Login",SystemName: "PMM API Authentication Rule Failure",AppUserID: "-1",CreateDate: "12/5/2023 1:11:23 PM",UserName: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Host Name: "SVR-BTPS01",User Name: "opecfund.org\SA-BTPSGlobalRequestor4SRAAPI",API Key: "****************************************************************************************************************************4416",IP Address: "192.168.251.35",Authentication Rule: "API Key",Message: "Invalid RunAs - UserNameOrPasswordAreIncorrect"

Dec 5 13:11:06 192.168.251.35 Agent Desc: "",Agent ID: "AppAudit",Agent Ver: "",Category: "PMM API SignAppIn",Source Host: "",Event Desc: "",Event Name: "Login",OS: "",Event Severity: "0",Source IP: "192.168.251.35",Event Subject: "38",Event Type: "0",User: "SA-BTPSGlobalRequestor4SRAAPI",Workgroup Desc: "",Workgroup ID: "",Workgroup Location: "",AuditID: "2200675",ActionType: "Login",SystemName: "PMM API SignAppIn",AppUserID: "38",CreateDate: "12/5/2023 1:10:28 PM",UserName: "SA-BTPSGlobalRequestor4SRAAPI",IPAddress: "192.168.251.35",Authentication Type: "API"
Can someone help here? Below are the props and transforms I tried (note that REPORT- extractions are applied at search time, and the [keyvalue] stanza belongs in transforms.conf).

props.conf:

[beyondtrust]
KV_MODE = none
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(\w{3} \d{1,2} \d{2}:\d{2}:\d{2} \d+\.\d+\.\d+\.\d+)
NO_BINARY_CHECK = true
REPORT-keyvaluepairs = keyvalue

transforms.conf:

[keyvalue]
REGEX = (?:,|\s)([A-Za-z][\w ./]*?):\s*"([^"]*)"
FORMAT = $1::$2
MV_ADD = true

Search-time field extraction:

[beyondtrust]
EXTRACT-AgentDesc = Agent Desc: "(?P<Agent_Desc>[^"]+)"
EXTRACT-AgentID = Agent ID: "(?P<Agent_ID>[^"]+)"
EXTRACT-AgentVer = Agent Ver: "(?P<Agent_Ver>[^"]+)"
EXTRACT-Category = Category: "(?P<Category>[^"]+)"
EXTRACT-SourceHost = Source Host: "(?P<Source_Host>[^"]+)"
EXTRACT-EventDesc = Event Desc: "(?P<Event_Desc>[^"]+)"
EXTRACT-EventName = Event Name: "(?P<Event_Name>[^"]+)"
EXTRACT-OS = OS: "(?P<OS>[^"]+)"
EXTRACT-EventSeverity = Event Severity: "(?P<Event_Severity>\d+)"
EXTRACT-SourceIP = Source IP: "(?P<Source_IP>[^"]+)"
EXTRACT-EventSubject = Event Subject: "(?P<Event_Subject>[^"]+)"
EXTRACT-EventType = Event Type: "(?P<Event_Type>\d+)"
EXTRACT-User = User: "(?P<User>[^"]+)"
EXTRACT-WorkgroupDesc = Workgroup Desc: "(?P<Workgroup_Desc>[^"]+)"
EXTRACT-WorkgroupID = Workgroup ID: "(?P<Workgroup_ID>[^"]+)"
EXTRACT-WorkgroupLocation = Workgroup Location: "(?P<Workgroup_Location>[^"]+)"
EXTRACT-AuditID = AuditID: "(?P<Audit_ID>\d+)"
EXTRACT-ActionType = ActionType: "(?P<Action_Type>[^"]+)"
EXTRACT-SystemName = SystemName: "(?P<System_Name>[^"]+)"
EXTRACT-AppUserID = AppUserID: "(?P<App_User_ID>[^"]+)"
EXTRACT-CreateDate = CreateDate: "(?P<Create_Date>[^"]+)"
EXTRACT-UserName = UserName: "(?P<UserName>[^"]+)"
EXTRACT-IPAddress = IPAddress: "(?P<IPAddress>[^"]+)"
EXTRACT-AuthenticationType = Authentication Type: "(?P<Authentication_Type>[^"]+)"
EXTRACT-HostName = Host Name: "(?P<Host_Name>[^"]+)"
EXTRACT-APIKey = API Key: "(?P<API_Key>[^"]+)"
EXTRACT-IPAddress2 = IP Address: "(?P<IP_Address2>[^"]+)"
EXTRACT-AuthenticationRule = Authentication Rule: "(?P<Authentication_Rule>[^"]+)"
EXTRACT-Message = Message: "(?P<Message>[^"]+)"
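As a quick sanity check before debugging props and transforms, the extract command can pull comma-delimited key-value pairs at search time. A sketch assuming the data is searchable under sourcetype=beyondtrust; keys containing spaces may still need the transforms-based approach, but this quickly shows whether the events arrive intact:

sourcetype=beyondtrust
| extract pairdelim="," kvdelim=":"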
Hello, I'm using Splunk 9.1.1. As the title says, I am looking for a way to check the CherryPy version bundled with Splunk. How can I check it? I look forward to hearing from you.
Hello community, I'm having a problem that's probably easy to solve, but I can't figure it out. I have a query against an index that contains alerts from Splunk OnCall, and I count each alert source (via the associated routingKey from OnCall) and its status (acknowledged or not):

`victorops_incidents`
| sort lastAlertTime desc
| dedup incidentNumber
| fields *
| search org="*" routingKey=** pagedPolicies{}.policy.name!=0_Reroute_alertes currentPhase!=RESOLVED
| eval currentPhase=case(like(currentPhase, "%UNACKED%"), "Non acquitté", like(currentPhase, "%ACKED%"), "En cours")
| eval routingKey=case(like(routingKey, "%routingcontrol-m%"), "Control-M", like(routingKey, "%dyn%"), "Dynatrace", like(routingKey, "%centreon%"), "Centreon", like(routingKey, "%servicepilot%"), "ServicePilot", like(routingKey, "%p_1%"), "P1")
| rename currentPhase as Etat, routingKey as Source
| chart count by Etat, Source
| sort - Etat

I have an almost perfect table which summarizes everything, but I am missing some information: I sometimes have a source which has not generated any alert, so it is absent from the table (in the screen below, I have the sources "Control-M", "Dynatrace" and "ServicePilot", but I am missing "Centreon" because the latter did not have any incidents in the time period). My question is the following: how do I make all the sources appear but display 0 when they have not had any alerts? Best regards, Rajaion
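One low-tech fix, sketched under the assumption that the set of sources is fixed and known in advance: force all the columns to exist after the chart, then zero-fill the gaps.

... existing search ...
| chart count by Etat, Source
| table Etat "Control-M" Centreon Dynatrace ServicePilot P1
| fillnull value=0 "Control-M" Centreon Dynatrace ServicePilot P1
| sort - Etat

table lists even the fields that don't exist yet as empty columns, and fillnull then turns those blanks into 0.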
Hi all, I want to extract fields from an event which is in JSON format:

INFO [processor: anchsdgeiskgcbc/5; event: 1-57d28402-9058-11ee-83b7-021a6f9d1f1c] : DETAILS: [ { "ERROR_MESSAGE": "\nError: abchdvshsuaajs.\n", "NUMBER": "123r57", "DB_TIMESTAMP": "2023-11-30" }, { "ERROR_MESSAGE": "\nError: ehwegagsuabajehss.\n", "NUMBER": "63638w82u", "DB_TIMESTAMP": "2023-11-30" },

and similarly we have more of these error objects in one event. Fields to be extracted: ERROR_MESSAGE, NUMBER, DB_TIMESTAMP.
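Since the JSON array sits behind a plain-text prefix, one sketch is to cut the array out with rex and hand it to spath, assuming each event carries the complete array:

... base search ...
| rex field=_raw "(?s)DETAILS:\s*(?<details>\[.+)"
| spath input=details path={} output=item
| mvexpand item
| spath input=item
| table ERROR_MESSAGE NUMBER DB_TIMESTAMP

path={} returns the array elements as a multivalue field, mvexpand gives one result row per error object, and the final spath extracts the three keys from each.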
Hi, I built an app; when I run the app's action in a playbook, I don't have an option to get the data results. I used action_result.add_data(), but it didn't seem to make a difference. How can I solve it?
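Two things commonly cause this: the ActionResult is never registered on the connector via add_action_result(), so add_data() writes into an object SOAR never sees, or the data paths are not declared in the action's output section of the app JSON, so the playbook editor has nothing to offer. A minimal handler sketch; the action and key names here are illustrative, not taken from your app:

import phantom.app as phantom
from phantom.action_result import ActionResult

def _handle_get_info(self, param):
    # Register the result object on the connector; data added to an
    # unregistered ActionResult is silently dropped
    action_result = self.add_action_result(ActionResult(dict(param)))
    # Each dict becomes one row under action_result.data.*
    action_result.add_data({"example_key": "example_value"})
    # A returned status finalizes the result for the playbook
    return action_result.set_status(phantom.APP_SUCCESS)

For the playbook editor to show the datapath, the app JSON's output list for that action also needs an entry such as {"data_path": "action_result.data.*.example_key", "data_type": "string"}.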
Hello all, I am going to upgrade Splunk to version 9.1.x. Inside my app I use a lot of JS scripts. When I'm performing the jQuery scan, I get the below error messages:

This /opt/splunk/etc/apps/biz_svc_insights/appserver/static/jQueryAssets/ExtHCJS.js is importing the following dependencies which are not supported or externally documented by Splunk: highcharts

This /opt/splunk/etc/apps/biz_svc_insights/appserver/static/node_modules/requirejs/bin/r.js is importing the following dependencies which are not supported or externally documented by Splunk: requirejs, logger

Can anyone please help me with this error? Any hints are appreciated. Kind regards, Rajkumar Reddi.
Hello, I am creating a dashboard (Simple XML) with a table panel as shown below. This is actually a dashboard for a telephony system, and the number of columns (and their names, of course) will change based on which agents are logged in at a time. For example:

at 9 AM: Queue, Agent 1, Agent 4, Agent 9
at 3 PM: Queue, Agent 1, Agent 4, Agent 5, Agent 11
at 1 AM: Queue, Agent 5, Agent 9, Agent 11

Now, in this table panel, I want to replace 1 with a green tick and 0 with a red cross in all the columns. Can you please suggest how this can be achieved? I have tried this using eval and replace, but as the columns are dynamic, I am unable to handle this. Thank you.

Edit: Sample JSON event:

{
  AAAA_PMC_DT: 05-Dec-2023 13:04:34
  Agent: Agent 1
  Block: RTAgentsLoggedIn
  Bound: in
  Queue(s):: Queue 1, Queue 3, Queue 4, Queue 5, Queue 7, Queue 10
}

SPL:

index="telephony_test" Bound=in Block=RTAgentsLoggedIn _index_earliest=-5m@m _index_latest=@s
| spath "Agent"
| spath "Queue(s):"
| spath "On pause"
| spath AAAA_PMC_DT
| fields "Agent" "Queue(s):" "On pause" AAAA_PMC_DT
| rename "Queue(s):" as Queue, "On pause" as OnPause, AAAA_PMC_DT as LastDataFetch
| eval _time=strptime(LastDataFetch,"%d-%b-%Y %H:%M:%S")
| where _time>=relative_time(now(),"-300s@s")
| where NOT LIKE(Queue,"%Outbound%")
| sort 0 -_time Agent
| dedup Agent
| eval Queue=split(Queue,", ")
| table Agent Queue
| mvexpand Queue
| chart limit=0 count by Queue Agent
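Since every non-Queue column produced by the chart is an agent name, foreach with a wildcard can rewrite them all without knowing the names in advance. A sketch appended to the existing search; the Unicode marks are placeholders for whatever green/red rendering the dashboard uses:

| chart limit=0 count by Queue Agent
| foreach "Agent *"
    [eval "<<FIELD>>" = case('<<FIELD>>'>0, "✔", '<<FIELD>>'==0, "✘")]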
Hello, I'm ingesting a .txt file into Splunk; however, while integrating the file, my events are breaking into single lines (not all events, but many of them). Attaching the log file in comments. Below is how my data appears in Splunk when I add this txt file. Is there any way I can set the starting and ending points of my events? I want my data to start from @ID and end at REMARK.

And if I use the regex "(@ID[\s\S]*?REMARK[\s\S]*?)(?=@ID|$)" while adding the data, many of my logs go missing; attaching a snapshot of that also. Not sure how to resolve this issue. Does anyone know how I can integrate this .txt file so that my events run from @ID to REMARK?
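Rather than matching the whole record with one regex, a props.conf sketch for record-oriented text like this breaks strictly before each @ID using a lookahead; the sourcetype name is an assumption:

[my_txt_records]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=@ID)
TRUNCATE = 0

Only the newlines in the first capture group are consumed, so everything from @ID through REMARK stays in one event, and TRUNCATE = 0 keeps long records from being cut off.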
How do I get a single table from this query with all the correlationIds together in one table?
Hi, I want to integrate AppDynamics into my Xamarin application. I created a trial account with AppDynamics. Is it possible to create an iOS user agent through a trial account? I am unable to get the EUM App Key for my trial account. Govind.
Hi Splunkers, I have a question about customizing a custom app. For a customer, we created a simple app with Splunk Add-on Builder to use as a "container": every customization we perform, such as correlation rules, reports and so on, is assigned to this app. So, in its first release, the app has no particular panels or features; let's say it just "exists". To be clearer: if I log in and open the app, what I see is this: and that's totally fine, since we did not perform any customizations. So now, the question is: if I want to include the search function inside this app, how can I achieve this? I mean, we want to avoid going to the Search and Reporting app whenever we need to perform a search; we would like to be able to perform searches inside our app. For now, we don't need panels with specific charts based on particular queries: we simply want to be able to use (if it is possible, of course) the Search and Reporting app/its functionality inside our app.
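A common way to achieve this is to expose the built-in search view through the app's navigation file, default/data/ui/nav/default.xml. A sketch to merge with whatever nav the app already has (the extra view names are the standard ones and can be dropped):

<nav search_view="search">
  <view name="search" default="true" />
  <view name="dashboards" />
  <view name="reports" />
  <view name="alerts" />
</nav>

After a restart or a debug/refresh, the app gets its own Search menu entry, and searches run in the app's namespace, so knowledge objects saved there stay with the app.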