All Topics

Hi all, how can we install the Anomali ThreatStream app via self-service in Splunk Cloud ES? What is the process for it? Thanks.
My query: index=nonjVs source=nonjavs | stats values(_time) as starttime values(_time) as endtime by empid

It displays in the format below, but I want to see my starttime and endtime as "%m-%d-%y %H:%M:%S":

Starttime 165575758474.67768  Endtime 16777894894.67788
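Epoch values like these can be formatted with strftime in an eval; a minimal sketch, assuming one start and one end per empid is wanted (min()/max() is substituted here for values() on that assumption):

```spl
index=nonjVs source=nonjavs
| stats min(_time) as starttime max(_time) as endtime by empid
| eval starttime=strftime(starttime, "%m-%d-%y %H:%M:%S")
| eval endtime=strftime(endtime, "%m-%d-%y %H:%M:%S")
```

strftime takes an epoch number and a format string, so it must run after the stats step that produces the numeric fields.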
Hi all, I need some support with ingesting .pcap files into Splunk. Another application already creates .pcap files in a directory on all of the needed machines, and they rotate well enough (I already have a UF there, as I am ingesting log files from these machines). I do not want or need Splunk to create new pcaps for me; I just want to transfer the files to Splunk and, hopefully, be able to analyze them there. I have read what I could find regarding the Stream app, and I have to say it is a bit inconsistent and unclear. My architecture is UFs -> HFs -> IDXs -> SHs. We have installed the Stream app on the SH, and TA_stream on the SH, indexers, and HFs; I have Stream installed on the UF as well. However, I still do not see an option for PCAPs in the Data Inputs menu of Settings. It is probably a lack of configuration on my side, but I cannot find documentation on how to configure the app on each of the nodes. Splunk is 9.0.2, on Linux. If someone has already gone through this challenge, I'd appreciate a piece of advice.
index=test sourcetype=csv source=prtg.csv host=prtg device=all "Down for"=* | rename "Down for" AS Downtime | eval "Downtime"=replace('Downtime',"d","") | dedup _raw | table Device, Downtime

Is there a way to only show devices with a downtime greater than 90 in that table?
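A where clause after the eval can do the numeric filter; a sketch on top of the search above, assuming Downtime is purely numeric once the "d" suffix is stripped (tonumber is added so the comparison is numeric rather than string):

```spl
index=test sourcetype=csv source=prtg.csv host=prtg device=all "Down for"=*
| rename "Down for" AS Downtime
| eval Downtime=tonumber(replace('Downtime',"d",""))
| dedup _raw
| where Downtime > 90
| table Device, Downtime
```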
Hello Splunkers!! Below is a screenshot of the visualization we need to achieve through Splunk. We need Ordertype on the Y-axis, and cross-aisle distance & time on the X-axis. Please let me know how I can do this with the timechart command.
Hi. I do not understand how I can see all available endpoints for an application. Currently I need the Splunk DB Connect app endpoints, but in the future I may need to know other apps' endpoints.
I am trying to experiment with Splunk to gather Windows logs from my computer. However, I do not see my client in "Forwarder Management", so I think I may have misconfigured the receiving indexer. I am trying to uninstall the Universal Forwarder so I can reinstall it. I am attempting to follow the Splunk documentation (Uninstall the universal forwarder - Splunk Documentation) but have been unsuccessful in uninstalling the forwarder. I have some screenshots to help explain my problem: the result when running the command msiexec /x splunkuniversalforwarder-<...>-x86-release.msi, and the SplunkForwarder service in my Services menu, which I believe shows that the Universal Forwarder does exist on my device. The other screenshots are from when I attempt to uninstall the Universal Forwarder; the second one should show that the service does exist and is not running at the moment (and yes, even when it is running I still don't see it in "Forwarder Management"). If anyone has any advice and/or direction on what I should do, it would be greatly appreciated. Thank you.
I have a free Splunk Cloud dev instance where I am trying to create a custom alert action, but it seems to me the 'edit_alert_actions' capability is missing. Could you help me with how to create a custom alert action in a dev instance? Or is it not supported at all in the dev version?
Hello. Using the below query, I am trying to build a response:

index=my_index openshift_cluster="cluster009" sourcetype=openshift_logs openshift_namespace=my_ns openshift_container_name=container | search ("POST /myvalue/mytoken/v1?s1=SSO HTTP" OR "POST /myvalue/mytoken/v1?s1=LOYALTY HTTP") | eval Operations=case(searchmatch("POST /myvalue/mytoken/v1?s1=SSO HTTP"),"type_SSO",searchmatch("POST /myvalue/mytoken/v1?s1=LOYALTY HTTP"),"type_LOYALTY") | stats avg(processDuration) as average perc90(processDuration) as response90 by Operations | eval average=round(average,2),response90=round(response90,2)

Expected output:

Operations | average | response90
type_LOYALTY | 212 | 888
type_SSO | 300 | 442

The above search does not return any table data. I am sure it's due to the question-mark character present in searchmatch not being handled as it should be, because if I 1) do a plain search, I get the events returned:

| search ("POST /myvalue/mytoken/v1?s1=SSO HTTP" OR "POST /myvalue/mytoken/v1?s1=LOYALTY HTTP")

or 2) remove "by Operations" from the query, it returns the average and response90 table data. Can someone help me with how to handle this?
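One way to sidestep searchmatch's search-syntax parsing of the question mark is to classify with match() on _raw instead, since match() takes a plain regex; a sketch, assuming the raw event contains the literal request line (note the escaped ? in the patterns):

```spl
index=my_index openshift_cluster="cluster009" sourcetype=openshift_logs openshift_namespace=my_ns openshift_container_name=container
| eval Operations=case(
    match(_raw, "POST /myvalue/mytoken/v1\?s1=SSO HTTP"), "type_SSO",
    match(_raw, "POST /myvalue/mytoken/v1\?s1=LOYALTY HTTP"), "type_LOYALTY")
| stats avg(processDuration) as average perc90(processDuration) as response90 by Operations
| eval average=round(average,2), response90=round(response90,2)
```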
I need to count the number of times an alert has triggered in a specific time window (say, the last 24 hours). I am trying to do that via | rest, but noticed the counts remain constant despite changing the search time interval (60m, yesterday, last 7d, 15m, etc.). What "time" does the | rest search return results for? I tried reading the docs on rest and the user manual for the REST API, but nothing quite explains it. Current SPL:

| rest /services/alerts/fired_alerts splunk_server=local | search author="me@me.com" | table eai:acl.app eai:acl.owner id title triggered_alert_count | rename eai:acl.* as *, app as App, owner as Owner, id as Endpoint, title as Title, triggered_alert_count as "Count of Triggered Alerts"
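The rest command queries the endpoint's current state rather than events, which is why the time picker has no effect on it. One time-respecting alternative is to count trigger events from the scheduler's internal logs; a sketch, assuming access to the _internal index and that the alert runs as a saved search (field names should be verified against your environment):

```spl
index=_internal sourcetype=scheduler status=success alert_actions=* savedsearch_name="My Alert Name"
| stats count as triggered_count
```

"My Alert Name" is a placeholder; with the time picker set to the last 24 hours, the count should reflect that window.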
I'm trying to configure this add-on in our environment. Could you please tell me whether we need to pay for this product, or is it free to download and install? I'm asking because when I try to download and install the add-on from the Splunk Cloud console, it says "Unable to install app." after I enter my Splunk credentials.
I want to use drilldowns on a "Network Diagram Viz" custom visualization in a dashboard. I want to show / hide panels depending on whether a node or a link has been clicked in the graph. So I finally need two tokens, $is_node$ and $is_link$, to drive my <panel depends=...> clauses. I tried various ideas with no success so far, and most of them seem rather clumsy for the small task. Attempt 1: Splunk Drilldowns -> $row.<fieldname>$: "Available fields depend on whether a node or link was double clicked: Double clicking nodes: row.from, row.value, row.type, row.color Double clicking links: row.from, row.to, row.linkColor, row.linkText, row.linkWidth" Problem: When clicking first a link and then a node, the link-related variables are not reset, so testing for isnull(row.to) does not help much. Attempt 2: Same approach, but using the visualization's own token, $nd_to_node_token$ "Node (to): The 'to' field of the selected link. Defaults to $nd_to_node_token$" Same problem as above. When clicking a node, it retains the value of a previously clicked link. Attempt 3: Using $click.value$ or $nd_value_token$ "Node or link text: The label value of the selected node, or the linktext of the selected link. Defaults to $nd_value_token$." Luckily, the "value" of my nodes is always a MAC address and "linktext" of my links is always a 1- or 2-digit number (datatype string). So I can use regular expressions to determine what is clicked. I tried:     <drilldown> <eval token="click_type">if(match($nd_value_token$, "^[0-9]{1,2}$"), "is_link", "is_node")</eval> </drilldown>     ... It works, but this only gives me one token ($click_type$) with two possible values, but I need above mentioned two separate tokens. 
I also tried (following this question):

    <panel> <viz type="network-diagram-viz.network-diagram-viz"> <search>...</search> <drilldown> <condition match="match($click.value$, &quot;^\d{1,2}$&quot;)"> <set token="is_link">true</set> <unset token="is_node"></unset> </condition> <condition match="WHAT_TO_PUT_HERE?"> <unset token="is_link"></unset> <set token="is_node">true</set> </condition> </drilldown> </viz> </panel>

... but I wouldn't know how to specify the match=... clause in the else branch, and I doubt that using match() in the match=... clause works at all. If I try this, clicking anywhere in the Network Diagram Viz takes me to a Splunk search page, so obviously something's wrong. Any ideas?
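In Simple XML drilldowns, a <condition> element with no match attribute acts as the fallback branch, which may avoid having to express the "else" as a second regex at all; a sketch along those lines (untested against this particular custom viz):

```xml
<drilldown>
  <condition match="match($click.value$, &quot;^\d{1,2}$&quot;)">
    <set token="is_link">true</set>
    <unset token="is_node"></unset>
  </condition>
  <condition>
    <set token="is_node">true</set>
    <unset token="is_link"></unset>
  </condition>
</drilldown>
```

Having an explicit drilldown defined should also stop the default click behavior of jumping to a search page.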
Hello fellow Splunkers! I'm getting these results from my Splunk search but struggling to find a way to sum the last numbers in the results. In the example below, that would be (31 + 3 + 98 + 7 + 35) for a total count of 174, which I could display as a new field. I just started using Splunk and it will take some training, but I thought one of the experts out there might be able to help. Best regards and thanks!

index="logs" sourcetype="_json" | extract pairdelim="{,}" kvdelim=":" | fields message,robotName,timeStamp,Level,processName | search message="G3*Total Claims count is - *" processName="GroupClaimsDispatcher_GroupClaimsDispatcher" robotName="Unattended_Robot73" | table timeStamp,Level,processName,robotName,message | dedup message | sort -timeStamp

2023-04-17T16:45:41.1960125Z Info GroupClaimsDispatcher_GroupClaimsDispatcher Unattended_Robot73 G3 --- Total Claims count is - 31
2023-04-17T16:44:16.8150041Z Info GroupClaimsDispatcher_GroupClaimsDispatcher Unattended_Robot73 G3 --- Total Claims count is - 3
2023-04-17T10:00:44.2792246Z Info GroupClaimsDispatcher_GroupClaimsDispatcher Unattended_Robot73 G3 --- Total Claims count is - 98
2023-04-17T10:00:21.3532608Z Info GroupClaimsDispatcher_GroupClaimsDispatcher Unattended_Robot73 G3 --- Total Claims count is - 7
2023-04-17T09:59:20.2110636Z Info GroupClaimsDispatcher_GroupClaimsDispatcher Unattended_Robot73 G3 --- Total Claims count is - 35
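The trailing number can be extracted with rex and then summed; a sketch on top of the search above, assuming the count is always the last integer in message (eventstats, rather than stats, keeps the individual rows and adds the total as a new field on each):

```spl
index="logs" sourcetype="_json"
| extract pairdelim="{,}" kvdelim=":"
| search message="G3*Total Claims count is - *" processName="GroupClaimsDispatcher_GroupClaimsDispatcher" robotName="Unattended_Robot73"
| dedup message
| rex field=message "Total Claims count is - (?<claim_count>\d+)"
| eventstats sum(claim_count) as total_claims
| table timeStamp, Level, processName, robotName, message, total_claims
| sort -timeStamp
```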
Splunk Mission Control brings order to the chaos of your security operations by enabling your SOC to detect, investigate, and respond to threats from one modern and unified work surface. Here are some frequently asked questions to help you better adopt the product.

What is Mission Control?
Mission Control is a Splunk application that provides a unified, simplified, and modern security operations experience for your SOC. With Mission Control, you can unify detection, investigation, and response capabilities and data to take action based on prioritized insights; simplify operations by codifying your processes into response templates; and modernize your SOC with security automation (SOAR).

How can I access Mission Control?
The Mission Control app is automatically installed for you if you are an eligible user. You simply need to log in to Enterprise Security Cloud, go into the app selector, choose Mission Control, read through the info, and click "Enable".

Am I eligible to use Splunk Mission Control?
Currently, Mission Control is available for customers who own Enterprise Security (ES) in the Cloud and is deployed in the following AWS regions. This link will stay updated as MC is deployed in more regions.

What are the key functionalities Mission Control provides?
You can use Splunk Mission Control to triage, investigate, and respond to security incidents from a unified console integrated with Splunk Enterprise Security (Cloud). You can identify and remediate incidents while collaborating with others on your team.

What is the most common use case of Mission Control?
Performing an end-to-end Threat Detection, Investigation & Response (TDIR) workflow. Please check the demo for more details: Watch the Demo

What are the initial steps required to set up Mission Control?
- Enable Splunk Mission Control
- Assign a default SLA
- Create incident types
- Assign and manage user roles
- Create or manage response templates

Are all the incidents automatically ingested into Mission Control from Enterprise Security?
Yes. To view a list of incidents in Splunk Mission Control, select Incident review. You can view information about incidents using the default time range of the last 24 hours or another time range that you select. Incidents appear in the order they were created or ingested, with the most recent incidents listed first.

If I don't have SOAR, can I still use Mission Control?
Yes, you can, as long as you are an eligible Mission Control user.

What is the difference between ES notables and Mission Control (MC) incidents?
MC supports incident creation from two sources: 1) incidents can come from ES notables, or 2) incidents can be created ad hoc in the MC UI. Incidents are stored in the Key Value (KV) store because much of the incident data is updated frequently (e.g. status, owner, notes, task status). MC incidents also contain data that is not present in an ES notable (e.g. response template data). Finally, data in an MC incident can be updated, much like SOAR artifact data can be modified by a playbook.

How do I build playbooks in Mission Control?
You will be linked into the integrated SOAR UI in order to build playbooks and configure connectors. Most existing SOAR playbooks will work when run via Mission Control. SOAR playbooks will need to use the new Mission Control block in the Visual Playbook Editor to interact with new MC capabilities.

Where can I get resources and help if I have questions about Mission Control?
- Check out the Mission Control Product Web Page
- Learn more about Mission Control on our docs site
- Watch the webinar titled "Unify Your Security Operations with Splunk Mission Control".
In this webinar, Splunk experts share how Splunk Mission Control strengthens your digital resilience by bringing order to your security operations. If your organization has access to OnDemand Services (ODS) credits (What is ODS?), you can take advantage of several security-specific tasks and connect with an ODS Consultant who can help with your Mission Control journey (ODS Catalog - find your product area). If you need expert advice or guidance with your Splunk environment, find out how our team can help at Customer Success.
Hello, I have a search that returns the status "Up" / "Down" for various groups. At the moment both Up and Down are the same colour. How do I return status=Up as green and status=Down as red?

Edit ** Code:

index=test host=ABC source=table.csv sourcetype=csv Group=Snow* | eval Group=if(Group="Snow Day Here we come 12345","Snow",Group) | eval Status=if(Status="Down (Acknowledged)", "Down", Status) | dedup _raw | stats count by Status

This gives me:

Status    Count
Down      12
Up        45

I would like those in a bar chart, one column red and one green.
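One approach (a sketch, not the only option) is to make Up and Down separate chart series, so each can be given its own colour. Splitting by Status with chart does that:

```spl
index=test host=ABC source=table.csv sourcetype=csv Group=Snow*
| eval Group=if(Group="Snow Day Here we come 12345","Snow",Group)
| eval Status=if(Status="Down (Acknowledged)", "Down", Status)
| dedup _raw
| chart count over Group by Status
```

Then the series colours can be pinned in the panel's Simple XML with the charting.fieldColors option (the hex values here are assumed green/red choices, adjust to taste):

```xml
<option name="charting.fieldColors">{"Up": 0x65A637, "Down": 0xD93F3C}</option>
```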
Hi team, I downloaded a file from the Webex app. In CrowdStrike, the file name shows up during validation, but the path and location where it was downloaded do not. Even in our Splunk proxy / CrowdStrike indexes, neither the file name nor any other details show up against my IP address. Is there any way we can write a use case to capture the details of files downloaded from the Webex app? Thank you in advance.
We want to fetch emails from a mailbox and forward them to Splunk. I have TA-mailclient installed on our HF Windows server. I went to Settings > Data inputs > Mail Server to add an email account to monitor with the IMAP protocol, but no emails are being read. GitHub - seunomosowon/TA-mailclient: This technology adapter add-on fetches emails for Splunk to index from mailboxes using either POP3 or IMAP, with or without SSL.
Hey, when running a query, the results found are diminishing over time. Pagination has no influence (tried 10, 50, 100), and it seems this behavior is triggered somewhere in the index. Meaning: if I search all events for today, the count climbs to a certain number of events, stalls for about 10 seconds, continues climbing, and then the result count starts diminishing. When setting the time range to a few hours back, this behavior also happens. Since the search takes quite long to hit this "point of no return", I assume the two time frames overlap and the same events are causing this diminishing of results. Does this suggest there is something wrong with the index? Br, JLT
Is it possible to use timechart, or another command, to display results over a time series, but instead of a single data point at the moment the event occurred, expand the width of the event to cover its full time span? I was going to show this in an enhanced timeline, but I'm displaying events over 60 days, and ones that have a duration of a day or two are hard to see. I used rex and eval to extract date-time fields for the "Start" and "End" of the event.
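One workaround is to expand each event into one row per hour of its duration before timechart, so the span renders as a filled band rather than a point; a sketch, assuming epoch fields start_time and end_time from the rex/eval step (event_name is a placeholder for whatever field identifies each event):

```spl
| eval times=mvrange(start_time, end_time, 3600)
| mvexpand times
| eval _time=times
| timechart span=1h count by event_name
```

mvrange builds a multivalue list of hourly timestamps across the span, and mvexpand turns it into one row each. Note that mvexpand has an output size limit, so very long spans may need a coarser step than 3600 seconds.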
Hi, I currently have an outdated version of DB Connect and need to go through the upgrade process. I have several questions.

1. To perform the upgrade, must I necessarily make a backup of the entire Splunk installation?
2. If I go to https://splunkbase.splunk.com/app/2686 it allows me to download a tgz file; can I load this file through the Splunk graphical interface?
3. If there is an old version already installed, will this affect the existing inputs, connections, and identities?
4. How would you perform, or recommend I perform, this upgrade process?

Thanks