All Topics

I have a Splunk standalone instance running v8.2.10. I recently installed the Splunk Add-on for Microsoft IIS (version 1.2.0) on my Splunk server and have also deployed the app to a Windows server with IIS (and a universal forwarder) installed. However, I seem to be having difficulty getting any logs from this IIS server. If I search the new index (index=windows_iis), it returns no results. Under Settings > Indexes I can see the newly created index, but its event count is 0.

These are the basic steps I have followed so far:
- Created a new index for these logs called "windows_iis" (all other settings default).
- Installed the Splunk Add-on for Microsoft IIS on my Splunk Enterprise instance (combined search head/indexer/deployment server).
- Copied the contents of this add-on to the /opt/splunk/etc/deployment-apps folder.
- Within the deployment app, created the following inputs.conf under the app's local directory:

[monitor://C:\inetpub\logs\LogFiles]
disabled = false
sourcetype = ms:iis:auto
index = windows_iis

- Reloaded the deployment server.
- Created a new server class and pushed this app out to the IIS server.

I have gone through the following troubleshooting steps, looking at c:\program files\splunkuniversalforwarder\var\log\splunk\splunkd.log on the IIS server:
- The UF on the IIS server shows as connected to my indexer.
- The UF logs "adding watch on path: C:\inetpub\logs\LogFiles", so it is monitoring the IIS log files.
- I am also getting some INFO messages: "ChunkedLBProcessor Failed to find EVENT_BREAKER regex in props.conf for sourcetype: ms:iis:auto. Reverting to the default EVENT_BREAKER regex for now". I am not sure how relevant these are; I think my problem might be more fundamental.

If I run the following search on my Splunk Enterprise instance:

index=_internal host="IIS_Server01" component=Metrics group=per_sourcetype_thruput series="ms:iis:auto"

I can see events being sent from the UF on the IIS server (e.g. kbps=0.557, eps=3.3, kb=33, ev=202). I can also see logs in the C:\inetpub\logs\LogFiles\W3SVC1 folder on the IIS server, so there is data there to collect.

Does the modified local/inputs.conf also need to be configured in the app on the Splunk Enterprise server, or is this inputs.conf configuration only needed in the UF deployment app (which is what I have done)? Any thoughts on why these events aren't being ingested by my Splunk Enterprise server would be greatly appreciated. Thanks,
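Regarding the EVENT_BREAKER INFO message above: those settings live in props.conf on the forwarder side. A minimal sketch of a stanza that could be added to the deployment app, assuming standard single-line IIS W3C log events (verify against the add-on's own props.conf before relying on it):

```
# deployment-apps/<app>/local/props.conf on the UF (sketch)
# Assumes one event per line, as in standard IIS W3C logs.
[ms:iis:auto]
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
```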
Is it possible to change a report or dashboard's permissions from the rest api? 
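For reference, knowledge-object permissions are exposed through the REST API's acl endpoint. A hedged sketch using curl; the host, app, object name, and credentials below are placeholders, and a POST to acl requires both owner and sharing to be supplied:

```shell
# Sketch: share a saved search (report) at app level and grant read to all roles.
# Replace host, app ("search"), object name ("MyReport"), and credentials.
curl -k -u admin:changeme \
  https://localhost:8089/servicesNS/nobody/search/saved/searches/MyReport/acl \
  -d sharing=app -d owner=admin -d perms.read='*'
```

Dashboards are reached the same way under data/ui/views instead of saved/searches.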
Hello, I am thinking about the Splunk Add-on for Cisco UCS, but this might apply to add-ons in general. It's easy to visualize installing it onto a heavy forwarder and configuring it to talk to all of my UCS managers. BUT this implies a single system doing the collection, and I need redundancy. My cluster of forwarders would seem to be the answer here, but if I install the add-on onto all three heavy forwarders, will I be ingesting replicated data? Is the add-on, or Splunk itself, smart enough to prevent this? Thank you!

--jason
I am trying to run a query like the one below, but I am limited to 10,000 subsearch results. Is there a way to make this query run with more than 10,000 subsearch results?

search index="sample_index" "Kubernetes.namespace"="ABC" "Two String"
    [ search index="sample_index" "Kubernetes.namespace"="ABC" "Success work done" | fields demo_id ]
| stats count as Result by marksObtained

I saw someone had already asked a similar question here, and I tried implementing it the same way, but it's not working for me. Below is the query I wrote, but the results are not as expected:

index="sample_index" "Kubernetes.namespace"="ABC" ("Two String" OR "Success work done")
| stats count as Result by marksObtained
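One common way around the subsearch limit is to fold both conditions into a single search and then keep only the demo_id values that matched the inner condition. A sketch, with field names taken from the question and untested against the actual data:

```
index="sample_index" "Kubernetes.namespace"="ABC" ("Two String" OR "Success work done")
| eval matched_inner=if(searchmatch("Success work done"), 1, 0)
| eventstats max(matched_inner) as has_inner by demo_id
| where has_inner=1 AND searchmatch("Two String")
| stats count as Result by marksObtained
```

Because there is no subsearch, the 10,000-result ceiling no longer applies; eventstats has its own memory limits, but they are configurable.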
I am trying to distribute a $SPLUNK_HOME/etc/system/local/web.conf file to all the servers in my cluster (search heads, management nodes, search peers). I want to change the name of my cert files from the defaults:

privKeyPath = $SPLUNK_HOME/etc/auth/splunkweb/privkey.pem
serverCert = $SPLUNK_HOME/etc/auth/splunkweb/cert.pem

I am using the following command:

splunk apply shcluster-bundle -target <URI>:<management_port> -auth <username>:<password>

I have searched and found nothing on how to do this, only for apps ($SPLUNK_HOME/etc/apps/).
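For context, `splunk apply shcluster-bundle` only distributes content under $SPLUNK_HOME/etc/shcluster/apps on the deployer; files in etc/system/local are not pushed. A common workaround is to wrap the settings in a small app. A sketch, where the app name and cert filenames are placeholders:

```
# On the deployer:
# $SPLUNK_HOME/etc/shcluster/apps/org_web_certs/local/web.conf
[settings]
privKeyPath = $SPLUNK_HOME/etc/auth/splunkweb/my_privkey.pem
serverCert = $SPLUNK_HOME/etc/auth/splunkweb/my_cert.pem
```

Then run apply shcluster-bundle as above. App-level settings take precedence over the system defaults, so the defaults in etc/system/default do not need to be edited.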
I am a beginner. Why is stats avg(response_time) not working after extracting response_time?

index="testing1" source="web_access_log_project2.txt"
| erex response_time examples="7ms, 0ms, 17ms, 67ms, 77ms, 39ms"
| stats count, avg(response_time)

Below are the sample events:

127.0.0.1 - - [17/Mar/2023:17:59:13.798 -0400] "HEAD /favicon.ico HTTP/1.1" 303 124 "" "Splunk/9.0.4 (Windows Server 10 Professional with Media Center Edition; arch=x64)" - 6414e2b1cc1a8e6558ec8 7ms
127.0.0.1 - - [17/Mar/2023:16:02:45.754 -0400] "HEAD /favicon.ico HTTP/1.1" 303 124 "" "Splunk/9.0.4 (Windows Server 10 Professional with Media Center Edition; arch=x64)" - 6414c765c11e7271cf148 0ms
127.0.0.1 - admin [09/Mar/2023:17:52:41.509 -0500] "GET /en-US/config?autoload=1 HTTP/1.1" 200 1874 "http://127.0.0.1:8000/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36 Edg/110.0.1587.63" - 640a6339821e0d9ba9848 49ms
127.0.0.1 - admin [09/Mar/2023:17:52:41.455 -0500] "GET /en-US/account/logout HTTP/1.1" 404 18942 "" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36 Edg/110.0.1587.63" - 640a6339741e0d987dc08 14ms
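A likely cause: avg() only works on numeric values, and a value like "7ms" is a string, so the average comes back empty. Stripping the unit during extraction should help. A sketch with rex, assuming the duration is always a trailing integer followed by "ms":

```
index="testing1" source="web_access_log_project2.txt"
| rex "\s(?<response_time>\d+)ms$"
| stats count, avg(response_time)
```

Here the capture group grabs only the digits, so response_time is numeric and avg() can operate on it.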
I'm attempting to determine which folders on a Windows server are being audited. I don't have access to the server to view the inputs.conf file and need to discover which folders are being accessed from the audit logs sent to Splunk. The field labeled FilePath shows the entire path to the file. I have not been successful in creating a regex to extract only the top-level parent folder. Because the string value of FilePath contains the full path, I am trying to figure out how to display just the first folder of the entire path.

index=win_servers Computer="Storage"
| table FilePath
| rex field=FilePath "^\\ (?<FilePath>[^\\ ]+)"

The search above produces the results below after passing it to dedup:

H:\Folder1\subfolder1\subfolder_A
H:\Folder1\subfolder1\subfolder_B
H:\Folder1\subfolder2\subfolder_A
H:\Folder2\subfolder1\
H:\Folder2\subfolder2\subfolder_A
H:\Folder2\subfolder3\subfolder_B
H:\Folder3\subfolder1\
H:\Folder3\subfolder2\
H:\Folder4\subfolder1\
H:\Folder4\subfolder2\

The result I am looking for is just the following:

H:\Folder1\
H:\Folder2\
H:\Folder3\
H:\Folder4\
...

I've looked at the following posts and haven't been able to successfully apply what is mentioned to my situation:
https://community.splunk.com/t5/Splunk-Search/rex-regex-to-extract-first-folder-name-from-the-path/m-p/287508
https://community.splunk.com/t5/Splunk-Search/Regex-Source-and-Destination-files-with-path-filename-extension/m-p/271989
https://community.splunk.com/t5/Splunk-Search/Regex-to-match-string-between-2-strings/m-p/626758#M217834

Any help would be appreciated!
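One hedged approach is to capture the drive letter plus the first path segment into a new field. Note that inside a rex double-quoted string a literal backslash usually has to be written as four backslashes (one doubling for the SPL string, one for the regex):

```
index=win_servers Computer="Storage"
| rex field=FilePath "^(?<TopFolder>[A-Za-z]:\\\\[^\\\\]+\\\\)"
| dedup TopFolder
| table TopFolder
```

This assumes every FilePath starts with a drive letter like H:\; capturing into TopFolder rather than overwriting FilePath keeps the original value available for checking.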
I am a beginner. How do I extract the response time in "ms" from these events? Thank you.

4.72.20.141 - - [27/Dec/2037:12:00:00 +0530] "POST /usr HTTP/1.0" 500 4998 "http://www.parker-miller.org/tag/list/list/privacy/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.75.14 (KHTML, like Gecko) Version/7.0.3 Safari/7046A194A" 830
31.60.78.151 - - [27/Dec/2037:12:00:00 +0530] "PUT /usr/admin HTTP/1.0" 303 5071 "-" "Mozilla/5.0 (Linux; Android 10; ONEPLUS A6000) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Mobile Safari/537.36 OPR/61.2.3076.56749" 1361
162.135.142.180 - - [27/Dec/2037:12:00:00 +0530] "DELETE /usr/admin HTTP/1.0" 502 5002 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.141 Safari/537.36" 4608
56.125.112.165 - - [27/Dec/2037:12:00:00 +0530] "GET /usr/admin/developer HTTP/1.0" 303 5006 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36 OPR/73.0.3856.329" 4650
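In these events the response time appears to be the final number on each line (830, 1361, 4608, 4650). A sketch that captures it, assuming the value is always the last whitespace-separated token; whether it is actually milliseconds depends on how the log was generated:

```
| rex "\s(?<response_time_ms>\d+)\s*$"
| table _time, response_time_ms
```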
Hello, I have installed Splunk on a Windows machine and am trying to get data from another Windows machine using remote collectors. I get this error:

"Failed to fetch data: Unable to get wmi classes from host '192.168.1.131'. This host may not be reachable or WMI may be misconfigured."

Can anyone guide me? It's just two Windows machines: one runs Splunk, and the other is the one I need to collect logs from.
Dear Splunk team, I have a question I'd like answered: does Splunk account for daylight saving time in its configuration? Thanks for your help, and good afternoon.
I'm pretty sure the answer to my question is regex, but I'm not too savvy with it. I have a few values in an IP field formatted like the example below. How can I remove the [], commas, and quotes, and get each IP in its own event?

['10.1.1.1', '10.2.2.2']
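A sketch of one approach: use rex in sed mode to strip the brackets, quotes, and spaces, then split on the commas and expand each value into its own event. The field name ip below is a placeholder for whatever the field is actually called:

```
| rex field=ip mode=sed "s/[\[\]' ]//g"
| makemv delim="," ip
| mvexpand ip
```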
Without giving admin, is there a permission I can apply to roles that would allow a user to update the GeoIP files? I allowed all the upload permissions, but it still errors about permissions.
Hello everyone! In this scenario I have one heavy forwarder and one indexer cluster (and of course there is a cluster manager).

The HF has some inputs configured this way (inputs.conf):

[mi_input://List_Deployment_State]
index = endpoint
sourcetype = endpoint
_TCP_ROUTING = ixChabelaGroup

And the outputs are configured this way (outputs.conf):

[tcpout]
defaultGroup = ixChabelaGroup
defaultGroup = default-autolb-group

[tcpout:ixChabelaGroup]
server = 192.189.2.25:9997

As you can see, the _TCP_ROUTING is only sending data to one indexer, and we want to balance the forwarding across the entire cluster.

My question is: what would happen if I enable indexer discovery on the heavy forwarder, as follows?

[tcpout:idxc-forwarders]
indexerDiscovery = cluster1
useACK = true

[indexer_discovery:cluster1]
master_uri = https://192.189.2.26:8089
pass4SymmKey = MyUnhashedPasswd

Would there be a conflict between indexer discovery and the _TCP_ROUTING already declared? Or what is the proper way to configure indexer discovery on my HF? Thanks in advance for your support.
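Since _TCP_ROUTING simply names a tcpout group, one hedged way to combine the two is to point the input at the discovery-backed group and make that group the single default (group and stanza names below follow the question's own config; the duplicate defaultGroup line from the original would need to be removed):

```
# outputs.conf (sketch)
[tcpout]
defaultGroup = idxc-forwarders

[tcpout:idxc-forwarders]
indexerDiscovery = cluster1
useACK = true

[indexer_discovery:cluster1]
master_uri = https://192.189.2.26:8089
pass4SymmKey = MyUnhashedPasswd
```

With `_TCP_ROUTING = idxc-forwarders` in inputs.conf (or no _TCP_ROUTING at all, so the default group applies), the HF should load-balance across every peer the cluster manager advertises rather than the single hard-coded indexer.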
How often do scripted inputs execute? I want to implement some of these for Exchange, but I am concerned that they will continually execute and cause performance impact.

Scripted inputs:

[script://.\bin\exchangepowershell.cmd v14 get-publicfolderstats_2010.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-databasestats_2010.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-folderstats_2010.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-distlists_2010_2013.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-hoststats_2010_2013.ps1]
[script://.\bin\exchangepowershell.cmd v14 read-audit-logs_2010_2013.ps1]
[script://.\bin\exchangepowershell.cmd v14 read-mailbox-audit-logs_2010_2013.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-mailboxstats_2010_2013.ps1]
[script://.\bin\exchangepowershell.cmd v14 get-inboxrules_2010_2013.ps1]

Link to the TA: https://docs.splunk.com/Documentation/AddOns/released/MSExchange/TA-Mailboxinputs
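For background, each scripted-input stanza runs on the schedule set by its `interval` setting (a number of seconds or a cron expression; `-1` means run once at startup), so execution frequency is under your control. A sketch, with the one-hour value purely illustrative; check the TA's default inputs.conf for the intervals it ships with:

```
# inputs.conf (sketch): run this script once per hour
[script://.\bin\exchangepowershell.cmd v14 get-mailboxstats_2010_2013.ps1]
interval = 3600
```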
Hi all! I'm trying to go through a list where each item is the input for a child playbook that returns a JSON object. When I try to run it, though, I receive this in the logs: "Ignoring child playbook <name of playbook> completion message since this playbook is already marked completed with status: success". So then I tried using phantom.playbook() instead, but the results variable doesn't include this JSON object, even though the child playbook has it as an output. What am I doing wrong, or how can I make either option work for looping a child playbook? Thanks for any input!
I have some JSON that looks similar to this:

{
  "foo": "bar",
  "x": {
    "hello": "world",
    "y": {
      "A": 400,
      "B": 500,
      "C": 300
    }
  }
}
{
  "foo": "baz",
  "x": {
    "something": "test",
    "y": {
      "A": 100,
      "D": 200,
      "E": 600
    }
  }
}

What I would like is to extract everything in x.y for a sum, but the keys are dynamic and I won't know them all in advance:

A 500
B 500
C 300
D 200
E 600

I have been stuck on this one for a while. Can anyone help me?
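One hedged approach, assuming a Splunk version recent enough to have the json_* eval functions: pull x.y out as a JSON string, enumerate its keys at search time, expand one row per key, and sum:

```
| spath path=x.y output=y_json
| eval key=json_array_to_mv(json_keys(y_json))
| mvexpand key
| eval value=json_extract(y_json, key)
| stats sum(value) as total by key
```

Because the keys are read from each event's own x.y object, nothing needs to be known in advance.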
Hello. I'm creating a dashboard which will then be used as a monthly report, with some statistics in it. I will use PDF delivery, so I have avoided using any time range picker in it. The point is: without a time range picker, how can I make all my panels run their searches from -30d@d until -1d@d?
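In Simple XML, each search element can carry its own earliest/latest children, so no time range picker is needed. A sketch; the query itself is a placeholder:

```
<dashboard>
  <row>
    <panel>
      <chart>
        <search>
          <query>index=_internal | timechart count</query>
          <earliest>-30d@d</earliest>
          <latest>-1d@d</latest>
        </search>
      </chart>
    </panel>
  </row>
</dashboard>
```

Repeating the earliest/latest pair in every panel's search applies the fixed window dashboard-wide.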
I would like to create an alert to detect when a new user is added to the "Domain Admins" and/or "Enterprise Admins" global groups, and to disable that account automatically.
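For the detection half, Windows Security events 4728 (member added to a security-enabled global group) and 4756 (universal group) are the usual starting point. A sketch; the index, sourcetype, and field names below depend on how the Windows add-on is configured in your environment, and the automated disabling would be an alert action (e.g. a script) rather than part of the search itself:

```
index=wineventlog EventCode IN (4728, 4756)
  (Group_Name="Domain Admins" OR Group_Name="Enterprise Admins")
| table _time, src_user, user, Group_Name
```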
Hi, I want to integrate Nessus Professional data with Splunk. Can you please suggest the best add-on for integrating this data? Any help would be appreciated.
Hello all, I was reaching out to see if anyone has come across issues connecting the Tenable.sc add-on to Splunk. I downloaded the Tenable application and add-on and was able to upload them to Splunk. My issue is getting Tenable to connect to Splunk. When I try with credentials, I receive the following error: "Please enter valid address, username and password, or configure valid proxy settings or verify SSL certificate." When trying with API keys, I receive: "Please enter valid address, access key, secret key, or configure valid proxy settings." I have verified the credentials and API keys multiple times. Can anyone help with this? Thank you.