All Topics


I am attempting to use HEC with basic authentication over HTTPS, but I receive a 403 "Forbidden" response when I set the Authorization header to a Base64-encoded username:password pair. Using username:HEC-token works, as hinted in the documentation, so my question is whether there is any way to use a user's password, or a session key from a login request, to authenticate when posting data to HEC. If not, are there any endpoints that will return a response to an HTTP request? Thanks in advance for any advice you can give.
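As far as I know, HEC authenticates only with its tokens, never with Splunk user passwords or session keys, so the two working forms are the Splunk token header or basic auth with the token as the password. A minimal curl sketch, where the host, port and token value are placeholders (-k only skips certificate verification for testing):

curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 12345678-1234-1234-1234-123456789012" \
  -d '{"event": "hello world", "sourcetype": "manual"}'

curl -k -u "x:12345678-1234-1234-1234-123456789012" \
  https://splunk.example.com:8088/services/collector/event \
  -d '{"event": "hello world", "sourcetype": "manual"}'

If posting with user credentials or a session key is a hard requirement, the REST input endpoints on the management port (for example /services/receivers/simple) accept standard Splunk authentication, though they are a different mechanism from HEC and not intended for high-volume ingestion.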
I am trying to send the following WMI WinEventLog event to the null queue, as it needs to be dropped, but this doesn't seem to be working. Can someone help me with this? I have configured the props and transforms on the heavy forwarder like this:

props.conf
[source::WinEventLog:Microsoft-Windows-WMI-Activity/Operational]
TRANSFORMS-null = wmi-setnull

transforms.conf
[wmi-setnull]
REGEX = ((.|\n)*)EventCode=5857\s+((.|\n)*)ProviderPath\s+=\s+(%systemroot%\\system32\\wbem\\(wmiprov\.dll|ntevt\.dll|wmiprvsd\.dll)|C:\\Windows\\(System32\\wbem\\krnlprov\.dll|CCM\\ccmsdkprovider\.dll)|C:\\Program\sFiles\\(Microsoft\sSQL\sServer\\.*\\Shared\\sqlmgmprovider\.dll|VMware\\VMware Tools\\vmStatsProvider\\win64\\vmStatsProvider\.dll))
DEST_KEY = queue
FORMAT = nullQueue
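A simplified sketch of the transform, assuming the raw event text contains EventCode=5857 followed by one of the listed provider DLLs; the DLL alternation here is illustrative, the rule only affects newly indexed data, and it only fires on the first full (parsing) instance the data passes through, so it is worth confirming with "| stats count by source sourcetype" that the source really matches the props stanza exactly:

# transforms.conf (sketch)
[wmi-setnull]
REGEX = (?s)EventCode=5857.*?ProviderPath\s*=\s*.*?(?:wmiprov\.dll|ntevt\.dll|wmiprvsd\.dll|krnlprov\.dll|ccmsdkprovider\.dll|sqlmgmprovider\.dll|vmStatsProvider\.dll)
DEST_KEY = queue
FORMAT = nullQueue

# props.conf (unchanged apart from confirming the stanza matches the actual source)
[source::WinEventLog:Microsoft-Windows-WMI-Activity/Operational]
TRANSFORMS-null = wmi-setnull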
Hello everyone, my issue is that when I use sendemail in a scheduled search to send results via email in CSV format, the columns in the CSV are not in the same order I tabled them in the search. For example:

<some_search>
| table field1 field2 field3
| outputcsv TestInvioMail_searchOutput.csv
| stats values(recipient) AS emailToHeader
| mvexpand emailToHeader
| map search="|inputcsv TestInvioMail_searchOutput.csv | where recipient=$emailToHeader$ | sendemail sendresults=true sendcsv=true server=<my_email_server_address> from=<sender_server_address> to=$emailToHeader$ subject=\"Some object\" message=\"Some message\""
| append [|inputcsv TestInvioMail_searchOutput.csv]

I also need the map command because I have to send different results to different recipients, since inserting a recipient token in Splunk's mail alert panel doesn't work. Sendemail works fine and every recipient receives the correct results, but the CSV they receive has the fields in a different order from the one specified in the table command (for example, the column order in the CSV is field2 field3 field1). I also tried adding the width_sort_columns=<bool> parameter to the sendemail command (after sendcsv=true), but without success. Do you have any suggestions? Thanks in advance.
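One thing that may be worth trying, sketched below: the column order is easiest to pin down right before sendemail, by re-asserting it inside the map subsearch after inputcsv (field names are the ones from the example; add recipient to the table list if it also needs to appear in the CSV):

| map search="|inputcsv TestInvioMail_searchOutput.csv | where recipient=$emailToHeader$ | table field1 field2 field3 | sendemail sendresults=true sendcsv=true server=<my_email_server_address> from=<sender_server_address> to=$emailToHeader$ subject=\"Some object\" message=\"Some message\""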
Hi, I have a query below with a join condition. The issue is that if I hardcode the name value I get results, but when I remove it I don't see any results, and I also get the error shown in the screenshot. I validated that it is not caused by a spacing issue. Can somebody suggest anything?
Is bucket repair on an indexer cluster any different from on non-clustered indexers? Should splunkd be running on the cluster master? Should the cluster be in maintenance mode? When using network storage, should it be mounted on all of the indexers or only one? Is the fsck command run from the cluster master or from one of the indexers?
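For what it's worth, a hedged sketch of how the repair is usually run: locally on the affected indexer rather than on the cluster master, with that peer taken offline and the cluster in maintenance mode so bucket fixup doesn't fight the repair. The exact fsck options vary by version, so check "splunk fsck --help" first:

# on the cluster master: pause bucket fixup
splunk enable maintenance-mode

# on the affected indexer: stop serving, repair its local buckets, restart
splunk offline
splunk fsck repair --all-buckets-all-indexes
splunk start

# on the cluster master: resume normal operation
splunk disable maintenance-mode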
Hi everyone, can someone please help, or let me know the steps, for sending clear events from Splunk to ServiceNow? For example: an alert finds hung threads for the first 15 minutes and sends an event to ServiceNow for a particular CI, and in the next 15 minutes the alert finds no hung threads.
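A hedged sketch of the "clear" side, assuming the event alert action from the Splunk Add-on for ServiceNow is being used; the index, sourcetype, CI field name and severity value are all placeholders to adapt to your environment:

index=app_logs sourcetype=jvm_threads ci_name="MY_CI" "hung thread" earliest=-15m
| stats count AS hung_thread_count
| where hung_thread_count = 0
| eval ci_name="MY_CI", severity=0, description="Hung thread condition cleared"

Scheduled every 15 minutes and wired to the same ServiceNow event alert action as the original alert, this sends a clearing event for the CI whenever no hung threads are seen in the window; map the severity value to whatever your ServiceNow instance treats as "clear".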
Hi, the Jenkins add-on was installed and everything is working, but there is a broken link when clicking the Splunk button. When I'm on the main dashboard and click the Splunk button, I get the overview results:
/app/splunk_app_jenkins/overview?overview_jenkinsmaster=master
But when I go into the job itself and click the button, I get a 404 error:
/app/splunk_app_jenkins/build_analysis?build_analysis_jenkinsmaster=jenkins&build_analysis_job=job_name
Jenkins Splunk plugin 1.9.7, Splunk App for Jenkins 2.0.4. Where might the problem be? Thanks
We have a central syslog server that Palo Alto logs are pushed to, along with some other devices; each host has its own folder on the syslog server where the data for that particular host is stored. From a Splunk point of view we are a cloud-hosted customer. I have today installed the Palo Alto app for Splunk and am wondering about the best way to achieve the below. As the data is coming in with index=syslog and sourcetype=syslog, the inputs in the app are not working, since it expects particular sourcetypes such as pan_logs. Is it possible to override and redirect the Palo Alto hosts from the syslog stream to the correct index and sourcetype? Is it possible to filter out the
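Since each firewall already writes to its own directory, one hedged approach is to set the index and sourcetype at input time on the forwarder that reads the syslog server's files; the path, index name and sourcetype below are assumptions (pan_logs is used only because it is the one mentioned above, so check the add-on docs for the sourcetype it actually expects). A parse-time rewrite with TRANSFORMS (_MetaData:Index and MetaData:Sourcetype) is also possible, but the inputs.conf route is simpler when the data is already separated by folder:

# inputs.conf on the forwarder monitoring the syslog server's files (sketch)
[monitor:///var/log/remote/paloalto-fw01/*.log]
index = pan_logs
sourcetype = pan_logs
disabled = 0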
Hi there, I'm trying hard to create a new field in Splunk, but I don't know where I'm going wrong. I would like to extract "Log Closed", or just "Log", from the event, but when I do I get all kinds of results other than what I want. I tried both the extract and the require options. With extract I get a mixed variety of results, most of them with no relation to what I'm looking for. With require, when I select all the correct lines, I cannot press the Next button as it is grayed out, and I have no clue what to do next. My question is: what path should I take to get "Log Closed", or just "Log", from the event "2021-11-18 02:19:04.291 - Thread: 1 -> Log Closed" and make a new field? I would like to make a new field because I have both a "Log Started" and a "Log Closed". I even tried to look at the regex, but I understand none of it, except that I know \n is a new line. The regex is: ^[^>\n]*>\s+(?P<LogClosed>\w+\s+\w+) Thank you.
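Assuming the events all look like the sample line, one sketch is to anchor on the "Thread: N ->" part and capture the two words after the arrow; the field name LogState is just a placeholder:

<your base search>
| rex field=_raw "Thread:\s+\d+\s+->\s+(?<LogState>Log\s+\w+)"
| table _time LogState

That yields LogState values of "Log Started" and "Log Closed", which can then be used like any other field, for example with | stats count by LogState.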
Can someone please help me with the query below?
1. Account lockouts (4740), then go back in time one hour to find login failures (4625) for the locked-out user.
2. Login failures (4625), then go back in time two hours to find an account lockout (4740) for the same failed-login user.

Source logs below:

4740 event:
<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-A5BA-3E3B0328C30D}'/><EventID>4740</EventID><Version>0</Version><Level>0</Level><Task>13824</Task><Opcode>0</Opcode><Keywords>0x8020000000000000</Keywords><TimeCreated SystemTime='2021-11-18T12:40:45.252885800Z'/><EventRecordID>774430877</EventRecordID><Correlation/><Execution ProcessID='568' ThreadID='1856'/><Channel>Security</Channel><Computer>TESTDC1.TESTDOMAIN123.net</Computer><Security/></System><EventData><Data Name='TargetUserName'>TESTUSER123</Data><Data Name='TargetDomainName'>HOSTNAME123</Data><Data Name='TargetSid'>S-1-5-21-2467427501-1309223053-903455979-12974</Data><Data Name='SubjectUserSid'>S-1-5-18</Data><Data Name='SubjectUserName'>TESTDC1$</Data><Data Name='SubjectDomainName'>TESTDOMAIN123</Data><Data Name='SubjectLogonId'>0x3e7</Data></EventData></Event>

4625 event:
<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Security-Auditing' Guid='{54849625-5478-4994-A5BA-3E3B0328C30D}'/><EventID>4625</EventID><Version>0</Version><Level>0</Level><Task>12544</Task><Opcode>0</Opcode><Keywords>0x8010000000000000</Keywords><TimeCreated SystemTime='2021-11-18T12:44:43.074155100Z'/><EventRecordID>74779349</EventRecordID><Correlation ActivityID='{6527FA3B-D06B-4A13-A997-3F44717DF05B}'/><Execution ProcessID='716' ThreadID='1712'/><Channel>Security</Channel><Computer>TESTHOST123.TESTDOMAIN123.net</Computer><Security/></System><EventData><Data Name='SubjectUserSid'>NULL SID</Data><Data Name='SubjectUserName'>-</Data><Data Name='SubjectDomainName'>-</Data><Data Name='SubjectLogonId'>0x0</Data><Data Name='TargetUserSid'>NULL SID</Data><Data Name='TargetUserName'>TESTUSER123</Data><Data Name='TargetDomainName'>.</Data><Data Name='Status'>0xc000006d</Data><Data Name='FailureReason'>%%2313</Data><Data Name='SubStatus'>0xc0000064</Data><Data Name='LogonType'>3</Data><Data Name='LogonProcessName'>NtLmSsp </Data><Data Name='AuthenticationPackageName'>NTLM</Data><Data Name='WorkstationName'>TESTHOST123</Data><Data Name='TransmittedServices'>-</Data><Data Name='LmPackageName'>-</Data><Data Name='KeyLength'>0</Data><Data Name='ProcessId'>0x0</Data><Data Name='ProcessName'>-</Data><Data Name='IpAddress'>172.19.19.19</Data><Data Name='IpPort'>53972</Data></EventData></Event>
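A sketch for requirement 1, using a subsearch over the 4740 lockouts to build a per-user one-hour lookback window for the 4625 failures; the index name is an assumption, and requirement 2 is the same pattern with the event codes swapped and a two-hour (7200 second) window:

index=wineventlog EventCode=4625
    [ search index=wineventlog EventCode=4740
      | eval earliest=_time-3600, latest=_time
      | fields TargetUserName, earliest, latest ]
| stats count AS failure_count values(IpAddress) AS source_ips BY TargetUserName

Because the subsearch returns earliest/latest alongside TargetUserName, each lockout contributes its own user-plus-time-window clause to the outer search. Keep in mind the usual subsearch limits (result count and runtime) if the lockout volume is large.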
I have a search string that gives me the count of txns processed by a job:

....| rex field=_raw "Total txns:(?<TxnsCount>.*)#015" | table _time, TxnsCount

...but when I try to keep only txns where the value is greater than 10:

...| rex field=_raw "Total txns:(?<TxnsCount>.*)#015" | table _time, TxnsCount | where TxnsCount > 10

...no data is returned. Any help welcome. Thanks in advance.
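One thing worth ruling out, sketched below: the rex capture is a string, and with .* it can drag along whitespace or other characters before the #015 carriage return, so the where comparison may end up being lexical rather than numeric. Tightening the capture and converting explicitly usually sorts it:

<your base search>
| rex field=_raw "Total txns:\s*(?<TxnsCount>\d+)"
| eval TxnsCount=tonumber(trim(TxnsCount))
| where TxnsCount > 10
| table _time, TxnsCount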
Hi, in our organisation we are in the process of implementing Splunk, and there are some domains which do not have access to the internet. For these domains, can we use a proxy server? It seems that the preferred way is to use a heavy forwarder, but I am curious to know whether a proxy server can be used. What are the advantages/disadvantages of using a proxy server? Thank you
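As far as I know, the only proxying that Splunk-to-Splunk forwarding supports natively is SOCKS5, configured per output group in outputs.conf; a generic HTTP proxy will not carry the S2S protocol, which is part of why an intermediate heavy forwarder is the usual recommendation. A sketch, with host names and the port as placeholders:

# outputs.conf on a forwarder in the no-internet domain (sketch)
[tcpout:primary_indexers]
server = inputs.example.splunkcloud.com:9997
socksServer = proxy.example.com:1080
# socksUsername / socksPassword only if the proxy requires authentication
useACK = true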
Hi everyone, I've been tasked with looking into using ITSI for my company. I'm very new to Splunk, so I appreciate it's a big ask; I'm doing as much of the online training as is humanly possible at the moment. We had a workshop last week on the new observability model, lovely stuff. My question around ITSI revolves around its need to use the following app: https://docs.splunk.com/Documentation/InfraApp/2.2.4/Install/About The blurb there states it's going EOL on 22/08/22. Have I read this all wrong, and ITSI doesn't need the Splunk App for Infrastructure? I'm just conscious that I may be assessing something that is about to go EOL, and that I should concentrate on the new observability stuff (SignalFx) instead. Any help or advice greatly appreciated.
How do I import a wildcard (*) or root certificate into the controller, and which keystore should I use, keystore.jks or cacerts.jks? The controller is currently running over HTTP on port 8090 and I want to secure it; I have a wildcard certificate with me. Can anyone please share the steps, and after importing the wildcard certificate, will the controller be served over HTTPS?
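A hedged sketch of the usual keytool steps, assuming the wildcard certificate and its private key are available as a PKCS12 bundle and the controller uses the default Glassfish domain layout; file names, aliases and the path are placeholders, and the controller's HTTPS listener still has to be enabled separately, so treat this as an outline rather than the official procedure:

cd <controller_home>/appserver/glassfish/domains/domain1/config
cp keystore.jks keystore.jks.backup

# import the wildcard cert + key (PKCS12) into the controller keystore;
# Glassfish serves the certificate stored under its configured alias (s1as by
# default), so -srcalias/-destalias may be needed to rename the imported entry
keytool -importkeystore -srckeystore wildcard.example.com.p12 -srcstoretype PKCS12 -destkeystore keystore.jks

# add the root / intermediate CA to the trust store
keytool -import -trustcacerts -alias corp-root-ca -file root-ca.crt -keystore cacerts.jks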
SEDCMD in my props.conf is not working.

/opt/splunk/etc/system/local/props.conf

[ActiveDirectory]
SEDCMD-mask_ms_pwd = s/(ms-Mcs-AdmPwd\s*=)\s*.*/ms-Mcs-AdmPwd=*******/

I checked it on regex101.com and everything works. I added the line to props.conf and restarted Splunk, but it still masks nothing.
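A few things worth checking, with a sketch: SEDCMD only runs where the data is first parsed (indexer or heavy forwarder, not a universal forwarder), the stanza name has to match the event's sourcetype exactly, and it only affects newly indexed data, never events already on disk. A slightly more defensive form of the rule, since the capture group in the original isn't referenced by the replacement anyway:

# props.conf on the first full (parsing) instance the data passes through (sketch)
[ActiveDirectory]
SEDCMD-mask_ms_pwd = s/ms-Mcs-AdmPwd\s*=\s*[^\r\n]+/ms-Mcs-AdmPwd=*******/g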
I'm trying to put the host into the host field before indexing the CSV file below.

【CSV file】
#ServerName001
#JobName,Start time,End time,Elapsed time,Status
JobName_01,11/05/21 19:08:07,11/05/21 19:08:41,00:00:34,Succeeded
JobName_02,11/05/21 20:49:53,11/05/21 21:19:06,00:29:13,Succeeded
JobName_03,11/05/21 21:53:10,11/05/21 21:53:15,00:00:05,Succeeded

I set TRANSFORMS in props.conf to changeHost and set the contents of changeHost in transforms.conf as follows.

【changeHost】
[changeHost]
SOURCE_KEY = _raw
REGEX = \#(\S+)\s\#:
DEST_KEY = MetaData:Host
FORMAT = host::$1

I want the host field to be set to ServerName001, but it doesn't work. Can anyone give me some advice?
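Two hedged thoughts, with a sketch: an index-time transform sees one event at a time, so only the event that actually contains the "#ServerName001" line can have its host rewritten (data rows indexed as separate events keep their original host), and the regex has to match what is really in that single event, so the trailing "\s\#:" part is worth dropping. If the server name also appears in the monitored file path, host_segment or host_regex in inputs.conf is often the simpler route. The sourcetype name below is a placeholder:

# transforms.conf (sketch; assumes the #ServerName001 header line is indexed as its own event)
[changeHost]
SOURCE_KEY = _raw
REGEX = ^#(\S+)$
DEST_KEY = MetaData:Host
FORMAT = host::$1

# props.conf
[your_csv_sourcetype]
TRANSFORMS-changehost = changeHost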
Hi guys, I have a question regarding mapping connections from the same source IP to different destination IPs. In my query, I have to check whether the same source reaches more than 300 different destination IPs. Usually, if I had to report source IPs which appear more than 300 times, I know I can write:

|stats count src_ip as source by source |where count > 300

But what about this query's requirement? Should it be something like:

|stats count src_ip as source, dest_ip as destination by source |where destination > 300 ?
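Counting distinct destinations rather than raw events is what dc() is for; a sketch using the field names from the question:

<your base search>
| stats dc(dest_ip) AS dest_count values(dest_ip) AS destinations BY src_ip
| where dest_count > 300

The values(dest_ip) part is optional and just keeps the list of destinations visible; drop it if 300+ values per row is too noisy.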
How do I create a chart like the one above for different elements with multiple values?
Hi, I am working with my proxy logs and trying to find a way to get the same URLs visited by multiple clients. To add clarity, my current Splunk query gives me an output similar to this:

src_ip       URL
1.2.3.4      abc.com, jp.com, ms.com
2.3.4.5      abc.com, yahoo.com, jp.com
3.3.5.5      abc.com, hoot.com, japn.com
6.7.8.5      abc.com, yahoo.com, jp.com, ms.com

I am trying to get something like the below, as clearly all clients visited abc.com:

src_ip       URL
1.2.3.4      abc.com
2.3.4.5      abc.com
3.3.5.5      abc.com
6.7.8.5      abc.com

Any help with the SPL would be greatly appreciated. I have gone through a lot of documentation and forums, but it doesn't look like there is a straightforward answer to what I am trying to accomplish.
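One hedged way to do it in SPL, assuming the raw events carry src_ip and URL as fields: group by URL, count distinct clients per URL, and keep only URLs whose client count equals the total number of clients seen (use "> 1" instead if the goal is simply "visited by more than one client"):

<your base proxy search>
| stats dc(src_ip) AS client_count values(src_ip) AS src_ip BY URL
| eventstats dc(src_ip) AS total_clients
| where client_count = total_clients
| mvexpand src_ip
| table src_ip URL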
When I create a JSON dashboard with Dashboard Studio, do I have to put the file in the default and views folders, or do I have to do something else? Thanks
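For what it's worth, a sketch of what a Dashboard Studio dashboard looks like on disk as far as I've seen: it is saved as a view under data/ui/views (the UI writes to the app's local directory; shipping it in default also works), as version="2" XML wrapping the JSON definition rather than a bare .json file. Treat the exact wrapper below as an assumption and compare it against one saved from the UI in your environment:

<!-- $SPLUNK_HOME/etc/apps/<your_app>/local/data/ui/views/my_dashboard.xml (sketch) -->
<dashboard version="2" theme="light">
  <label>My Dashboard</label>
  <definition><![CDATA[
    {"dataSources": {}, "visualizations": {}, "inputs": {}, "layout": {"type": "grid", "structure": []}}
  ]]></definition>
</dashboard>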