All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I have recently run into an issue with multiple "WARN HttpListener [HttpDedicatedIoThread:0] Socket error from [Search Head IP] while accessing /services/streams/search: broken pipe" messages for each indexer in the indexer cluster. There is a SH cluster and a standalone SH; the standalone houses an app that does heavy backend searching, and the majority of the errors come from the standalone. Looking at "I/O Operations per second" and "Storage I/O Saturation (cold/hot/opt)" in the Monitoring Console, all instances are below 1%. I am not sure which settings to adjust to fix this error. Would adjusting maxSockets and/or maxThreads in server.conf help? Currently they are both set to default. Or should I be looking at values in limits.conf? This is happening on version 9.0.1. Any suggestions to help solve this would be much appreciated. Thanks!
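For reference, the two settings the question mentions live in the [httpServer] stanza of server.conf on each indexer. A minimal sketch with placeholder values (defaults vary by version, so treat these numbers as illustrative, not a tuning recommendation):

```
# server.conf on each indexer -- placeholder values for illustration only
[httpServer]
maxSockets = 6000
maxThreads = 1200
```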
Is it possible to set Splunk's timezone for each user based on metadata in the SSO profile they use to log in to Splunk? I'm pretty sure I can automate this information on the SSO side in a way that I can't using role-based configurations in Splunk (why can't roles have default timezones?); the only issue is getting Splunk to accept the timezone value. Using Splunk Cloud, currently on 8.2.something. Thanks!
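There is no role-level timezone, but each user's timezone is an ordinary user-prefs setting, so on-prem an SSO provisioning script could in principle write it per user (in Splunk Cloud you would need support or a REST equivalent). A sketch of the hypothetical per-user file such a script would manage:

```
# etc/users/<username>/user-prefs/local/user-prefs.conf
# hypothetical provisioning target written by an SSO sync script
[general]
tz = America/New_York
```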
I have a lookup table that I want to use in a search, so I load the lookup table and use format. However, I noticed that format's result is limited to 50,000 events. What config allows me to increase that, or to convert a column of values into a single string?

| inputlookup test.csv
| table count
| format

The lookup has 50,001 values in it, but format only goes up to 50,000.
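The 50,000 cap on format comes from limits.conf; a sketch of raising it (stanza and key as documented for the format command, value illustrative):

```
# limits.conf on the search head
[format]
maxresults = 100000
```

Alternatively, collapsing the column into one string avoids format entirely, e.g. | inputlookup test.csv | stats values(count) as count | eval count=mvjoin(count, " OR ") (subject to its own multivalue limits).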
Hi all, I am trying to configure a REST API (OAuth) pull into a Splunk Cloud trial environment. I'm running into issues and not seeing a clear way to pull this data in; I've tried using the HTTP Event Collector with no luck. I want to consume and search the events that are pulled. Could this be a limitation of my cloud trial? Anything I'm missing? I did find an app called "REST API Modular Input" that seems to work with Splunk Enterprise but not Splunk Cloud. Is there a free equivalent for Splunk Cloud on Splunkbase? Thanks, Michelle
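For the "push into Splunk Cloud" half, HEC is usually workable from a small external poller. A minimal sketch in Python of building the HEC request; the hostname and token below are placeholders (Splunk Cloud stacks use an http-inputs-&lt;stack&gt; style hostname), and the actual send is left commented out:

```python
import json
import urllib.request

def build_hec_request(hec_url, token, event, sourcetype="_json"):
    """Build (but do not send) an HTTP Event Collector request."""
    payload = json.dumps({"event": event, "sourcetype": sourcetype}).encode("utf-8")
    return urllib.request.Request(
        hec_url,
        data=payload,
        headers={
            "Authorization": "Splunk " + token,  # HEC uses the 'Splunk <token>' scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholder endpoint and token -- substitute your trial stack's HEC values.
req = build_hec_request(
    "https://example.splunkcloud.com:8088/services/collector/event",
    "00000000-0000-0000-0000-000000000000",
    {"message": "hello from the REST poller"},
)
# urllib.request.urlopen(req)  # uncomment to actually send
```

Once events land via HEC, they are searchable like any other index, which sidesteps the modular-input-app question entirely.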
Hi Spelunker, I want to create a field ("Credentialed checks") that captures that setting's value from events like the following. Please help. Regards,

Nessus version : 8.10.0
Nessus build : 20232
Plugin feed version : 202210171349
Scanner edition used : Nessus
Scanner OS : LINUX
Scanner distribution : es6-x86-64
Scan type : Normal
Scan policy used : eb1cd575-c2d4-5be5-8010-1290128ec92e-23586099/01. PCI-INTERNAL-VA-SCAN
Scanner IP : 10.6.6.51
Port scanner(s) : nessus_syn_scanner
Port range : sc-default
Ping RTT : Unavailable
Thorough tests : no
Experimental tests : no
Plugin debugging enabled : no
Paranoia level : 1
Report verbosity : 1
Safe checks : yes
Optimize the test : yes
Credentialed checks : no
Patch management checks : None
Display superseded patches : no (supersedence plugin launched)
CGI scanning : disabled
Web application tests : disabled
Max hosts : 30
Max checks : 4
Recv timeout : 5
Backports : Detected
Allow post-scan editing : Yes
Scan Start Date : 2022/10/18 19:50 +07
Scan duration : 561 sec"
........................................................................................................
Nessus version : 8.10.0
Nessus build : 20232
Plugin feed version : 202210171349
Scanner edition used : Nessus
Scanner OS : LINUX
Scanner distribution : es6-x86-64
Scan type : Normal
Scan policy used : eb1cd575-c2d4-5be5-8010-1290128ec92e-23586099/01. PCI-INTERNAL-VA-SCAN
Scanner IP : 10.6.6.51
Port scanner(s) : netstat
Port range : sc-default
Ping RTT : Unavailable
Thorough tests : no
Experimental tests : no
Plugin debugging enabled : no
Paranoia level : 1
Report verbosity : 1
Safe checks : yes
Optimize the test : yes
Credentialed checks : yes, as 'isdscan' via ssh
Attempt Least Privilege : no
Patch management checks : None
Display superseded patches : no (supersedence plugin launched)
CGI scanning : disabled
Web application tests : disabled
Max hosts : 30
Max checks : 4
Recv timeout : 5
Backports : Detected
Allow post-scan editing : Yes
Scan Start Date : 2022/10/18 19:51 +07
Scan duration : 2483 sec"
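A field like this can usually be pulled out at search time with rex, since the value runs to end of line (covering both "no" and "yes, as ... via ssh"). A sketch, assuming the raw text keeps the "Credentialed checks :" label:

```
... | rex "Credentialed checks\s*:\s*(?<credentialed_checks>[^\r\n]+)"
| table credentialed_checks
```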
Hello, I find myself needing a jump start on Splunk API calls with tokens. Does anyone have more information than the docs? https://docs.splunk.com/Documentation/Splunk/8.2.6/Security/UseAuthTokens Any techniques to make testing easier? I need to confirm that a given token is valid, that I have the correct port number, and that some basic functionality works with the API calls. Anything would be greatly appreciated. Has this been a topic at a past Splunk .conf?
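One low-friction smoke test is a GET against the management port with the token as a Bearer header; /services/authentication/current-context echoes back who the token authenticates as. A sketch that only builds the request (host and token are placeholders):

```python
import urllib.request

def build_token_check(host, token, port=8089):
    """Request (not sent here) that answers: is this token valid on this port?"""
    url = f"https://{host}:{port}/services/authentication/current-context?output_mode=json"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

# Placeholders -- substitute your search head and a real JWT from Settings > Tokens.
req = build_token_check("splunk.example.com", "YOUR-TOKEN-HERE")
# A 200 response naming your user means token, port, and API path all work:
# urllib.request.urlopen(req)
```

The same header works with curl for quick checks, and a 401 versus a connection timeout cleanly separates "bad token" from "wrong port/firewall".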
I'm setting up Splunk Cloud for my organization. Lately I've been getting errors when I try to establish a connection between my services and Splunk Cloud, both from on-premises systems and cloud environments. I've been getting timeout errors as shown above when Cloudflare tests the connection, and I also get timeout messages when other applications (ADAudit) send data to Splunk. Has anyone run into issues like this, or could you give me some pointers? My organization has reached out to Splunk about establishing a service contract (we thought it was included in our order) but has not heard anything back from our sales rep yet.
Hi, Any thoughts appreciated. I have some connection data captured at connection termination; it has connection start and end times, CONSTA and CONEND, in the format "2022-10-18 15:40:00.000000". What I'd like to do is timechart, in say 5-minute intervals, the number of connections that were active in each interval: all connections in the interval 15:40 - 15:45 that had started but not terminated, repeated across the timechart for 15:45 - 15:50, etc. Hopefully that makes sense. Thanks in advance, Steve
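One common shape for this is to set _time to the connection start and let concurrency count overlaps. A sketch, assuming CONSTA/CONEND parse with the format shown (field names as in the question; the timechart aggregation is an approximation at 5m granularity, since concurrency is computed at event start times):

```
| eval start=strptime(CONSTA, "%Y-%m-%d %H:%M:%S.%6N")
| eval end=strptime(CONEND, "%Y-%m-%d %H:%M:%S.%6N")
| eval _time=start, duration=end-start
| concurrency duration=duration
| timechart span=5m max(concurrency) as active_connections
```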
Hello, Assuming I have numbers, say 1-2-3-4-5-6, and each of those represents:

IP address    number of requests    method
1.1.1.2       1                     get
1.1.1.3       1                     get
1.1.1.4       2                     get
1.1.1.5       4                     get
1.1.1.6       4                     get
1.1.1.7       5                     get
1.1.1.8       7                     get

What could the search be to get the following table?

number of requests    number of IPs that made 'x' requests
1                     2   (meaning two clients made 1 request each)
2                     1
3                     0
4                     2
5                     1
6                     0
7                     1
8                     0
9                     0

Thanks
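A two-pass stats is the usual pattern here, with makecontinuous filling in the zero rows. A sketch, assuming one event per request with an ip field:

```
| stats count as num_requests by ip
| stats count as num_ips by num_requests
| makecontinuous num_requests
| fillnull value=0 num_ips
```

The first stats yields requests per IP; the second inverts it into the distribution; makecontinuous inserts the missing request counts (3, 6, 8, ...) so fillnull can zero them.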
Hi Guys, Does anybody here know how to remove a user's email from any Splunk alert and add a new user's email in their place? I used this search to find any Splunk alerts related to the person I want to remove, but I'm getting 0 events:

| `a_searches`
| fields report_name email_recipients cc_email_recipients
| search email_recipients="* A@gmail.com*"

Any help will be appreciated!
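If the macro returns nothing, querying the saved-search objects over REST is a more direct inventory. A sketch using the documented action.email.* keys (with A@gmail.com as the placeholder address, as in the question):

```
| rest /servicesNS/-/-/saved/searches splunk_server=local
| search action.email.to="*A@gmail.com*" OR action.email.cc="*A@gmail.com*"
| table title eai:acl.app action.email.to action.email.cc
```

Once the affected alerts are listed, the recipient can be edited per alert in the UI or via the same REST endpoint.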
I have a search:

index="xyz" sourcetype="csv"
| fillnull value="unknownMan" field1 field2 field3 field4
| eventstats dc(field1) as xyz by field2 field3 field4
| table field1 field2 field3 field4

While running this, I'm getting NULL values in the results. Please help me understand why NULL values appear when there are no NULL values in the events.
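One variant worth trying, on the assumption that the empty cells come from fields that are missing in some events rather than truly null: run fillnull with no field list so every empty field in the result set is filled, and include the computed field in the table to check it:

```
index="xyz" sourcetype="csv"
| fillnull value="unknownMan"
| eventstats dc(field1) as xyz by field2 field3 field4
| table field1 field2 field3 field4 xyz
```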
How do I set an alert to run hourly every day? For example: if new transactions/events occur, alert the user.
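In conf terms this is a scheduled saved search with an hourly cron and a "number of events > 0" trigger. A sketch (stanza name, the search itself, and the recipient are placeholders):

```
# savedsearches.conf -- illustrative alert stanza
[New transaction alert]
search = index=myindex sourcetype=transactions
enableSched = 1
cron_schedule = 0 * * * *
dispatch.earliest_time = -1h
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = user@example.com
```

The same settings map one-to-one onto the Save As > Alert dialog in Splunk Web (cron schedule, trigger condition, email action).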
Hi Everyone, We need to collect PAM server logs without installing any third-party app on the PAM server. Is it possible to do the monitoring without installing a third-party app? Regards, Jack
Please let me know the correlation search query and time-range conditions for these two use cases. I have Windows PowerShell logs onboarded.

1. Suspicious Windows shell launched by web applications
2. Suspicious Windows shell launched by a trusted process
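As a starting point for use case 1, a sketch over process-creation events (EventCode 4688); the index and field names depend on which add-on parses your logs, so treat them as assumptions to adapt:

```
index=wineventlog EventCode=4688
| search ParentProcessName IN ("*\\w3wp.exe", "*\\httpd.exe", "*\\tomcat*.exe", "*\\nginx.exe")
| search NewProcessName IN ("*\\cmd.exe", "*\\powershell.exe", "*\\pwsh.exe")
| stats count min(_time) as firstTime max(_time) as lastTime by Computer ParentProcessName NewProcessName
```

Scheduled over a short window (e.g. every 15 minutes over the last 15 minutes) with a trigger of count > 0, this gives the shape of the correlation search; use case 2 swaps the parent-process list for your trusted binaries.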
I have a flat file in JSON format where events have no date/time, as follows:

{"device": "info.gw.xyz.com", "ip": "x.x.x.x", "age": "0", "mac": "Incomplete", "interface": " "},
{"device": "info.gw.xyz.com", "ip": "x.x.x.x", "age": "-", "mac": "0000.0000.0000", "interface": "Vlan673"}

My props.conf file is as follows:

[my_arp]
INDEXED_EXTRACTIONS = JSON
TZ = UTC

The problem is that when I search events, they are four hours in the future. The files are on a server that has the UF and that has the correct time set, so looking through the Splunk docs (https://docs.splunk.com/Documentation/Splunk/9.0.1/Data/HowSplunkextractstimestamps) I see this: "If no events in the source have a date, Splunk software tries to find a date in the source name or file name. The events must have a time, even if they don't have a date." The files do have a date and time. How do I fix this? Thx
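Since the events themselves carry no timestamp, one option is to stop Splunk guessing and stamp each event with the time it is read. Note that with INDEXED_EXTRACTIONS the parsing happens on the universal forwarder, so this props.conf must be deployed to the forwarder, not the indexer. A sketch:

```
# props.conf on the universal forwarder
[my_arp]
INDEXED_EXTRACTIONS = JSON
DATETIME_CONFIG = CURRENT
TZ = UTC
```

DATETIME_CONFIG = CURRENT uses the forwarder's clock at read time, which also removes the timezone guesswork that can produce the four-hour offset.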
Hi, I've got the following search that I would like to amend as follows: 1. swipe_in and swipe_out times should show on the same row for each "transaction" (an in and an out being considered one transaction). 2. Only show the duration from swipe_in to swipe_out, not from swipe_out to the next swipe_in. Essentially my table should display: swipe_in times, swipe_out times, and duration. Thank you in advance. Search details:

| eval location_desc=if(match(location_desc,"OUT"), "swipe_out", "swipe_in")
| sort _time
| streamstats window=2 current=f first(_time) as previous_swipe
| eval duration=round((_time-previous_swipe)/3600, 2)
| table location_desc, _time, duration
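A sketch of one way to pair the rows, assuming swipes strictly alternate in/out (if multiple people are interleaved you would add a by clause on some card/user id field, which the question does not name):

```
| eval location_desc=if(match(location_desc, "OUT"), "swipe_out", "swipe_in")
| sort 0 _time
| streamstats window=2 current=t first(_time) as swipe_in_time last(_time) as swipe_out_time
| where location_desc="swipe_out"
| eval duration=round((swipe_out_time - swipe_in_time)/3600, 2)
| table swipe_in_time swipe_out_time duration
```

Keeping only the swipe_out rows emits one row per in/out pair, so the out-to-next-in gaps never appear as durations.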
Hello there, Here is the context: I have a Splunk test environment with one indexer, one search head, and one forwarder. I'm in charge of finding a way to guarantee the integrity of the events available on the search head. My first question is: how do I test data integrity control? I implemented it based on the Splunk documentation; I tried running splunk clean and using the delete command (now I know that delete does not remove the event from the index), and I edited the log files, but the integrity check is always successful. In other words, in what case does the integrity check become unsuccessful? My second question: I changed the auth.log file. I know this could be super dangerous, but Splunk just displays both events, before the edit and after the edit. How can I use Splunk to detect such changes? Any help would be appreciated. Thank you so much for your time.
Hi experts, I'm trying to deploy a HF and forward (clone) logs to 2 different indexers. I have 2 UFs feeding Windows and syslog logs respectively to a HF. This is my HF output conf; I think there is something wrong here, as I can only see logs at my indexer1:

[tcpout]
defaultGroup=windows,syslog

[tcpout:windows,syslog]
server=indexer1 ip:9997

[tcpout:windows,syslog]
server=indexer2 ip:9997

Appreciate any help.
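The likely culprit is that both stanzas carry the same name, [tcpout:windows,syslog], so the second definition overwrites the first and only one server is ever used. Cloning to both indexers needs two distinctly named target groups, both listed in defaultGroup. A sketch (group names arbitrary, IPs as placeholders):

```
# outputs.conf on the heavy forwarder -- each target group needs its own stanza name
[tcpout]
defaultGroup = group_indexer1, group_indexer2

[tcpout:group_indexer1]
server = <indexer1-ip>:9997

[tcpout:group_indexer2]
server = <indexer2-ip>:9997
```

Listing both groups in defaultGroup is what makes Splunk clone every event to both destinations rather than load-balance between them.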
hi experts trying to deploy a HF and forward logs to 2 different indexers. clone data i have 2 UFs feeding windows and syslog logs respectively to a HF. This is my HF output conf, i think there some thing wrong here as i can only see logs at my indexer1 [tcpout] defaultGroup=windows,syslog [tcpout:windows,syslog] server=indexer1 ip:9997 [tcpout:windows,syslog] server=indexer2 ip:9997 appreciate any help.    
How do I move Splunk Cloud archives to Azure Blob Storage? Our contract with Splunk Cloud is being terminated and we want to move to Sentinel as part of this. Any suggestions?
I'm trying to install Splunk SOAR on an EC2 Linux machine (8 vCPU and 16 GB RAM). I used this link: https://docs.splunk.com/Documentation/SOARonprem/5.3.5/Install/InstallRPM. On running sudo ./soar-install I'm getting errors. Trying this setup for test purposes only; the storage I have added is less than 500 GB.

Traceback (most recent call last):
  File "/opt/phantom/5.3.4/splunk-soar/./soar-install", line 85, in main
    deployment.run()
  File "/opt/phantom/5.3.4/splunk-soar/install/deployments/deployment.py", line 130, in run
    self.run_pre_deploy()
  File "/opt/phantom/5.3.4/splunk-soar/usr/python39/lib/python3.9/contextlib.py", line 79, in inner
    return func(*args, **kwds)
  File "/opt/phantom/5.3.4/splunk-soar/install/deployments/deployment.py", line 163, in run_pre_deploy
    raise InstallError(
install.install_common.InstallError: pre-deploy checks failed. Warnings can be ignored with --ignore-warnings
install failed.

Has anyone faced issues similar to this?
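The last line of the traceback itself suggests the failed pre-deploy check is warning-level (plausibly the under-500 GB storage mentioned), and that the installer offers a bypass intended for cases like a test setup:

```
sudo ./soar-install --ignore-warnings
```

This only helps if the failures really are warnings; hard failures will still stop the install.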