All Topics

Hi, I am trying to search for a list of users who have not logged into Azure AD in the past 30 days. Can you please help?
Hi, everyone. I need some help with a detection exclusion setting. I want to exclude detections of files matching the file path below: c:\users\01234567\downloads\1234567890123xx.exe To prevent alerts, I would like to use both the "13-digit number" and "xx.exe" as indicators. So far, I have found it can be excluded only by "xx.exe", e.g. file_path="*xx.exe". However, when I use a regex like file_path="*\d{13}xx.exe", it doesn't work. Could you please let me know how to set both the "13-digit number" and "xx.exe" as indicators for excluding detections?
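A hedged sketch of one approach: if file_path="..." is a plain Splunk search filter, it only supports wildcards, not regex, so the \d{13} part has to move into a regex-aware command such as `regex`. The index and the exact backslash escaping below are assumptions and may need adjusting:

```spl
index=* file_path="*xx.exe"
| regex file_path="\\\\\d{13}xx\.exe$"
```

If the exclusion is configured in the security product itself rather than in SPL, its own filter syntax governs whether regex is allowed at all.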
Hello, I have an output list like this one:

{ "10.10.10.15": { "High": [ { "name": "vu1", "nvt_id": "123", "port": "", "protocol": "" } ], "Medium": [], "Low": [], "Log": [], "False Positive": [] }, "10.10.10.24": { "High": [ { "name": "vul", "nvt_id": "123", "port": "", "protocol": "" } ], "Medium": [], "Low": [], "Log": [], "False Positive": [] } }

I want to get all the IP addresses and extract the fields in each object.
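If this JSON arrives as the raw event, one hedged approach is to pull the top-level keys (the IPs) out with `rex` and expand them (the field name `ip` and the assumption that the event is in _raw are both illustrative):

```spl
| rex max_match=0 "\"(?<ip>\d{1,3}(?:\.\d{1,3}){3})\"\s*:"
| mvexpand ip
| table ip
```

For the nested fields, `| spath` alone flattens the object into paths such as `10.10.10.15.High{}.name`, which can then be renamed or iterated with `foreach`.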
Does Splunk support a push mechanism? How can I push the available application logs to an API endpoint?
When creating a REST API data input in the Add-on Builder and testing the REST API call, I receive the following error:

Traceback (most recent call last):
  File "/Applications/Splunk/etc/apps/TA-test/bin/testtest_1663829580_707.py", line 14, in <module>
    import input_module_testtest_1663829580_707 as input_module
  File "/Applications/Splunk/etc/apps/TA-test/bin/input_module_testtest_1663829580_707.py", line 29, in <module>
    from cloudconnectlib.client import CloudConnectClient
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/client.py", line 8, in <module>
    from .configuration import get_loader_by_version
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/configuration/__init__.py", line 1, in <module>
    from .loader import get_loader_by_version
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/configuration/loader.py", line 15, in <module>
    from ..core.exceptions import ConfigException
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/core/__init__.py", line 1, in <module>
    from .engine import CloudConnectEngine
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/core/engine.py", line 6, in <module>
    from .http import HttpClient
  File "/Applications/Splunk/etc/apps/TA-test/bin/ta_test/aob_py3/cloudconnectlib/core/http.py", line 26, in <module>
    'http_no_tunnel': socks.PROXY_TYPE_HTTP_NO_TUNNEL,
AttributeError: module 'socks' has no attribute 'PROXY_TYPE_HTTP_NO_TUNNEL'

Splunk Version: 8.2.2.1
Add-on Builder Version: 4.1.1
OS: Mac

I found a post about a similar issue, but moving socks.py one directory up did not fix it: https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-TA-New-Relic-Insight-not-ingesting-data/m-p/528756 Creating a new add-on still hit the issue, and downloading Add-on Builder version 3.0.1 still hit the issue.
Hi, I am creating single-value panels, each with a different search query. I want to combine all these values into a table that looks like an Excel table in the Splunk dashboard. My individual query for each single-value wizard looks like below; I want to combine all these queries to form a table of values.

1. index=abcd laas_appID=xyz OSBUILD=Linux3.1 | where OSVendor="Redhat" | stats count by OSBUILD
2. index=abcd laas_appID=xyz OSBUILD=Linux3.2 | where OSVendor="Redhat" | stats count by OSBUILD
3. index=abcd laas_appID=xyz OSBUILD=Linux3.3 | where OSVendor="Redhat" | stats count by OSBUILD
4. index=abcd laas_appID=xyz OSBUILD=Linux3.1 | where OSVendor="Ubuntu" | stats count by OSBUILD, etc.
5. index=abcd laas_appID=xyz OSBUILD=Linux3.1 | where OSVendor="Solaries" | stats count by OSBUILD, etc.

The table should look like the below in the dashboard:

OS Type    Redhat  Ubuntu  Solaris
Linux 3.1  12      84      54
Linux 3.2  13      45      123
Linux 3.3  56      658     678
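The individual searches above can usually be collapsed into one `chart` search that produces exactly this matrix; a sketch assuming the same index and field names (the OSVendor spellings mirror the values used in the original queries):

```spl
index=abcd laas_appID=xyz OSBUILD=Linux3.* OSVendor IN ("Redhat", "Ubuntu", "Solaries")
| chart count over OSBUILD by OSVendor
| rename OSBUILD AS "OS Type"
```

`chart count over X by Y` puts one row per OSBUILD and one column per OSVendor, which is the pivot shape the desired table shows.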
We would like to know how to onboard AIX wtmp logs to Splunk. Can it be done via a Universal Forwarder? If so, can you please point us to the documentation for onboarding AIX logs?
Is there a way to reduce memory usage for the Splunk Forwarder? I have two directories with 57k files each (120 MB each), and after restarting the service I checked Task Manager and memory usage reached 2 GB. I can't reduce the number of files because I've already done that and can't reduce it further. I'm sure the issue is with those directories, because after I delete them from inputs.conf memory reaches only 100 MB.
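If most of those files are historical and no longer receive writes, one hedged inputs.conf sketch is to stop the forwarder from tracking files older than a cutoff (the monitor path is a placeholder, and ignoreOlderThan skips any file whose modification time is older than the window, so only use it if old files never get new data):

```
[monitor:///path/to/big_dir]
ignoreOlderThan = 7d
```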
Hi, I have the following output:

[txn_key] field2 field3 status thread [time1] time2 time3 status2
[IDMS-TJ_TJG022092200005GN00017] 332950 311551 OK 2 [133369] 342 29 OK
[ZVKK_R1000001-235CDC24E191DBCE4906CCD0ND0000001] 498728 488378 OK 1 [133564] 509 9 OK
[PE_CZ_R19.6_2226500012123062] 342295 331477 OK 2 [133365] 353 49 OK
[BAFIROPC_R1.1_186951760] 289068 282128 OK 1 [133392] 295 5 OK
[GALILEO_R19.4_MTA_03FH220922110216] 394234 383672 OK 2 [133537] 405 11 OK
[DBINTERNET_R19.4_HU_RE02209223-06008] 187797 168329 OK 2 [133526] 201 7 OK
[IDMS_1-I0781_944e2c3cafc0487db56f6b8d3a6a6e231] 193581 178804 OK 2 [133576] 206 4 OK
[....]

I need to create a search string that counts the number of occurrences of each prefix in [txn_key]. The output should be similar to:

txn_key              count of txns
IDMS-TJ              1
ZVKK                 543
PE_CZ_R19.6          0
BAFIROPC_R1.1        231
GALILEO_R19.4        12
DBINTERNET_R19.4_HU  212312
[...]

So far I have tried the following logic, but it doesn't produce the desired output:

| stats count(eval(tnx_key=="ZVKK")) as ZVKK, count(eval(tnx_key=="GALAPAC")) as GALAPAC by tnx_key

A bit of help, please?
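One hedged way to derive the prefix before counting, assuming the trailing underscore-separated segment is always the per-transaction ID (multi-part suffixes such as _MTA_... in the sample would need extra handling, so treat the regex as a starting point):

```spl
| rex field=txn_key "^\[?(?<prefix>.+)_[^_\]]+\]?$"
| stats count AS "count of txns" by prefix
```

Grouping by the extracted prefix avoids having to enumerate every prefix in a separate count(eval(...)) clause.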
Hi to all. I'm setting up an integration between Splunk and Splunk ES. I decided to send events via HEC in JSON format. I understand that in order for Splunk ES to accept the events I need to do two things: 1. build an add-on for parsing the data, and 2. load the data into a data model. I'd be happy to have answers to several questions: 1. Do I need to send the events via CEF syslog, or is JSON format good enough? 2. What is the standard event format we should send to Splunk: JSON, syslog, CEF? I'd also appreciate an explanation of the process for building an add-on and how to load the data into a CIM data model. Thanks to all.
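As background for question 1: HEC accepts JSON natively on its /services/collector/event endpoint, so no CEF wrapping is required as long as the add-on's sourcetype parses the payload into CIM-compliant fields. A minimal sketch (the host, token, sourcetype, and event fields below are placeholders):

```
curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"sourcetype": "my:vendor:json", "event": {"action": "allowed", "src": "10.0.0.1", "dest": "10.0.0.2"}}'
```

ES then picks the events up via the CIM data models, which rely on the add-on's field extractions, aliases, and event tags rather than on the wire format.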
The following event is being parsed as a single event; I'm trying to break it into multiple events. Sample data:

Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636
------------------- CommonMessageInput ----------------
Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636
------------------- CommonMessageInput ----------------
Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636

I'm trying to break the event before each "------------------- CommonMessageInput ----------------" line, so the events will be:

Event 1
Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636

Event 2
------------------- CommonMessageInput ----------------
Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636

Event 3
------------------- CommonMessageInput ----------------
Timestamp: 2021.09.21 - 23:10:17.463
Message: c0d8758b-3fxy-44ca-aa65-hf180002d499
Organization Name: bananaII
UserId: systemuser
AppTracking: abcd2400-34ac-50el-3456-4abcd7636
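A hedged props.conf sketch for this (the sourcetype name is a placeholder; the lookahead keeps each CommonMessageInput delimiter at the start of the following event, matching the desired Event 2/Event 3 output, and the timestamp settings assume the Timestamp line format shown above):

```
[my:custom:sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=-+\s+CommonMessageInput)
TIME_PREFIX = Timestamp:\s+
TIME_FORMAT = %Y.%m.%d - %H:%M:%S.%3N
```

LINE_BREAKER discards only the text captured in group 1 (the newlines), so the delimiter line itself survives into the new event.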
Hello, is it possible to create a form where users can update field values in Splunk? Do eval or replace conditions help? Can someone help me with the process? Thanks
I was expecting to find a helper object function for this, but I don't see one here: https://docs.splunk.com/Documentation/AddonBuilder/4.1.1/UserGuide/PythonHelperFunctions

I want to use the DNS name/URL of the search head in my alert action code. How can this be accessed (Splunk Cloud, if it matters)? At least on my local test server I see the following, but it returns an IP address, not a DNS name:

helper.settings["server_uri"]
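Lacking a dedicated helper function, one workaround is to parse the host out of `helper.settings["server_uri"]` and attempt a reverse DNS lookup. This is a sketch, not an official API: the settings key is the one shown above, the rest is an assumption, and it falls back to the raw host when no PTR record exists:

```python
import socket
from urllib.parse import urlsplit

def search_head_name(server_uri):
    """Return a DNS name for Splunk's server_uri (e.g. https://127.0.0.1:8089)."""
    host = urlsplit(server_uri).hostname
    try:
        # reverse lookup only succeeds when a PTR record exists for the host
        return socket.gethostbyaddr(host)[0]
    except (socket.herror, socket.gaierror):
        # fall back to whatever server_uri contained (often an IP address)
        return host
```

Usage inside the alert action would be `search_head_name(helper.settings["server_uri"])`; on Splunk Cloud the reverse lookup may still return an internal name rather than the public stack URL.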
# How to get cookies for simulating or accessing the UI port.
cval=`curl -c - -k http://splunk:8000 -L -o a 2>/dev/null | grep cval | tr -s " " " " | cut -d $'\t' -f 7`
ab=`curl -c - -k http://splunk:8000/en-US/account/login -H "Cookie: cval=$cval" -d "username=MYUSER&password=MYPASSWORD&cval=$cval" -o a 2>/dev/null | egrep "csrf|splunkd_8000" | perl -pe 's/\n/ /g' | perl -pe 's/\t/ /g'`

csrf_token=$(echo $ab | cut -d " " -f 7)
splunkd_8000=$(echo $ab | cut -d " " -f 14)
echo "splunkweb_csrf_token=$csrf_token"
echo "splunkd_8000=$splunkd_8000"

# Once the cookies are ready, fill in the headers for the command:
# headers = {
#   Cookie: splunkd_8000=<splunkd_cookie>;splunkweb_csrf_token_8000=<csrf_token>,
#   Content-type: application/json,
#   X-Requested-With: XMLHttpRequest,
#   X-Splunk-Form-Key: <csrf_token>   <<< the csrf header is needed for POST only
# }
# Example:
curl -c - -k http://splunk:8000/en-US/splunkd/__raw/servicesNS/-/-/saved/searches/ -H "Cookie: cval=372560337;splunkweb_csrf_token_8000=1324774297983139238;splunkd_8000=xuqLdlcjgtNm77umvfv6WZvJnX^WbTGvi2f2XbBMhoHe3nsshq_rGa6_Rknw06XThwCvML2VLuyQhTuhJJsFyx8TRAHi7RC17Up56IkluUmQVCLj9R4uZl9OyNP9Z7qBhIr" -X GET -H "X-Splunk-Form-Key: 1324774297983139238" -H "X-Requested-With: XMLHttpRequest" -H "Content-type: application/json"
I have a dropdown whose value, once input, needs to be used in two different ways in the same search query. One of the indexes requires that value as-is, but the other index requires the value to be altered.

I'm doing something like what I've mentioned below, but it doesn't seem to be working. Any help is appreciated.

The dropdown token value is selected_client, with the initial value being "*".

(index = indexA OR index = indexB) AND (clientForIndexA = domain1 OR clientForIndexB = domain2)
| eval domain1 = if("$selected_client$" == "*", "*", "$selected_client$")
| eval domain2 = if(domain1 == "*", "*", replace (sync_ml_domain, "\." , "-"))
| eval domain2 = if(domain1 == "*", "*", replace (sync_ml_domain, "_" , "-"))
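One thing worth noting in a snippet shaped like the above: a second eval of domain2 that starts again from the source field discards the first replace. A hedged sketch that chains both substitutions in a single eval, assuming domain2 should be the dot- and underscore-normalised form of the selected client (field and token names are taken from the post):

```spl
| eval domain1 = if("$selected_client$" == "*", "*", "$selected_client$")
| eval domain2 = if(domain1 == "*", "*", replace(replace(domain1, "\.", "-"), "_", "-"))
```

Also note that eval-derived fields cannot be used in the initial search clause; filtering on domain1/domain2 has to happen in a later `where` or `search` pipe.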
I am trying to install Splunk on our 32-bit Raspberry Pis. I was originally trying to install 9.0.1 and found out that it only runs on 64-bit; I am currently using a 32-bit V7 Pi 4. So now I am trying to install 7.3.1. I am having trouble finding the best documentation for the installation. Could someone point me in the right direction? Thank you!
Does anyone have a document or steps to guide me through a SIEM migration from QRadar to Splunk?
Hi team! Could someone please explain what each parameter is responsible for in a search tag like this one:

<search>
  <query>system="SWAP_total" host=crm.narsdade.com
| bin _time span=1d
| dedup _time
| eval requestLasts = requestLimit - requestCount
| table requestCount, requestLasts
| rename requestCount AS "Requests done", requestLasts AS "Requests to go"
| transpose
| eval foobar_slice = column + " (" + 'row 1' + ")"
| fields foobar_slice, "row 1"</query>
  <earliest>-24h@h</earliest>
  <latest>now</latest>
  <sampleRatio>1</sampleRatio>
</search>

What do system, host, and the other attributes mean?
Hello, I've tried figuring this out on my own, but I couldn't find any related threads that fixed my problem. I'm trying to install Splunk Enterprise on my Ubuntu 20.04.5 LTS server as root. I've tried both the .deb and .tar versions by following the docs, and I've also tried following the new installation manual video.

After starting Splunk with ./splunk start and accepting the license, I was prompted to rename the default account; I continued with Enter to use the default admin name. Then I changed the default password and waited for the RSA key generation and preliminary checks. After that I am prompted with:

Starting splunk server daemon (splunkd)...
PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security
Done
Waiting for web server at http://127.0.0.1:8000 to be available........splunkd 4932 was not running.
Stopping splunk helpers...
Done.
Stopped helpers.
Removing stale pid file... done.
WARNING: web interface does not seem to be available!

I've also checked the logs but couldn't figure out the problem on my own. Last entries of cat /opt/splunk/var/log/splunk/splunkd.log:

09-21-2022 19:55:57.924 +0200 INFO  PipelineComponent [4932 MainThread] - Pipeline vix disabled in default-mode.conf file
09-21-2022 19:55:57.932 +0200 WARN  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - SSLOptions - server.conf/[sslConfig]/sslVerifyServerCert is false disabling certificate validation; must be set to "true" for increased security
09-21-2022 19:55:57.942 +0200 WARN  Thread [4932 MainThread] - MainThread: about to throw a ThreadException: pthread_create: Resource temporarily unavailable; 55 threads active. Trying to create QueueServiceThread
09-21-2022 19:55:57.944 +0200 WARN  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - SSLCommon - PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security
09-21-2022 19:55:57.945 +0200 ERROR ExecProcessor [5097 ExecProcessor] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk-dashboard-studio/bin/save_image_and_icon_on_install.py" /bin/sh: 1: Cannot fork
09-21-2022 19:55:57.945 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/selfupdate_modular_input.py": Resource temporarily unavailable
09-21-2022 19:55:57.948 +0200 WARN  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - SSLOptions - server.conf/[kvstore]/sslVerifyServerCert is false disabling certificate validation; must be set to "true" for increased security
09-21-2022 19:55:57.949 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/supervisor_modular_input.py": Resource temporarily unavailable
09-21-2022 19:55:57.952 +0200 WARN  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - Thread - MainThread: about to throw a ThreadException: pthread_create: Resource temporarily unavailable; 2 threads active. Trying to create KVStoreServerStatusInstrumentThread
09-21-2022 19:55:57.953 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_assist/bin/uiassets_modular_input.py": Resource temporarily unavailable
09-21-2022 19:55:57.954 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_instrumentation/bin/on_splunk_start.py": Resource temporarily unavailable
09-21-2022 19:55:57.954 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_monitoring_console/bin/dmc_config.py": Resource temporarily unavailable
09-21-2022 19:55:57.954 +0200 INFO  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - terminate called after throwing an instance of '15ThreadException'
09-21-2022 19:55:57.955 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_monitoring_console/bin/mc_auto_config.py": Resource temporarily unavailable
09-21-2022 19:55:57.956 +0200 ERROR ExecProcessor [5097 ExecProcessor] - Couldn't start command "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/splunk_secure_gateway/bin/ssg_enable_modular_input.py": Resource temporarily unavailable
09-21-2022 19:55:57.956 +0200 INFO  IntrospectionGenerator:resource_usage [5097 ExecProcessor] - what():  MainThread: about to throw a ThreadException: pthread_create: Resource temporarily unavailable; 2 threads active. Trying to create KVStoreServerStatusInstrumentThread

ulimit -n -u
open files                      (-n) 1024
max user processes              (-u) 62987

Does anyone know what I am doing wrong? Please help me, I have no splunk> :(
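For context on errors like the above: "pthread_create: Resource temporarily unavailable" and "Cannot fork" mean splunkd is hitting a process/thread or memory limit, and an open-files limit of 1024 is also well below what Splunk Enterprise expects; the limits the daemon actually sees can be lower than the shell's ulimit output (e.g. systemd TasksMax or container limits). A hedged /etc/security/limits.conf fragment, assuming a dedicated splunk user (the values are illustrative, not official recommendations):

```
splunk  soft  nproc   16000
splunk  hard  nproc   16000
splunk  soft  nofile  64000
splunk  hard  nofile  64000
```

A re-login (or reboot) is needed before the new limits apply to the session that starts Splunk.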
Hey Splunkers!! SPL-210107: "Set Up Primary Data Source missing from data configuration side panel for Choropleth USA and Choropleth World. Workaround: Add the data source via source code." Has this issue been fixed in the 9.0.* versions? It is listed in the Splunk 8.2.7 known issues document, but I can't verify whether it is fixed in 9.0.*: I checked the fixed issues document for the 9.0.* versions and there is no sign of it either way. So how can I verify whether this issue is fixed in 9.0.*? And if possible, please describe the issue with a sample SPL query and explain the workaround. ------------------------------------ RestinLinux