All Topics

I'm creating a custom application in SOAR, and one of the fields this custom application provides is a password. For obvious reasons, I don't want to store the password in the container; I'd rather just put it in a parameter that I can use during playbook execution only. Is there any way I can do that?

Version: Splunk SOAR 5.2.1.78411

What I'm doing today in my custom app is:

if secret_value:
    self.save_progress("Secret value retrieved successfully")
    action_result.add_data({"succeeded": True, "secret_value": secret_value})
    return action_result.set_status(phantom.APP_SUCCESS, "Successfully retrieved secret value")

but the secret value is saved in the container.
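A hedged sketch of one angle on this, since anything passed to action_result.add_data() is persisted with the container: if the secret can instead travel as an action parameter, SOAR app metadata allows a parameter to be declared with data_type "password", which keeps it masked in the UI. The action and parameter names below are hypothetical, and the fragment shows only the relevant keys of the app JSON:

{
    "action": "use secret",
    "parameters": {
        "secret_value": {
            "description": "Secret retrieved from the external vault",
            "data_type": "password",
            "required": true
        }
    }
}

That said, if the value must flow between actions through the container's data, it will be stored there; the app can only avoid that by not calling add_data() with the secret.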
Hello everybody, I'm trying to join two different sourcetypes from the same index that both have a field with the same value but a different name, i.e. sourcetype 1 has field name=x with value=z, and sourcetype 2 has field name=y with value=z. I've tried these two queries but had no success joining them:

index=rapid7 sourcetype="rapid7:insightvm:asset:vulnerability_finding" finding_status=new
| eval date=strftime(now(),"%m-%d")
| eval date_first=substr(first_found,6,5)
| where date=date_first
| join type=outer left=L right=R where L.vulnerability_id=R.id
    [ search index=rapid7 sourcetype="rapid7:insightvm:vulnerability_definition" ]

index=rapid7 sourcetype="rapid7:insightvm:asset:vulnerability_finding" OR sourcetype="rapid7:insightvm:vulnerability_definition"
| eval id=vulnerability_id
| transaction id

As you can see, I didn't even try the transaction one, because I haven't fully understood how it works. The main issue is that I want to work with all values so I can build a table or a stats command that displays the most recent vulnerabilities found by the InsightVM dataset; however, I only get the values from the left search. Whenever I add a stats or a table command to the query using the join command, I get empty values in my table, i.e.:

| table L.asset_hostname, R.title, R.description, L.solution_fix

I have already manually tested whether the values of the different fields are the same, and they are. I'd appreciate it if someone would be kind enough to shed some light on this and help me understand what I'm doing wrong. Thanks in advance.
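A hedged alternative that avoids join entirely: search both sourcetypes at once, normalize the two ID fields into one, and let stats stitch the rows together. A sketch using the field names from the post; the finding_status filter and the date logic would still need to be folded back in:

index=rapid7 sourcetype="rapid7:insightvm:asset:vulnerability_finding" OR sourcetype="rapid7:insightvm:vulnerability_definition"
| eval vuln_key=coalesce(vulnerability_id, id)
| stats values(asset_hostname) as asset_hostname, values(title) as title, values(description) as description, values(solution_fix) as solution_fix by vuln_key

This sidesteps join's subsearch limits and the L./R. field-prefix handling, which is a common cause of empty columns in the final table.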
Hello all, I'm trying to install the Palo Alto Add-on to integrate Cortex XDR with Splunk. I followed the steps in https://splunk.paloaltonetworks.com/cortex-xdr.html and configured Tenant Name, API Key ID, and API Key, but when it tries to retrieve events this error is logged:

File "/opt/splunk/etc/apps/Splunk_TA_paloalto/bin/splunk_ta_paloalto/aob_py3/requests/adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='api-https', port=443): Max retries exceeded with url: //masked_tenant_name.xdr.masked_tenant_region.paloaltonetworks.com/.xdr.masked_tenant_region.paloaltonetworks.com/public_api/v1/incidents/get_incidents/ (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7f1afcb645d0>: Failed to establish a new connection: [Errno -2] Name or service not known'))

As you can see, after the message "Max retries exceeded with url:" the URL doesn't contain "https:", though that by itself may not be the problem. The configuration is this:

Name = DEV_XDR
Interval = 60
Index = default
Status = false
Tenant Name = https://masked_tenant_name.xdr.masked_tenant_region.paloaltonetworks.com/
Tenant Region = masked_tenant_region
API Key ID = ********
API Key = ********

I tried curl from the server running the add-on to the tenant URL, and the URL can be reached. Before opening a case with Palo Alto: has anyone had this problem, or something similar, before?
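One hedged observation: the traceback shows HTTPSConnectionPool(host='api-https', ...), which suggests the scheme and slashes from the Tenant Name field leaked into the hostname the add-on builds (the doubled tenant suffix in the request URL points the same way). If the field expects a bare tenant name rather than a full URL, the change would be just:

Tenant Name = masked_tenant_name

with the rest of the input unchanged.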
What capability gives users permission to only enable/disable a custom alert (without editing it)? I'm looking for a way to give a user permission just to enable or disable custom alerts, without the ability to edit or schedule them.
Hi everyone,

I'm currently having difficulty installing a UF on one of our Microsoft Server 2019 machines, which runs as a VM on Hyper-V. Please note that this is a fresh installation of the universal forwarder on this machine. Also, this server is acting as a domain controller, and we would like to get its logs.

Kindly show me the way; I have been searching for hours and could not find a proper answer. I would also like to avoid reformatting this specific machine just to install the UF. Thank you.

This is what the logs show:

12:23:30 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultInstall 128 C:\Program Files\SplunkUniversalForwarder\bin\splunkdrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
12:23:34 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultInstall 128 C:\Program Files\SplunkUniversalForwarder\bin\splknetdrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
12:23:37 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultInstall 128 C:\Program Files\SplunkUniversalForwarder\bin\SplunkMonitorNoHandleDrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
12:23:40 AM C:\Windows\system32\cmd.exe /c ""C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" _internal first-time-run --answer-yes --no-prompt >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
This appears to be your first time running this version of Splunk.
12:23:40 AM C:\Windows\system32\cmd.exe /c ""C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" _internal pre-flight-checks --answer-yes --no-prompt >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
The certificate generation script did not generate the expected certificate file: C:\Program Files\SplunkUniversalForwarder\etc\auth\server.pem. Splunkd port communication will not work.
SSL certificate generation failed.
Creating: C:\Program Files\SplunkUniversalForwarder\var\lib\splunk
Creating: C:\Program Files\SplunkUniversalForwarder\var\run\splunk
Creating: C:\Program Files\SplunkUniversalForwarder\var\run\splunk\appserver\i18n
Creating: C:\Program Files\SplunkUniversalForwarder\var\run\splunk\appserver\modules\static\css
Creating: C:\Program Files\SplunkUniversalForwarder\var\run\splunk\upload
Creating: C:\Program Files\SplunkUniversalForwarder\var\run\splunk\search_telemetry
Creating: C:\Program Files\SplunkUniversalForwarder\var\spool\splunk
Creating: C:\Program Files\SplunkUniversalForwarder\var\spool\dirmoncache
Creating: C:\Program Files\SplunkUniversalForwarder\var\lib\splunk\authDb
Creating: C:\Program Files\SplunkUniversalForwarder\var\lib\splunk\hashDb
12:23:45 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultUninstall 128 C:\Program Files\SplunkUniversalForwarder\bin\SplunkMonitorNoHandleDrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
12:23:47 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultUninstall 128 C:\Program Files\SplunkUniversalForwarder\bin\splknetdrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
12:23:49 AM C:\Windows\system32\cmd.exe /c "C:\Windows\system32\rundll32.exe setupapi,InstallHinfSection DefaultUninstall 128 C:\Program Files\SplunkUniversalForwarder\bin\splunkdrv.inf >> "C:\Users\ADMINI~1\AppData\Local\Temp\splunk.log" 2>&1"
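The decisive failure in that log appears to be the SSL certificate generation step ("The certificate generation script did not generate the expected certificate file ... SSL certificate generation failed."), after which the installer rolls back the driver installs. A hedged first step is to re-run the installer from an elevated prompt with verbose MSI logging to capture why that pre-flight step fails; the log path below is illustrative:

msiexec.exe /i splunkforwarder-<version>-x64-release.msi AGREETOLICENSE=Yes /quiet /L*v "C:\Temp\uf_install.log"

It may also be worth checking whether antivirus or domain-controller hardening policies block splunk.exe from writing under C:\Program Files\SplunkUniversalForwarder\etc\auth, since that is where server.pem should have been created.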
What capabilities are required for users to see Splunk.image visualizations on Studio Dashboards? Currently the only capability I can find that allows the role to see the visualization is admin_all_objects, but I want to avoid using this if possible. Thanks, Daniel
(This may be vague because of what it's on, so sorry.) I have, let's say, two servers: the "Splunk" server and the "Target" server. There are certain logs on the Target server that I'm trying to get to report to Splunk, and for some reason they just aren't going. I've tried editing the inputs a few times. I get two of the three files reporting to Splunk, but for some reason it won't capture the other one. I've tried a few variations of the inputs.conf file to capture these logs. Not sure if I need to specify the log type in the conf file or if I'm just being dumb. This is one of those things I've been working on for a while, so my mind is mush. So here I am.

There is a "Banner.LogTypeHere", a "Host.LogTypeHere", and a "UserDAC.LogTypeHere". I get Banner and Host to show up, but for some reason UserDAC doesn't populate. Just some variations I've tried in inputs:

[monitor: %ProgramData%\XXX\XXX\Logs]
[monitor: %ProgramData%\XXX\XXX\Logs\*.LogTypeHere]
[monitor: c:\ProgramData\XXX\XXX\Log]
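Two hedged observations about those stanzas: monitor inputs use the monitor:// prefix (with the slashes), and environment variables such as %ProgramData% are not expanded in inputs.conf, so the path has to be written out literally (note the third variation also says \Log rather than \Logs). A sketch, with the sourcetype and index names as placeholders:

[monitor://C:\ProgramData\XXX\XXX\Logs\*.LogTypeHere]
disabled = 0
sourcetype = xxx:logtypehere
index = main

If Banner and Host show up but UserDAC still doesn't, splunkd.log on the forwarder (the TailingProcessor entries) usually says why a matching file is being skipped.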
After upgrading Splunk Enterprise to 9.0.2, we encounter the following error on every restart, on the CLI:

Checking conf files for problems...
Invalid key in stanza [instrumentation.usage.tlsBestPractices] in /opt/splunk/etc/apps/splunk_instrumentation/default/savedsearches.conf, line 451: | append [| rest /services/configs/conf-pythonSslClientConfig | eval sslVerifyServerCert (value: if(isnull(sslVerifyServerCert),"unset",sslVerifyServerCert), splunk_server=sha256(splunk_server) | stats values(eai:acl.app) as python_configuredApp values(sslVerifyServerCert) as python_sslVerifyServerCert by splunk_server | eval python_configuredSystem=if(python_configuredApp="system","true","false") | fields python_sslVerifyServerCert, splunk_server, python_configuredSystem] | append [| rest /services/configs/conf-web/settings | eval mgmtHostPort=if(isnull(mgmtHostPort),"unset",mgmtHostPort), splunk_server=sha256(splunk_server) | stats values(eai:acl.app) as fwdrMgmtHostPort_configuredApp values(mgmtHostPort) as fwdr_mgmtHostPort by splunk_server | eval fwdrMgmtHostPort_configuredSystem=if(fwdrMgmtHostPort_configuredApp="system","true","false") | fields fwdrMgmtHostPort_sslVerifyServerCert, splunk_server, fwdrMgmtHostPort_configuredSystem] | append [| rest /services/configs/conf-server/sslConfig | eval cliVerifyServerName=if(isnull(cliVerifyServerName),"feature",cliVerifyServerName), splunk_server=sha256(splunk_server) | stats values(cliVerifyServerName) as servername_cliVerifyServerName values(eai:acl.app) as servername_configuredApp by splunk_server | eval cli_configuredSystem=if(cli_configuredApp="system","true","false") | fields cli_sslVerifyServerCert, splunk_server, cli_configuredSystem] | stats values(*) as * by splunk_server | eval date=now() | makejson output=data | eval _time=date, date=strftime(date,"%Y-%m-%d") | fields data date _time). Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'

This was not happening on 9.0.1, so we checked the savedsearches.conf of the splunk_instrumentation app in the 9.0.1 tar, and we found that the 9.0.2 savedsearches.conf is actually older than, and different from, the 9.0.1 version:

~/Downloads$ diff savedsearches.conf.901 savedsearches.conf.902 | cat -A
447c447$
< | append [| rest /services/configs/conf-server/sslConfig | eval sslVerifyServerCert=if(isnull(sslVerifyServerCert),"unset",sslVerifyServerCert), splunk_server=sha256(splunk_server) | stats values(eai:acl.app) as global_configuredApp values(sslVerifyServerCert) as global_sslVerifyServerCert by splunk_server | eval global_configuredSystem=if(global_configuredApp="system","true","false") | fields global_sslVerifyServerCert, splunk_server, global_configuredSystem] \$
---$
> | append [| rest /services/configs/conf-server/sslConfig | eval sslVerifyServerCert=if(isnull(sslVerifyServerCert),"unset",sslVerifyServerCert), splunk_server=sha256(splunk_server) | stats values(eai:acl.app) as global_configuredApp values(sslVerifyServerCert) as global_sslVerifyServerCert by splunk_server | eval global_configuredSystem=if(global_configuredApp="system","true","false") | fields global_sslVerifyServerCert, splunk_server, global_configuredSystem] \ $

The difference lies in the escaped end-of-line character: in 9.0.2 a trailing space follows the backslash, which breaks the line continuation. We also tried to run this search from the GUI, and it raised an error confirming that the search is indeed broken. We "solved" it by using the 9.0.1 version in the local folder of the splunk_instrumentation app.
Has anyone found out whether this broken search affects Splunk Enterprise usage in any way?
Hello, is the app REST storage/passwords Manager for Splunk compatible with jQuery 3.5? Thanks, Krithika
Hi all,

Please consider this subset of data:

... - Date - Fruit - Seller - Bad_count - ...
11/8 - Apple - X - 3
11/8 - Apple - Y - 10
11/8 - Apple - X - 3
11/8 - Apple - Y - 10
11/8 - Orange - Y - 6
11/8 - Orange - X - 1
11/8 - Orange - Y - 6
11/9 - Apple - X - 0
11/9 - Apple - Y - 9
11/9 - Apple - X - 0
11/9 - Orange - X - 7
11/9 - Orange - Y - 2

How to read it: Row 1 says that on 11/8 Seller X had 3 bad Apples; Row 8 says that on 11/9 Seller X had 0 bad Apples.

I would like to reformat the table into this:

... - Date - Fruit - Seller - Bad_count - X_bad_count - Y_bad_count - ...
11/8 - Apple - X - 3 - 3 - 10
11/8 - Apple - Y - 10 - 3 - 10
11/8 - Apple - X - 3 - 3 - 10
11/8 - Apple - Y - 10 - 3 - 10
11/8 - Orange - Y - 6 - 1 - 6
11/8 - Orange - X - 1 - 1 - 6
11/8 - Orange - Y - 6 - 1 - 6
11/9 - Apple - X - 0 - 0 - 9
11/9 - Apple - Y - 9 - 0 - 9
11/9 - Apple - X - 0 - 0 - 9
11/9 - Orange - X - 7 - 7 - 2
11/9 - Orange - Y - 2 - 7 - 2

How to read this: Row 1 says that on 11/8, for Apples, Seller X had a bad count of 3 and Seller Y had a bad count of 10.

The idea is to split the Bad_count column into two columns based on the unique combination of Date and Fruit. Any help would be greatly appreciated!

Thanks, Shrey

PS: 1) There are years of data, many many fruits, and multiple sellers in the original dataset. 2) I've also sorted the sample data above by Fruit to make it easy to read. 3) Don't worry about the duplicate rows, as there are other fields in the dataset as well (meaning, dedup with care).
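A hedged sketch using eventstats, which appends aggregates to every row without collapsing them, assuming the events already carry Date, Fruit, Seller, and Bad_count as fields:

... | eventstats max(eval(if(Seller="X", Bad_count, null()))) as X_bad_count, max(eval(if(Seller="Y", Bad_count, null()))) as Y_bad_count by Date, Fruit

With more than two sellers, the same idea generalizes by adding one column per seller, or by pivoting with chart/xyseries instead.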
index=cgr_trial host=rp00001234 rp00002345 rp00002344
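If the intent here is to match any of the three hosts, note that only the first value is bound to host=; the trailing rp00002345 and rp00002344 are treated as raw text search terms. Assuming that intent, one way to write it:

index=cgr_trial host IN (rp00001234, rp00002345, rp00002344)

or, equivalently, host=rp00001234 OR host=rp00002345 OR host=rp00002344.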
Hi, we are running Splunk in three environments:

Env#1 is Splunk Cloud v8.2.2112.1
Env#2 is Splunk Cloud v9.0.2208.3
Env#3 is Splunk Enterprise v9.0.1

The following SPL runs successfully on Env#2 and Env#3 and produces the expected result:

| makeresults
| eval mvfield=mvappend("1", "2", "3"), total=2
| foreach mode=multivalue mvfield
    [eval total = total + <<ITEM>>]
| table mvfield, total

Result from running the above search in Env#2 and Env#3: mvfield = 1, 2, 3 and total = 8.

Running exactly the same search in Env#1 triggers the error:

Error in 'eval' command: The expression is malformed. An unexpected character is reached at '<<ITEM>>'.

Any advice on a workaround? Thank you!
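This looks consistent with mode=multivalue on foreach being a 9.0 feature, which Env#1 (8.2) would not yet have (hedged, going from memory of the release notes; the version split in the post points the same way). A workaround that uses only pre-9.0 commands, expanding the multivalue field and summing it with stats:

| makeresults
| eval mvfield=mvappend("1", "2", "3"), total=2
| mvexpand mvfield
| stats values(mvfield) as mvfield, sum(mvfield) as mvsum, max(total) as total
| eval total = total + mvsum
| fields - mvsum
| table mvfield, total

This reproduces the expected output (mvfield = 1, 2, 3 and total = 8), though with real data the stats clause would need a by field to keep separate events apart.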
Hi, I'm facing a problem. I'm trying to solve my current issue via two different approaches, but unfortunately I'm unable to complete either of them. I'm trying to provide a dashboard with a form where some of the fields should let users filter data using multiple inclusions or exclusions, to fit what each and every team works with.

Solution 1: use an <eval> tag in the XML code and use the variable in the main base search, like index=test $exclude_uri$ | stats count BY cs_uri_sterm. I tried something like this, with a panel to display the result:

<form>
  <fieldset submitButton="true">
    <input type="text" token="exclude" searchWhenChanged="true">
      <default></default>
      <change>
        <eval token="exclude_uri">replace(replace(trim(exclude), "\\s+", " "), "(\\S+)", "cs_uri_sterm!=\"\1\"")</eval>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <html>
        <p>Token: <b>$exclude_uri$</b></p>
      </html>
    </panel>
  </row>
</form>

However, I'm facing several issues:
- For some reason, multiple whitespaces are removed by default, even when I remove the replace and trim functions dedicated to that. Why?
- \1 does not seem to be recognized. For some reason a lot of people don't need the double backslashes, but it seems I do; yet that only works for \s+ and \S+, not for \1. Any reason? How can I make this work?

On the other hand, the same thing does work if I implement it in a makeresults test search, as follows:

| makeresults
| eval exclude="/assets/* /api/* "
| eval exclude_uri=replace(replace(trim(exclude), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")

providing:

_time = 2022-11-10 15:21:17
exclude = /assets/* /api/*
exclude_uri = cs_uri_sterm!="/assets/*" cs_uri_sterm!="/api/*"

Why is it different?

Solution 2: use a makeresults like the one above and use its output as a direct filter in my base search. I tried this but couldn't find a proper solution:

index=test [
  | makeresults
  | eval exclude_uri=replace(replace(trim($exclude$), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")
  | table exclude_uri
]

I should get something like index=test cs_uri_sterm!="/assets/*" cs_uri_sterm!="/api/*" if the user filled "/assets/* /api/*" into the form's text input. I have also tried the same with a multivalue field:

index=test [
  | makeresults
  | eval exclude_uri=replace(replace(trim($exclude$), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")
  | makemv delim=" " exclude_uri
  | mvexpand exclude_uri
  | table exclude_uri
]

Nothing works. I've spent a few hours looking for solutions. I even tried to see whether I could use something like search ... IN, or a where clause. Any advice? I really need this. JavaScript may be my solution, I don't know. I'm kind of stuck here.
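A hedged pointer for Solution 2: when a subsearch returns a field literally named search, its value is spliced into the outer query as raw SPL instead of the usual field=value pairs, which is what is needed here. A sketch, assuming the token arrives as quoted text:

index=test [
  | makeresults
  | eval search=replace(replace(trim("$exclude$"), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")
  | fields search
]

That would make the outer search expand to index=test cs_uri_sterm!="/assets/*" cs_uri_sterm!="/api/*" for the example input, sidestepping the <eval> token escaping quirks entirely.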
Hi, I use this relative time in my search:

earliest=@d+7h latest=@d+19h

Now I want the same time slot but one day earlier (meaning between 07:00 and 19:00 yesterday), so I'm trying this, but it doesn't work:

earliest=-1d+7h latest=-1d+19h

What is wrong, please?
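The missing piece is the snap: -1d+7h means "now, minus one day, plus seven hours", so it never snaps to midnight and the window drifts with the current time. Snapping with @d before adding the offset gives yesterday's fixed 07:00 to 19:00 window:

earliest=-1d@d+7h latest=-1d@d+19h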
Is there an API to delete phases/tasks from a workbook? If not, is there any plan to expose this API in the future?
Is there an API to delete workbooks using workbook IDs? If not, is there any plan to expose this API in the future?
I have created an AWS account in the configuration and added a key and secret. When I try to publish data to the SNS topic, I get the following error:

Search Alert - result="Failed", error="AWS SNS topic "<name>" not found

The SNS topic is present. What other configuration is required in order to publish alerts from Splunk to SNS?
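Two hedged things to check: the region configured for the account has to match the region the topic lives in (a lookup in the wrong region typically surfaces as "not found"), and the key pair needs IAM permissions to both look up and publish to the topic. A sketch of a minimal IAM policy; the ARN is a placeholder:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["sns:GetTopicAttributes", "sns:ListTopics", "sns:Publish"],
      "Resource": "arn:aws:sns:us-east-1:123456789012:your-topic-name"
    }
  ]
}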
Hi, I would like to have some discussion, opinions, and use cases on how AppDynamics is used by testing teams when running performance tests, such as load testing and release testing. Looking for some real-world scenarios. I appreciate your responses.
Hello Splunkers, I am ingesting data from an Azure Event Hub, and after applying some SEDCMDs in my props, I am turning the data into JSON. However, I see multiple entries of the same data. Apart from the SEDCMDs, my props is as below:

On the HF:
pulldown_type = true
KV_MODE = none

On the search head:
pulldown_type = true
INDEXED_EXTRACTIONS = json
KV_MODE = none

Can you please help me see what mistake I am making here? Thanks
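A hedged thought: if what is duplicated is field values rather than whole events, the usual cause is the JSON being extracted both at index time and at search time. INDEXED_EXTRACTIONS only takes effect where the data is first parsed, which is the HF in this setup, so placing it on the search head does nothing while the search head may still auto-extract the JSON. A sketch of the more conventional split, with the sourcetype name as a placeholder:

# props.conf on the heavy forwarder (first parsing tier)
[azure:eventhub]
# existing SEDCMD-* lines stay here
INDEXED_EXTRACTIONS = json

# props.conf on the search head
[azure:eventhub]
KV_MODE = none
AUTO_KV_JSON = false

If instead whole events are duplicated, it is worth checking whether two inputs (or two consumer groups) are reading the same Event Hub.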
I have a timechart where I report quarterly trends for a metric. The X-axis shows up with the regular dates. I would like it instead to show the quarter, like "2022 Q4." There doesn't seem to be a strftime format option for quarters, though. I'm willing to use CSS/HTML if I have to, but I'd prefer to handle it all within simple XML. What are my options?
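One hedged option that stays in simple XML, at the cost of a categorical x-axis (stats over a string field) instead of a true timechart: bin into three-month buckets and build the quarter label with eval. The base search and metric name are placeholders:

index=...
| bin _time span=3mon
| eval quarter=strftime(_time, "%Y")." Q".(floor((tonumber(strftime(_time, "%m")) - 1) / 3) + 1)
| stats avg(metric) as avg_metric by quarter

Since "YYYY Qn" sorts lexically in chronological order, the columns stay in sequence. If keeping the real time axis matters, overriding the tick labels with CSS/JS is probably unavoidable.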