All Topics

Hello, is the app REST storage/passwords Manager for Splunk compatible with jQuery 3.5? Thanks, Krithika
Hi all, please consider this subset of data:

Date | Fruit  | Seller | Bad_count
11/8 | Apple  | X      | 3
11/8 | Apple  | Y      | 10
11/8 | Apple  | X      | 3
11/8 | Apple  | Y      | 10
11/8 | Orange | Y      | 6
11/8 | Orange | X      | 1
11/8 | Orange | Y      | 6
11/9 | Apple  | X      | 0
11/9 | Apple  | Y      | 9
11/9 | Apple  | X      | 0
11/9 | Orange | X      | 7
11/9 | Orange | Y      | 2

How to read it: Row 1 says that on 11/8 Seller X had 3 bad Apples; Row 8 says that on 11/9 Seller X had 0 bad Apples. I would like to reformat the table into this:

Date | Fruit  | Seller | Bad_count | X_bad_count | Y_bad_count
11/8 | Apple  | X      | 3         | 3           | 10
11/8 | Apple  | Y      | 10        | 3           | 10
11/8 | Apple  | X      | 3         | 3           | 10
11/8 | Apple  | Y      | 10        | 3           | 10
11/8 | Orange | Y      | 6         | 1           | 6
11/8 | Orange | X      | 1         | 1           | 6
11/8 | Orange | Y      | 6         | 1           | 6
11/9 | Apple  | X      | 0         | 0           | 9
11/9 | Apple  | Y      | 9         | 0           | 9
11/9 | Apple  | X      | 0         | 0           | 9
11/9 | Orange | X      | 7         | 7           | 2
11/9 | Orange | Y      | 2         | 7           | 2

How to read this: Row 1 says that on 11/8, for Apples, Seller X had a bad count of 3 and Seller Y had a bad count of 10. The idea is to split the Bad_count column into two columns based on the unique combination of Date and Fruit. Any help would be greatly appreciated! Thanks, Shrey

PS: 1) There are years of data, many fruits, and multiple sellers in the original dataset. 2) I've sorted the sample data by Fruit above to make it easy to read. 3) Don't worry about the duplicate rows, as there are other fields in the dataset as well (meaning: dedup with care).
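One possible approach, sketched here as an untested SPL fragment: use eventstats to spread each seller's Bad_count across every row that shares the same Date and Fruit (field names are taken from the sample; max() is used so the duplicate rows don't inflate the result the way sum() would):

```spl
<your base search>
| eventstats max(eval(if(Seller=="X", Bad_count, null()))) as X_bad_count
             max(eval(if(Seller=="Y", Bad_count, null()))) as Y_bad_count
             by Date Fruit
```

With many sellers, the same idea generalizes by computing per-seller values with eventstats ... by Date Fruit Seller and pivoting with xyseries or chart, instead of hard-coding one column per seller.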
index=cgr_trial host=rp00001234 rp00002345 rp00002344
Hi, we are running Splunk in 3 environments:

Env#1 is Splunk Cloud v8.2.2112.1
Env#2 is Splunk Cloud v9.0.2208.3
Env#3 is Splunk Enterprise v9.0.1

The following SPL runs successfully on Env#2 and Env#3 and produces the expected result:

| makeresults
| eval mvfield=mvappend("1", "2", "3"), total=2
| foreach mode=multivalue mvfield
    [eval total = total + <<ITEM>>]
| table mvfield, total

Result from running the above search in Env#2 and Env#3:

mvfield | total
1 2 3   | 8

Running exactly the same search in Env#1 triggers the error:

Error in 'eval' command: The expression is malformed. An unexpected character is reached at '<<ITEM>>'.

Any advice on a workaround? Thank you!
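If foreach mode=multivalue isn't available (it was added in Splunk 9.0, which would explain the 8.2 environment failing), one hedged workaround is to expand the multivalue field, sum it with stats, and rebuild the field. An untested sketch:

```spl
| makeresults
| eval mvfield=mvappend("1", "2", "3"), total=2
| mvexpand mvfield
| stats values(mvfield) as mvfield, sum(mvfield) as mvsum, first(total) as total
| eval total = total + mvsum
| table mvfield, total
```

This should yield total=8 as in the other environments, but it reshapes the pipeline through a stats, so it may need adapting if other fields have to survive.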
Hi, I'm facing a problem. I'm trying to solve my current issue via 2 different approaches, but unfortunately I'm unable to complete either of them. I'm trying to provide a dashboard with a form in which some of the fields should allow users to filter data using multiple inclusions or exclusions, to fit what each and every team works with.

Solution 1: Use an <eval> tag in the XML code and use the variable in the main base search, like "index=test $exclude_uri$ | stats count BY cs_uri_sterm". I tried something like this, with a panel to display the result:

<form>
  <fieldset submitButton="true">
    <input type="text" token="exclude" searchWhenChanged="true">
      <default></default>
      <change>
        <eval token="exclude_uri">replace(replace(trim(exclude), "\\s+", " "), "(\\S+)", "cs_uri_sterm!=\"\1\"")</eval>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <html>
        <p>Token:<b>$exclude_uri$</b></p>
      </html>
    </panel>
  </row>
</form>

However, I'm facing several issues:
- For some reason multiple whitespaces are removed by default, even when I remove the replace and trim functions dedicated to that. Why?
- \1 does not seem to be recognized. A lot of people apparently do not need the double backslashes, but it seems I do; yet they only work for \\s+ and \\S+, not for \\1. Any reason? How can I make this work?

On the other hand, the same logic does work if I implement it in a makeresults test search as follows:

| makeresults
| eval exclude="/assets/* /api/* "
| eval exclude_uri=replace(replace(trim(exclude), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")

providing:

_time               | exclude           | exclude_uri
2022-11-10 15:21:17 | /assets/* /api/*  | cs_uri_sterm!="/assets/*" cs_uri_sterm!="/api/*"

Why is it different?
Solution 2: Use a makeresults search like the one above and use its output as a direct filter in my base search. I tried this but was not able to find a proper solution:

index=test
    [ | makeresults
      | eval exclude_uri=replace(replace(trim($exclude$), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")
      | table exclude_uri ]

I should get something like index=test cs_uri_sterm!="/assets/*" cs_uri_sterm!="/api/*" if the user filled "/assets/* /api/*" into the form text input. I have also tried the same with a multivalue field:

index=test
    [ | makeresults
      | eval exclude_uri=replace(replace(trim($exclude$), "(\S+)", "cs_uri_sterm!=\"\1\""), "\s+", " ")
      | makemv delim=" " exclude_uri
      | mvexpand exclude_uri
      | table exclude_uri ]

Nothing works. I've spent a few hours looking for solutions, and even tried variants with search ... IN and where. Any advice? I really need this. JavaScript may be my solution, I don't know. Kind of stuck here.
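One pattern that may help, sketched here and untested: let the subsearch return rows of cs_uri_sterm values and let format plus an outer NOT build the exclusion (the $exclude$ token is assumed to hold the space-separated patterns; handling of stray extra whitespace is omitted):

```spl
index=test NOT
    [ | makeresults
      | eval cs_uri_sterm=split(trim("$exclude$"), " ")
      | mvexpand cs_uri_sterm
      | table cs_uri_sterm
      | format ]
| stats count by cs_uri_sterm
```

format emits something like ( ( cs_uri_sterm="/assets/*" ) OR ( cs_uri_sterm="/api/*" ) ), and the outer NOT turns that into the exclusion, which sidesteps having to synthesize != clauses with replace() and backreferences.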
Hi, I use this relative time in my search:

earliest=@d+7h latest=@d+19h

Now I want the same time slot but one day ago (i.e. between 7h and 19h yesterday), so I am doing this, but it doesn't work:

earliest=-1d+7h latest=-1d+19h

What is wrong, please?
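A likely cause, hedged: without a snap-to, -1d+7h is computed from "now" (now minus 24 hours, plus 7 hours), not from yesterday's midnight. Snapping to the start of the previous day first should give the intended window:

```spl
earliest=-1d@d+7h latest=-1d@d+19h
```

Here -1d@d means "go back one day, then snap to midnight", after which the +7h/+19h offsets behave exactly as they do in the original @d+7h/@d+19h search.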
Is there an API to delete phases/tasks from a workbook? If not, is there any plan to expose this API in the future?
Is there an API to delete workbooks using workbook IDs? If not, is there any plan to expose this API in the future?
I have created an AWS account in Configuration and added the key and secret. When I try to publish data to the SNS topic, I get the following error:

Search Alert - result="Failed", error="AWS SNS topic "<name>" not found

The SNS topic does exist. What other configuration is required to publish alerts from Splunk to SNS?
Hi, I would like to start a discussion and gather opinions and use cases about how AppDynamics is used by testing teams when running performance tests, such as load testing, release testing, etc. I'm looking for real-world scenarios. I appreciate your responses.
Hello Splunkers, I am ingesting data from Azure Event Hub, and after applying some SEDCMDs in my props, I am turning the data into JSON. However, I see multiple entries of the same data. Apart from the SEDCMDs, my props is as below:

Heavy forwarder:
pulldown_type = true
KV_MODE = none

Search head:
pulldown_type = true
INDEXED_EXTRACTIONS = json
KV_MODE = none

Can you please help with what mistake I am making here? Thanks
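If the duplication shows up as every field appearing twice on each event (rather than whole duplicate events), one common cause is the JSON being extracted both at index time and again at search time. A hedged props.conf sketch — the stanza name is hypothetical, and INDEXED_EXTRACTIONS belongs on the first full Splunk instance that parses the data (the HF here), not on the search head:

```
# props.conf on the heavy forwarder
[azure:eventhub:custom]
SEDCMD-tojson = ...
INDEXED_EXTRACTIONS = json

# props.conf on the search head
[azure:eventhub:custom]
KV_MODE = none
AUTO_KV_JSON = false
```

If you instead see whole duplicate events, it is worth checking whether two inputs (or two HFs sharing the same Event Hub consumer group incorrectly) are reading the same stream.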
I have a timechart where I report quarterly trends for a metric. The X-axis shows up with the regular dates. I would like it instead to show the quarter, like "2022 Q4." There doesn't seem to be a strftime format option for quarters, though. I'm willing to use CSS/HTML if I have to, but I'd prefer to handle it all within simple XML. What are my options?
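One SPL-only option, sketched and untested: bucket the timechart by quarter, then replace the time axis with a category axis whose labels you format yourself (this trades the continuous time axis for category labels):

```spl
... | timechart span=3mon count
| eval quarter = strftime(_time, "%Y") . " Q" . ceil(tonumber(strftime(_time, "%m")) / 3)
| fields - _time
| table quarter count
```

Charting quarter on the x-axis then shows labels like "2022 Q4". The cost is losing timechart's native time-axis behavior (zooming, gap handling), which may or may not matter for a quarterly report.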
Hello, is Splunk Supporting Add-on for Active Directory version 3.0.5 is  compatible with python 3 ? Thanks, Krithika    
Hello, I am running a search in Splunk Cloud but I am getting the following error; does anyone know how to resolve it?

The search:

index=sf-test-metrics domains NOT "sitefinity.com" NOT "*iriscodebasesandbox.cloudapp.net*" NOT "*.bedford*" NOT "*.regression.com*" NOT "dynamicscan.cloudapp.net" NOT "irisdmsitsandbox" NOT "*irisdmuatsandbox*" NOT "*.sandbox.sitefinity.com*" NOT "localhost" domains{}= *\.*
| spath path=modulesInfo{} output=x
| fields x, clientId, domains{}
| mvexpand x
| spath input=x
| search moduleName="AdminBridge"
| where extensionBundlesCount > 0
| dedup clientId
| table domains{}, extensionBundlesCount
| sort -extensionBundlesCount

The error:

command.mvexpand: output will be truncated at 37800 results due to excessive memory usage. Memory threshold of 500MB as configured in limits.conf / [mvexpand] / max_mem_usage_mb has been reached.
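Two hedged options: shrink what mvexpand has to copy, or raise the limit. Since only x, clientId, and domains{} are needed downstream, dropping everything else (including _raw, which fields keeps by default) before the mvexpand reduces per-result memory. An untested sketch of the relevant portion:

```spl
... | spath path=modulesInfo{} output=x
| fields x, clientId, domains{}
| fields - _raw, _time
| mvexpand x
| ...
```

Alternatively, the threshold itself lives in limits.conf (on Splunk Cloud this typically means a support ticket rather than a direct edit):

```
[mvexpand]
max_mem_usage_mb = 1024
```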
I need to extract fields from log which is in xml format. Below is the example: <Event> <DateTime>2022-11-10T11:58:41.136Z</DateTime> <IBIC8>CTBAAUSN</IBIC8> <InstanceId>D</InstanceId> <EventCode... See more...
I need to extract fields from a log which is in XML format. Below is an example:

<Event>
<DateTime>2022-11-10T11:58:41.136Z</DateTime>
<IBIC8>CTBAAUSN</IBIC8>
<InstanceId>D</InstanceId>
<EventCode>PAG.NTF.CRL_UPDATE_SUCCESS</EventCode>
<Name>CRL update succeeded</Name>
<Severity>INFO</Severity>
<Class>SECURITY</Class>
<Text><![CDATA[CRL was successfully downloaded and validated
Context:
- URL: https://crlcheck.common.sipn.swift.com:443/SWIFTCA1.crl
- Version: 2
- Updated on: Thu Nov 10 21:57:53 AEDT 2022
- Valid till: Sun Nov 13 21:57:53 AEDT 2022
- Issuer: o=swift]]></Text>
</Event>

I need to extract fields like EventCode, Severity, and Text, and use them as statistical data. Can this be done with a regular expression, or is there some other way to extract them? Please advise.
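One way that avoids regex entirely, sketched with assumed index and sourcetype names: spath understands XML, so the elements can be pulled out at search time and fed to stats:

```spl
index=my_index sourcetype=my_xml_logs
| spath input=_raw path=Event.EventCode output=EventCode
| spath input=_raw path=Event.Severity output=Severity
| spath input=_raw path=Event.Text output=Text
| stats count by EventCode, Severity
```

If each event is exactly one <Event> element, setting KV_MODE = xml in props.conf for the sourcetype is an alternative that extracts these fields automatically at search time.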
Hi all, we are running Splunk UF version 8.2.4 on our Linux x64 client machines and have planned to upgrade them to the latest version, 9.0.2. I downloaded the latest rpm package; we usually deploy packages to all our client servers using rpm, but when we tried to deploy this one we got the errors below. Is there anything that needs to be done on our end before upgrading the Splunk UF from 8.x to 9.x on our Windows and Linux servers?

"/opt/splunkforwarder/etc/auth/ca.pem": already a renewed Splunk certificate: skipping renewal
"/opt/splunkforwarder/etc/auth/cacert.pem": already a renewed Splunk certificate: skipping renewal
Failed to start mongod. Did not get EOF from mongod after 5 second(s).
[App Key Value Store migration] Starting migrate-kvstore.
Created version file path=/opt/splunkforwarder/var/run/splunk/kvstore_upgrade/versionFile36
Created version file path=/opt/splunkforwarder/var/run/splunk/kvstore_upgrade/versionFile40
[App Key Value Store migration] Collection data is not available.
Starting KV Store storage engine upgrade: Phase 1 (dump) of 2:
Failed to migrate to storage engine wiredTiger, reason=Failed to receive response from kvstore error=, service not ready after waiting for timeout=300271ms
[App Key Value Store migration] Starting migrate-kvstore.
Created version file path=/opt/splunkforwarder/var/run/splunk/kvstore_upgrade/versionFile42
[App Key Value Store migration] Collection data is not available.
[DFS] Performing migration.
[DFS] Finished migration.
[Peer-apps] Performing migration.
[Peer-apps] Finished migration.
Creating unit file...
Current splunk is running as non root, which cannot operate systemd unit files. Please create it manually by 'sudo splunk enable boot-start' later.
Failed to create the unit file. Please do it manually later.
Systemd unit file installed by user at /etc/systemd/system/SplunkForwarder.service.
Configured as systemd managed service.
Nov 09 07:05:49 splunk[135425]: Invalid key in stanza [webhook] in /opt/splunkforwarder/etc/system/default/alert_actions.conf, line 229: enable_allowlist (value: false). Nov 09 07:05:49 splunk[135425]: Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug' Nov 09 07:05:49 splunk[135425]: Checking conf files for problems... Nov 09 07:05:49 splunk[135425]: Done Nov 09 07:05:49 splunk[135425]: Checking default conf files for edits... Nov 09 07:05:49 splunk[135425]: Validating installed files against hashes from '/opt/splunkforwarder/splunkforwarder-9.0.2-17e00c557dc1-linux-2.6-x86_64-manifest' Nov 09 07:05:49 splunk[135425]: PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security Nov 09 07:05:49 splunk[135425]: 2022-11-09 07:05:49.925 -0600 splunkd started (build 17e00c557dc1) pid=135425
Hi team, I am a beginner at AppDynamics. Could you please provide links to AppDynamics video tutorials, from beginner to advanced? I need video tutorials to learn and explore AppDynamics. Thanks, Balaraju G
How do I join a search with a list of job names from the file DepC_listofjobs.csv? This file has only one column, which holds unique job names. Below is my command, with the join line currently commented out, run with earliest=-8h:

index=log-13120-prod-c laas_appId="pbmp.prediction*" "Prediction"
```| join [ inputlookup DepC_listofjobs.csv ]```
| bin _time span=1h
| stats dc(predictionId), dc(jobName), count by _time predictionStatus
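A hedged sketch of the usual pattern: rather than join, feed the lookup in as a subsearch filter, which expands to (jobName="a" OR jobName="b" OR ...) against the base search. This assumes the CSV's column is named jobName and that jobName is an extracted field on the events:

```spl
index=log-13120-prod-c laas_appId="pbmp.prediction*" "Prediction" earliest=-8h
    [ | inputlookup DepC_listofjobs.csv | fields jobName ]
| bin _time span=1h
| stats dc(predictionId), dc(jobName), count by _time, predictionStatus
```

If the CSV column has a different name, renaming it inside the subsearch (| rename that_col as jobName) makes the generated filter match the event field.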
I have a search with timechart. If I use the column chart visualization, I only see the Mondays as dates on the axis. Is it possible to show every day per column?

index=test host=test*
| my search
| eval date=strptime(date,"%A%m/%d/%Y")
| timechart span=1d count
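If the chart is in Simple XML, one hedged option is to ask the time axis for a tick per day via the chart configuration (P1D is an ISO-8601 one-day duration; whether every label actually fits still depends on panel width):

```xml
<option name="charting.axisLabelsX.majorUnit">P1D</option>
```

Also worth noting: the eval above parses date into a new field, but timechart still buckets on _time; if the intent was to chart on the parsed date, it would need to be assigned with | eval _time=strptime(date, "%A%m/%d/%Y").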
Hi, I wanted to see if there is any way to store credentials in Phantom so that they are not visible to users within Phantom, but can still somehow be fetched in a custom function within our playbook to make API calls that need those credentials. Note: built-in Phantom apps such as HTTP do not meet our requirements, so we are using the Python requests module inside a custom function to make the API calls. Thanks!