All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


This happens sometimes on this DB, and it is the only one where I experience it. The view, fields and values are pretty simple, and so is my input. But sometimes certain fields are not indexed, which makes it difficult to build statistical reports. I checked the view and it certainly has those fields and values, but the events are missing them. Another odd thing is that some events are duplicated and some aren't, and the events that are not duplicated are the ones missing the fields. I have never seen this before and don't know how to resolve it.
Hello friends. I have a question for you: how can I convert an alert in Splunk to IODEF format?
Hi, I have a dashboard in Dashboard Studio and, since this morning, I have been unable to use the "Open in Search" icon on each search panel. I am also using a token for a text input on this dashboard. Why would this happen when I made no changes to the dashboard's JSON definition? Many thanks, Patrick
There are two columns with the headings "new image Name" and "source image Name". New images are derived from source images, and occasionally images are created from existing images as well. Please see the sample data below.

new image Name | source image Name
----------------------------------
image1 | baseline
image2 | baseline
image3 | image1
image4 | baseline
image5 | image3
image6 | imageX

Observations: from the table above, "image3" is derived from "image1", which is derived from "baseline". "image5" is derived from "image3", which in turn is derived from "image1". "image6" is derived from "imageX", which may be an unknown source.

Requirements: wherever col2 is not "baseline", we need to check whether that value exists in col1 and look up its own col2 value.
If that lookup returns "baseline", mark the row "baseline".
If it returns another image name, mark the row "Unknown".
If the value (e.g. "imageX") does not exist in col1 at all, mark the row "Unknown".

Final result:

new image Name | source image Name
----------------------------------
image1 | baseline
image2 | baseline
image3 | baseline
image4 | baseline
image5 | Unknown
image6 | Unknown

Please help.
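A rough SPL sketch of one way to implement these rules, assuming the two columns are available as fields new_image and source_image in a lookup (the lookup name image_map.csv and both field names are placeholders for whatever actually exists): the table is joined to itself so that each row can see its parent's source, and a case() applies the baseline/Unknown rules.

| inputlookup image_map.csv
| join type=left source_image
    [ | inputlookup image_map.csv
      | rename source_image as parent_source
      | rename new_image as source_image ]
| eval source_image=case(source_image=="baseline", "baseline",
                         parent_source=="baseline", "baseline",
                         true(), "Unknown")
| table new_image source_image

This resolves exactly one level of indirection, which is all the stated rules require; a chain that must be followed all the way back to baseline would need a different approach (for example repeated lookups).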
Hi Team, we have instrumented our mobile app with the iOS SDK, however there is no data available for custom data, custom timers, breadcrumbs, or custom metrics for crash-event correlation. As far as I know, the AppDynamics iOS SDK instruments methods in the application automatically, yet somehow I am not able to see these data types reporting to the controller. Kindly advise whether we have to do any further instrumentation of the application methods to get this data reported. Please advise.
Hello. I have a link like http://n12345.se:8888/p2pd/servlet. If you browse to this link, it returns: Status: Running, Start time: 2023-05-05, localTime: 2023-05-30. How can I use this in a Splunk report? I have no log to read; I want to go directly to the link and retrieve the result it returns (see above). Regards, Jappen
Hi friends, our Splunk Cloud was recently upgraded to version 9.0.2303.101. Since this upgrade, image hyperlinks in our Splunk dashboards are not working. Could you please assist with this issue? This is our dashboard source code:

<panel id="pg_os_kpi_performance_indicator">
<title>OS Key Performance Indicators</title>
<html>
<center>
<button class="pg_button_launch">
<a class="pg_link" href="prime_global_os_base_metrics" target="_blank">OS Base Metrics</a>
<a class="pg_link" href="prime_global_os_base_metrics" target="_blank">
<img class="baseImage" src="/static/app/PG_COMMON_LIBRARY/images/Dashboard2.png" style="height: 50px;"/>
<img class="overlapImage" src="/static/app/PG_COMMON_LIBRARY/images/OpenNew.png"/>.
</a>
</button>

Plain text hyperlinks work fine; only the image hyperlink does not. The image is visible, but clicking it does not open the new dashboard. Kindly provide suggestions to overcome this issue.
Hello, I'm looking for a way to reduce or even stop AppDynamics from triggering the DexArchiveBuilderTask / dexBuilderDebug task during debug builds, which is causing a bottleneck in every single Gradle build. I've set the flag to false, but AppDynamics still seems to trigger the dex builder.

adeum {
  ..
  enabledForDebugBuilds = false
  ..
}

Problem: the dex builder runs every time during a debug build:

// ./gradlew build
:app:myexample:dexBuilderDebug
Duration: 102s / 30%
Type: com.android.build.gradle.internal.tasks.DexArchiveBuilderTask
Task Execution Categories: Uncategorized
Warnings: no warnings found
Reason task ran:
Input property 'mixedScopeClasses' file /Users/****/build/intermediates/transform/AppDynamics/debug/349.jar has changed
Input property 'mixedScopeClasses' file /Users/****/build/intermediates/transform/AppDynamics/debug/501/com/appdynamics/...BuildInfo.class has changed

I'd appreciate any suggestion on how to achieve this, as it is adding a large amount of time to every debug build. Regards, Red
Hi, I am new to Splunk. How do I search for error messages in a log file using SPL? I am using the formats below to search for error messages:

source="sample_logcat.txt" host="debug" sourcetype="Android log" | head 20
source="sample_logcat.txt" host="debug" sourcetype="Android log" | tail 4
error* AND * | search iwlwifi
error* AND * | search Bluetooth

Is a sub-search possible in Splunk? Can we use the result of a secondary (inner) query as the input to the primary (outer) query? If possible, can anyone explain in detail?
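A hedged sketch of both ideas, reusing the source and sourcetype from the question. The first search simply keeps events that contain the word error; the second is the classic subsearch pattern, where the inner search in square brackets runs first and its output (here a list of host values) is used to filter the outer search:

source="sample_logcat.txt" host="debug" sourcetype="Android log" "error"
| head 20

source="sample_logcat.txt" sourcetype="Android log" "error"
    [ search source="sample_logcat.txt" sourcetype="Android log" "error" "Bluetooth"
      | stats count by host
      | fields host ]

So yes, the result of the inner query can feed the outer query; just keep in mind that subsearches are capped by default in both result count and runtime.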
| stats count
| eval _time="1685158808"
| eval rule_title="Test notable"
| eval security_domain="Network"
| eval urgency="Medium"
| eval rule_name="Test rule"
| eval dest="8.8.8.8"
| eval src="1.1.1"
| eval desc="Please investigate firewall log, and action"
| sendalert notable param.mapfields=_time,desc,rule_id,rule_name,nes_fields,drilldown_name,drilldown_search,governance,control,default_owner,drilldown_earliest_offset,drilldown_latest_offset,next_steps,investigation_profiles,extract_artifacts,recommended_actions

Is it possible to use a timestamp to change the notable creation date/time? It creates a notable every time I hit search with the above query. Additionally, how do I move my description from below into the description field above?
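One small, hedged observation on the timestamp part: _time is normally a numeric epoch value rather than a quoted string, so it may be worth trying it unquoted, or deriving it from a readable timestamp with strptime (the timestamp below is only an example value). Whether sendalert notable then uses it as the notable's creation time, or only as the event time, is something to verify in your Enterprise Security instance.

| eval _time=1685158808

or, from a human-readable timestamp:

| eval _time=strptime("2023-05-27 04:00:00", "%Y-%m-%d %H:%M:%S")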
Hi guys/gals, does anyone have a writeup or walkthrough for the TryHackMe Splunk Advanced room?
Hi, there are some logs that arrive at the indexer with an empty host field (host= ). These logs go to the main index, and I would like them to go to another index instead. I do have a source and sourcetype for them. I tried to override the index and route the logs to another index, but it does not work. Here is my config; I would appreciate any help. Thanks.

props.conf:
[host:: ]
TRANSFORMS-myindex=override-index

transforms.conf:
[override-index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = myindex
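For what it's worth, a props stanza of the form [host:: ] is unlikely to ever match. One hedged alternative is to key the transform off the sourcetype and have the transform itself test for an empty host via SOURCE_KEY. A sketch, assuming the sourcetype is my_sourcetype (a placeholder) and that these files live on the indexer or heavy forwarder that parses the data:

props.conf:
[my_sourcetype]
TRANSFORMS-route_empty_host = override-index

transforms.conf:
[override-index]
# host metadata is stored as "host::<value>", so an empty host should appear as just "host::"
SOURCE_KEY = MetaData:Host
REGEX = ^host::$
DEST_KEY = _MetaData:Index
FORMAT = myindex

The target index (myindex) must already exist on the indexers, and the change only affects events parsed after the configuration is reloaded or Splunk is restarted.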
Hello Splunkers! Is there a way to calculate moving/rolling averages such that the current data point, x(t), is somewhere in the middle of the window rather than at the boundaries?

For example, if window=5, I want the MA to be calculated as:

MA = avg(x(t-2), x(t-1), x(t), x(t+1), x(t+2))

I am okay with using MLTK or any other method of implementing this.

For reference, the following query shows two similar methods of calculating a trailing MA based on previous values of x(t), i.e. MA = avg(x(t-4), x(t-3), x(t-2), x(t-1), x(t)):

| makeresults count=100
| streamstats count as s
| eval n=(random() % 100000) + 1
| table s n
| streamstats window=5 avg(n) as trend
| autoregress n p=1-5
| fillnull value=0
| eval ma = avg(n, n_p1, n_p2, n_p3, n_p4)
| fields s n ma trend

PS: I am aware that in both methods the absence of earlier/later values at the boundaries makes the MA inaccurate there - I am okay to work around that. Thanks in advance!
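One hedged way to get a centered window without MLTK: compute the ordinary trailing average, then shift it back by half the window using reverse plus autoregress (after reverse, "future" rows look like "past" rows to autoregress). A sketch on the same synthetic data:

| makeresults count=100
| streamstats count as s
| eval n=(random() % 100000) + 1
| streamstats window=5 avg(n) as trailing_ma
| reverse
| autoregress trailing_ma p=2
| reverse
| rename trailing_ma_p2 as centered_ma
| table s n trailing_ma centered_ma

Here centered_ma at row t equals the trailing average that ends at t+2, i.e. avg(x(t-2)..x(t+2)); the first rows use partial windows and the last two rows are null, which matches the boundary caveat above.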
Hi, I could not find anything on the website. I would like to try, and maybe keep using, Splunk indefinitely in my home lab. Is there such a thing as a free license or a home license? The Enterprise trial is for 60 days and does not seem to fit my needs.
We upgraded from Splunk Enterprise 8.2.1 to Splunk Enterprise 9.0.4.1, installed on CentOS 7. We have put all our dashboards into "CompanyApp". Normally, when I want to see the details of an event, I click "Open in Search" at the bottom of my dashboard; a new tab opens and shows me the results. Now it just says "loading". I do have the "admin" role assigned to myself. I assume this is some kind of permissions issue, but where and what do I look for?
Hi All, we are using snmp_ta (snmp-modular-input_184) version 1.8.4 to get SNMP poll data. We have configured the MIBs as per the documentation, but we are not able to see the data in Splunk.

Configuration: we have configured BRIDGE-MIB and IF-MIB. Below are the details for BRIDGE-MIB:

[snmp://BRIDGE]
activation_key = XXXXXXXXXXXXXXXXX
communitystring = public
destination = xxx.xx.xx.xx
do_bulk_get = 1
do_get_subtree = 0
index = catch_all
ipv6 = 0
mib_names = BRIDGE-MIB
object_names =
port = 161
response_handler = SplitNonBulkResponseHandler
snmp_mode = attributes
snmp_version = 2C
sourcetype = snmp_poll
split_bulk_output = 1
timeout = 5
trap_rdns = 0
v3_authProtocol = usmHMACMD5AuthProtocol
v3_privProtocol = usmDESPrivProtocol

Checks:
a) the snmpwalk command is working
b) no errors noticed in snmpmodinput_app_modularinput.log or splunkd.log
c) the snmp services are running:

ps -ef | grep snmp
splunk 2955262 2954920 0 13:57 ? 00:00:00 /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/snmp_ta/bin/snmp.py
splunk 2955268 2954920 0 13:57 ? 00:00:00 /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/snmp_ta/bin/snmp.py

Kindly help us understand why the SNMP poll data is not reaching Splunk and whether we have missed any steps. Also, please let us know if there are any steps to validate SNMP poll data.

snmpmodinput_app_modularinput.log excerpt:

2023-05-29 10:38:37,270 INFO Starting process checker thread for parent PID 2926939 stanza:snmp://BRIDGE
2023-05-29 10:38:37,271 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 10:38:37,271 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 10:38:37,281 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 10:40:01,723 INFO Starting process checker thread for parent PID 2928030 stanza:snmp://BRIDGE
2023-05-29 10:40:01,762 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 10:40:01,762 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 10:40:01,855 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 10:48:11,878 INFO Starting process checker thread for parent PID 2929393 stanza:snmp://BRIDGE
2023-05-29 10:48:11,889 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 10:48:11,889 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 10:48:11,927 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 11:34:24,088 INFO Starting process checker thread for parent PID 2935597 stanza:snmp://BRIDGE
2023-05-29 11:34:24,191 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 11:34:24,191 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 11:34:24,247 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 11:47:58,495 INFO Starting process checker thread for parent PID 2937457 stanza:snmp://BRIDGE
2023-05-29 11:47:58,509 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 11:47:58,509 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 11:47:58,575 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 13:30:24,627 INFO Starting process checker thread for parent PID 2949972 stanza:snmp://BRIDGE
2023-05-29 13:30:24,672 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 13:30:24,672 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 13:30:24,748 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 13:55:34,635 INFO Starting process checker thread for parent PID 2954014 stanza:snmp://BRIDGE
2023-05-29 13:55:34,669 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 13:55:34,669 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 13:55:34,742 INFO Building MIBs stanza:snmp://BRIDGE
2023-05-29 13:57:12,038 INFO Starting process checker thread for parent PID 2954920 stanza:snmp://BRIDGE
2023-05-29 13:57:12,038 INFO Using Splunk's python stanza:snmp://BRIDGE
2023-05-29 13:57:12,038 INFO SNMP Modular Input executing stanza:snmp://BRIDGE
2023-05-29 13:57:12,075 INFO Building MIBs stanza:snmp://BRIDGE

************ IF-MIB ****************

2023-05-29 13:55:34,744 INFO Building MIBs stanza:snmp://IPMIB
2023-05-29 13:57:12,768 INFO Starting process checker thread for parent PID 2954920 stanza:snmp://IPMIB
2023-05-29 13:57:12,842 INFO Using Splunk's python stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG running pysnmp version 4.4.12 stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG debug category 'io' enabled stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG debug category 'dsp' enabled stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG debug category 'msgproc' enabled stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG debug category 'secmod' enabled stanza:snmp://IPMIB
2023-05-29 13:57:12,842 DEBUG debug category 'proxy' enabled stanza:snmp://IPMIB
2023-05-29 13:57:12,842 INFO SNMP Modular Input executing stanza:snmp://IPMIB
2023-05-29 13:57:12,880 INFO Building MIBs stanza:snmp://IPMIB

*************** splunkd.log output of cat splunkd.log | grep snmp ****

05-29-2023 13:55:36.217 +0100 INFO WatchedFile [2954371 tailreader0] - Will begin reading at offset=6186 for file='/opt/splunk/var/log/splunk/1537685e8d29b1deb1f7562f984e6806_snmpmodinput_app_modularinput.log'.
05-29-2023 13:55:36.218 +0100 INFO WatchedFile [2954371 tailreader0] - Will begin reading at offset=2400 for file='/opt/splunk/var/log/splunk/651a0f0fde8ef37438adcef45b20bea2_snmpmodinput_app_modularinput.log'.
05-29-2023 13:57:00.101 +0100 INFO SpecFiles [2954916 MainThread] - Found external scheme definition for stanza="snmp://" from spec file="/opt/splunk/etc/apps/snmp_ta/README/inputs.conf.spec" with parameters="activation_key, snmp_mode, destination, ipv6, port, snmp_version, object_names, do_get_subtree, do_bulk_get, split_bulk_output, non_repeaters, max_repetitions, lexicographic_mode, communitystring, v3_securityName, v3_securityEngineId, v3_authKey, v3_privKey, v3_authProtocol, v3_privProtocol, snmpinterval, timeout, retries, listen_traps, trap_port, trap_host, trap_rdns, mib_names, response_handler, response_handler_args, log_level, use_system_python, system_python_path, run_process_checker"
05-29-2023 13:57:02.435 +0100 INFO ModularInputs [2954916 MainThread] - Endpoint argument settings for "snmp_mode":
05-29-2023 13:57:02.435 +0100 INFO ModularInputs [2954916 MainThread] - Endpoint argument settings for "snmp_version":
05-29-2023 13:57:02.435 +0100 INFO ModularInputs [2954916 MainThread] - Endpoint argument settings for "snmpinterval":
05-29-2023 13:57:02.435 +0100 INFO ModularInputs [2954916 MainThread] - Introspection setup completed for scheme "snmp".
05-29-2023 13:57:11.754 +0100 INFO ExecProcessor [2955230 exec_1] - New scheduled exec process: /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/snmp_ta/bin/snmp.py
05-29-2023 13:57:11.754 +0100 INFO ExecProcessor [2955230 exec_1] - New scheduled exec process: /opt/splunk/bin/python3.7 /opt/splunk/etc/apps/snmp_ta/bin/snmp.py
05-29-2023 13:57:13.832 +0100 INFO WatchedFile [2955265 tailreader1] - Will begin reading at offset=6507 for file='/opt/splunk/var/log/splunk/1537685e8d29b1deb1f7562f984e6806_snmpmodinput_app_modularinput.log'.
05-29-2023 13:57:13.833 +0100 INFO WatchedFile [2955265 tailreader1] - Will begin reading at offset=3200 for file='/opt/splunk/var/log/splunk/651a0f0fde8ef37438adcef45b20bea2_snmpmodinput_app_modularinput.log'.
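Two hedged sanity-check searches, using the index and sourcetype from the stanza above (run them over a wide time range, e.g. All time, in case timestamps are being parsed oddly): the first shows whether any poll data is arriving at all, the second looks for problems reported by the TA's own logs in the internal index.

index=catch_all sourcetype=snmp_poll
| stats count by host source

index=_internal source=*snmpmodinput* (ERROR OR WARN)

If the first search returns nothing and the second is clean, the next things to check are usually whether the index catch_all actually exists and whether the role you are searching with can see it.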
Hi, I have the following search that queries a lookup based on 2 text box inputs:

| inputlookup ABC
| search src=$src_tok$ OR dest=$dest_tok$

I need to change this so that a user can enter multiple src values or multiple dest values at a given time. Something like the following if dest="10.175.96.146 10.175.96.147 10.175.96.148 10.175.96.149 10.175.96.150 10.175.96.183":

| inputlookup ABC
| search src=* OR (dest=10.175.96.146 AND dest=10.175.96.147 AND dest=10.175.96.148 AND dest=10.175.96.149 AND dest=10.175.96.150 AND dest=10.175.96.183)

How can I change my current search to search on multiple inputted values? Thanks as always
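A hedged sketch, assuming the user is willing to type the values comma-separated into the text inputs (for example: 10.175.96.146, 10.175.96.147, 10.175.96.148):

| inputlookup ABC
| search src IN ($src_tok$) OR dest IN ($dest_tok$)

The IN operator matches the field against any value in the list, which is usually what "multiple dest values" means here; the AND version in the question would require a single row to carry all the dest values at once. If the values arrive space-separated instead, they would first need to be turned into a comma-separated list, for example with an eval/replace on the token value in the dashboard.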
05-29-2023 06:00:46.836 +0000 WARN ConfigWatcher [249660 SplunkConfigChangeWatcherThread] - Failed to read file to checksum in init(), path=/opt/splunkforwarder/etc/apps/app1/local/inputs.conf
05-29-2023 06:00:46.836 +0000 ERROR ConfigWatcher [249660 SplunkConfigChangeWatcherThread] - initialization failed for path=/opt/splunkforwarder/etc/apps/app1/local/inputs.conf

The Splunk forwarder is unable to read the inputs.conf file; we are seeing the above two messages in splunkd.log.
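A hedged first check, assuming splunkd on this forwarder runs as a non-root account (commonly splunk): make sure that account can actually read the file and traverse its parent directories, since "Failed to read file to checksum" is most often a plain filesystem-permission problem.

ls -ld /opt/splunkforwarder/etc/apps/app1/local
ls -l /opt/splunkforwarder/etc/apps/app1/local/inputs.conf

# if ownership looks wrong (for example the file was edited as root),
# the usual fix is to give it back to the account that runs splunkd:
chown -R splunk:splunk /opt/splunkforwarder/etc/apps/app1

This is only the typical cause; other explanations (such as SELinux denials) are possible.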
I have an input string which contains entries like:

code =test1  description=test1 description status = pending,code =test2  description=test2 description status = COMPLTED, code =test3  description=test3 description status = COMPLETED_FIRST,code =test2  description=test2 description status = COMPLTED,

Expected output:

Code   count
test2  2
test3  1

Basically, I am looking for the codes whose status is completed or starts with the word completed, together with the completion count for each code. Can anyone please help me with this?
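A hedged sketch that reproduces the expected output from the sample string. Note the sample spells the status COMPLTED, so the filter below just matches anything beginning with COMPL; tighten it if the real data consistently says COMPLETED. The makeresults/eval lines are only a test harness to make the example self-contained; against real events, point the first rex at _raw (or whichever field holds the string).

| makeresults
| eval raw="code =test1  description=test1 description status = pending,code =test2  description=test2 description status = COMPLTED, code =test3  description=test3 description status = COMPLETED_FIRST,code =test2  description=test2 description status = COMPLTED,"
| rex field=raw max_match=0 "code =(?<entry>[^,]+)"
| mvexpand entry
| rex field=entry "^(?<code>\S+)\s+description=.*status = (?<status>\S+)"
| where like(upper(status), "COMPL%")
| stats count by code

This yields test2 with a count of 2 and test3 with a count of 1, matching the expected output.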
Hi, I have a lot of files; each file can be around 4 MB. The structure of each JSON file is a set of events/objects, where each event contains a Payload and a Header, and each file also contains a Metadata event/object. I need to add all of the Metadata fields to every event from the same file.

Desired output: a Splunk table with a row for each event, plus the metadata columns added to each row, for all files. I tried to write the following, but it's not correct:

index=my_index source=my_source sourcetype=_json
| search NOT MetaData
| join type=left source
    [ | search index=my_index source=my_source sourcetype=_json
      | search NOT HEADER]
| table *

Thanks, Maayan
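A hedged sketch of one way to do this without join: since every event from the same file shares the same source, eventstats can copy the metadata values onto all events with that source. The field MetaData.fileVersion is purely a placeholder for whichever metadata fields actually exist; repeat the values() clause once per metadata field needed.

index=my_index source=my_source sourcetype=_json
| eventstats values(MetaData.fileVersion) as meta_fileVersion by source
| where isnull('MetaData.fileVersion')
| table source Header.* Payload.* meta_*

The where clause drops the metadata event itself so that only the Payload/Header events remain, each carrying the copied meta_* columns.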