All Topics

Hello everyone, According to the Splunk blog Splunk Security Advisory for Apache Log4j (CVE-2021-44228 and CVE-2021-45046), the affected versions are: "All supported non-Windows versions of 8.1.x and 8.2.x only if DFS is used." I'm using Splunk Enterprise (Search Head & Indexer) version 7.3.1, and I can see various log4j-1.2.17.jar files under locations such as "/bin/jars/vendors/spark/2.3.0/lib/", "/etc/apps/splunk_app_db_connect/bin/lib/", and "/etc/apps/splunk_archiver/java-bin/jars/vendors/spark/". I am also attaching the result I received from a search query to determine whether DFS is enabled on my Splunk servers. Should I be concerned about this vulnerability? Also, to remediate, do I just need to replace log4j-1.2.17.jar with the latest files directly in the respective directories, or do I need to make changes in the conf files as well? Any help will be appreciated. Thank you!
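For illustration, here is a minimal Python sketch for inventorying bundled log4j jars under an install root before deciding what to remediate. The function name and the idea of scanning from SPLUNK_HOME are illustrative, not part of any official tooling.

```python
# Sketch: recursively list files named like log4j*.jar under an install
# root (e.g. SPLUNK_HOME). Purely illustrative inventory helper.
import os

def find_log4j_jars(root):
    """Return sorted paths of files whose name starts with 'log4j' and ends in '.jar'."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.startswith("log4j") and name.endswith(".jar"):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)
```

Running it against the Splunk install root would enumerate every bundled copy, so nothing is missed if remediation (or an upgrade) is applied per directory.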
Our TA is cloud-vetted on Splunkbase. The TA works on Splunk Cloud Classic, but the same TA throws the below error when accessing the setup page on the Splunk Cloud Victoria version. Please let me know how to identify the root cause and resolve the error. Thanks in advance.
Currently all our applications use version 1.6.0 of splunk-library-javalogging. This dependency pulls in the vulnerable log4j2 version, and I am trying to upgrade to the latest version, 1.11.3. I downloaded the dependency from the Splunk website and installed it with Maven. The installation succeeds, but this version is unable to recognize the appenders we have configured in log4j2.xml. Below are the errors I see when using the latest splunk-library-javalogging dependency:

2021-12-22 14:12:55,484 WrapperListener_start_runner ERROR Error processing element HttpSplunk ([AppenderSet: null]): CLASS_NOT_FOUND
2021-12-22 14:12:55,485 WrapperListener_start_runner ERROR Error processing element HttpSplunk ([AppenderSet: null]): CLASS_NOT_FOUND
2021-12-22 14:12:55,642 WrapperListener_start_runner ERROR No node named SplunkDevAppender in org.apache.logging.log4j.core.appender.AppenderSet@7ac97772
2021-12-22 14:12:55,643 WrapperListener_start_runner ERROR Null object returned for ScriptAppenderSelector in Appenders.
2021-12-22 14:12:55,649 WrapperListener_start_runner ERROR Unable to locate appender "SelectSplunkInstance" for logger config "root"

Log4j2 configuration:
=============================
<Configuration status="warn" name="splunk-cloudhub" packages="com.splunk.logging,com.mulesoft.ch.logging.appender">
  <Properties>
    <Property name="target.env">${sys:env}</Property>
    <Property name="target.app">${project.name}</Property>
    <Property name="dev.splunk.host">{{host}}</Property>
    <Property name="dev.splunk.token">{{token}}</Property>
  </Properties>
  <Appenders>
    <ScriptAppenderSelector name="SelectSplunkInstance">
      <Script language="JavaScript"><![CDATA[
        "${target.env}".search("prd") > -1 ? "SplunkPrdAppender" : "SplunkDevAppender";]]>
      </Script>
      <AppenderSet>
        <HttpSplunk name="SplunkDevAppender"
              url="https://${dev.splunk.host}"
              token="${dev.splunk.token}"
              host=""
              index="index1"
              source="${target.app}-${target.env}"
              sourcetype="application"
              messageFormat="json"
              middleware=""
              send_mode="sequential"
              batch_size_bytes="0"
              batch_size_count="0"
              batch_interval="0"
              disableCertificateValidation="true">
            <PatternLayout pattern="%m"/>
        </HttpSplunk>
        <HttpSplunk name="SplunkPrdAppender"
              url="https://${prd.splunk.host}"
              token="${prd.splunk.token}"
              host=""
              index="index2"
              source="${target.app}-${target.env}"
              sourcetype="application"
              messageFormat="json"
              middleware=""
              send_mode="sequential"
              batch_size_bytes="0"
              batch_size_count="0"
              batch_interval="0"
              disableCertificateValidation="true">
            <PatternLayout pattern="%m"/>
        </HttpSplunk>
      </AppenderSet>
    </ScriptAppenderSelector>
  </Appenders>
=============================
Could you please let us know if there are any additional steps we have to follow to get this working. Attaching our log4j2 file for your reference.
Hi Experts, We are planning to decommission on-prem Splunk Enterprise 8.0. Can anyone advise how to back up and archive the existing Splunk indexed data for future reference? Also, if we have to open this archived data in the future, how can we open it without Splunk? We have 1x SH, 1x Indexer, and 2x HF, all on 8.0.
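One common pattern for this (a sketch, not a full runbook) is to keep frozen buckets instead of letting Splunk delete them, by setting coldToFrozenDir in indexes.conf. The index name and archive path below are illustrative:

```ini
# indexes.conf sketch (illustrative index name and path):
# when buckets age out, copy their rawdata to the archive dir
# instead of deleting them.
[my_index]
coldToFrozenDir = /archive/splunk/my_index
```

One caveat on "opening it without Splunk": the archived rawdata journal is in a Splunk-specific format, so while a Splunk binary is still available it is safer to export what you may need later (for example with the `splunk cmd exporttool` utility, which can write a bucket out as CSV) before the decommission completes.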
I can see that there is a new version of the splunk-library-javalogging dependency released for the log4j2 vulnerability. Can we just override the log4j2 versions to the newer version (2.17.0) in the parent pom instead of updating the splunk-library-javalogging dependency to 1.11.*?
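For reference, this is the usual shape of such an override: pinning the log4j2 artifacts via dependencyManagement in the parent pom, which wins over the transitive versions. This is a sketch of the mechanism only; whether splunk-library-javalogging 1.6.0 actually works against log4j2 2.17.0 at runtime is a separate question that needs testing.

```xml
<!-- Sketch: pin the transitive log4j2 version from the parent pom.
     Coordinates are the standard Apache log4j2 ones. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-api</artifactId>
      <version>2.17.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.logging.log4j</groupId>
      <artifactId>log4j-core</artifactId>
      <version>2.17.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After the override, `mvn dependency:tree` can confirm which log4j2 versions actually end up on the classpath.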
Searching _internal for source=sc4s shows:

srlssydr01 syslog-ng 174 - [meta sequenceId="32595295"] Message(s) dropped while sending message to destination; driver='d_hec_fmt#0', worker_index='5', time_reopen='10', batch_size='19'

and

srlssydr01 syslog-ng 174 - [meta sequenceId="32594764"] http: handled by response_action; action='drop', url='https://http-inputs-acme.splunkcloud.com:443/services/collector/event', status_code='400', driver='d_hec_fmt#0', location='root generator dest_hec:5:5'
I want to join two source types: ST1 (has fields id, title) and ST2 (no fields, only _raw="xid https://www.example.com?q1=test1&q2=test2"). I have tried join and it works, but due to the subsearch row limit I am getting wrong results. I have also tried without join (the sourcetype="ST1" OR sourcetype="ST2" approach) and still get incorrect results.

sourcetype="ST1" (id and title are fields here):
id=1 title=one
id=2 title=two
id=3 title=three

sourcetype="ST2" _raw:
1 "GET https://www.example.com?q1=one"
2 "GET https://www.example.com?q1=test&q2=test2"
3 "GET https://www.example.com?q3=thr"

I want to join these source types and get the output below (grabbing only the URL params from ST2). Can you please help me with this?

id title params
1 one q1=one
2 two q1=test&q2=test2
3 three q3=thr
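To make the intended transformation concrete, here is a small Python sketch of the same logic: extract the id and the query-string from each ST2 raw event with a regex, then look the title up by id. The data is the sample from the question; the regex is one illustrative way to cut the params out.

```python
# Sketch of the intended join: id -> title from ST1, plus the URL
# query string regexed out of each ST2 _raw event.
import re

st1 = {"1": "one", "2": "two", "3": "three"}  # id -> title

st2 = [
    '1 "GET https://www.example.com?q1=one"',
    '2 "GET https://www.example.com?q1=test&q2=test2"',
    '3 "GET https://www.example.com?q3=thr"',
]

rows = []
for raw in st2:
    # first token is the id; params are everything after '?' inside the quotes
    m = re.match(r'(?P<id>\S+)\s+".*?\?(?P<params>[^"]+)"', raw)
    if m:
        rows.append((m.group("id"), st1.get(m.group("id")), m.group("params")))
```

In SPL the equivalent idea is to `rex` the id and params out of the ST2 events first, so both source types share an id field, and then aggregate with `stats values(...) by id` instead of `join`, which avoids the subsearch row limit.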
I have a base search below, but I need to use a time_window that is in a lookup table, since various logs come in at different times. I'm trying to create alerts for indexes that are not reporting, but I don't want false positives for indexes that have an expected time lag. splunk_security_indexes.csv is used to get a specific subset of indexes.

| tstats max(_time) as _time where index=* by index sourcetype
| lookup splunk_security_indexes.csv index as index OUTPUT index time_window
| eval time_window="-7d@d"
| where _time < relative_time(now(),'time_window')
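One thing worth noting: as written, the `eval time_window="-7d@d"` overwrites whatever the lookup returned, so every index gets the same window. A Python sketch of the logic the search appears to be aiming for (index names and windows are illustrative):

```python
# Sketch of the intended per-index staleness check: an index alerts
# only when its newest event is older than that index's own window,
# falling back to a default for indexes missing from the lookup.
import time

def stale_indexes(last_seen, windows, default_secs=7 * 86400, now=None):
    """last_seen: index -> epoch of newest event; windows: index -> max expected lag in seconds."""
    now = time.time() if now is None else now
    return sorted(
        idx for idx, ts in last_seen.items()
        if ts < now - windows.get(idx, default_secs)
    )
```

In SPL terms, that suggests dropping the hard-coded eval (or keeping it only as a fallback, e.g. filling time_window when the lookup returned nothing) so that `relative_time(now(), 'time_window')` actually uses the per-index value.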
I have started getting event processing errors in the MC and messages on the ES main page. I looked for skipped and delayed searches but did not find the root cause. Please advise. Thanks a lot, and happy holidays.
Is there a way to remove or relocate the floating "Splunk Product Guidance" button that appears on the lower right of search results? It has a tendency to block useful information and it's fairly annoying.      
I have some data with a field called "priority", which has a value from P1 to P5. This search query:

... | stats count as Quantity by priority

produces a table that looks something like this:

priority Quantity
P2 1
P3 1
P4 6
P5 3

As you can see, there are no data entries with a priority of "P1". However, I would like to include that as a row in the table and show a quantity of "0". Ideally I would want to include all 5 priority levels for any dataset, even when they are empty. Can anyone help and let me know how I can do this? Is there a way to specify which values to count?
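The underlying operation is "count, then zero-fill against a fixed category list". A minimal Python sketch of that logic (the P1..P5 list comes from the question):

```python
# Sketch: count occurrences per priority, then emit a row for every
# category in the fixed list, with 0 for categories that never occur.
from collections import Counter

def count_with_zero_fill(values, categories=("P1", "P2", "P3", "P4", "P5")):
    counts = Counter(values)
    return [(c, counts.get(c, 0)) for c in categories]
```

In SPL, the same effect is usually achieved by appending the full category list to the results (for example from a lookup or a `makeresults`-style subsearch with count 0) and then summing, so empty categories survive the aggregation.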
Hello Splunk Community! My team and I have been stuck trying to get the Splunk Add-on for JMX working. We've installed the add-on on a heavy forwarder and are trying to connect to the local JMX server URL. After doing so, we get the following in our jmx.log:

2021-12-22 17:03:19,883 - com.splunk.modinput.ModularInput -5318 [Thread-2] INFO [] - Failed connection with service:jmx:rmi://hostname/jndi/rmi://hostname:8578/hostname/8577/jmxrmi, connecting with service:jmx:rmi://hostname/jndi/JMXConnector .
2021-12-22 17:03:19,884 - com.splunk.modinput.ModularInput -5319 [Thread-2] ERROR [] - Exception@checkConnector, e= java.io.IOException: Failed to retrieve RMIServer stub: javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:369) ~[?:1.8.0_275]
    at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:270) ~[?:1.8.0_275]
    at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:229) ~[?:1.8.0_275]
    at com.splunk.jmx.ServerTask.connect(Unknown Source) ~[jmxmodinput.jar:?]
    at com.splunk.jmx.ServerTask.checkConnector(Unknown Source) [jmxmodinput.jar:?]
    at com.splunk.jmx.Scheduler.run(Unknown Source) [jmxmodinput.jar:?]
Caused by: javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:662) ~[?:1.8.0_275]
    at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313) ~[?:1.8.0_275]
    at javax.naming.InitialContext.getURLOrDefaultInitCtx(InitialContext.java:350) ~[?:1.8.0_275]
    at javax.naming.InitialContext.lookup(InitialContext.java:417) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.findRMIServerJNDI(RMIConnector.java:1955) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.findRMIServer(RMIConnector.java:1922) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:287) ~[?:1.8.0_275]
    ... 5 more

Our jmx_servers.conf file is configured as follows:

[default]

[jms4]
account_name = username
destinationapp = Splunk_TA_jmx
jmx_url = service:jmx:rmi://hostname/jndi/rmi://hostname:8578/hostname/8577/jmxrmi
protocol = url
account_password = password

We confirmed that the URL works, because we were able to reach it with the following code:

class Scratch {
    public static void main(String[] args) throws Exception {
        final JMXServiceURL jmxUrl = new JMXServiceURL("service:jmx:rmi://hostname/jndi/rmi://hostname:9878/hostname/9877/jmxrmi");
        final Map<String, String[]> props = new HashMap<>();
        props.put("jmx.remote.credentials", new String[]{"username", "password"});
        final JMXConnector jmxConnector = JMXConnectorFactory.connect(jmxUrl, props);
        final MBeanServerConnection mbsc = jmxConnector.getMBeanServerConnection();
        Arrays.asList(mbsc.getDomains()).forEach(System.out::println);
    }
}

Any assistance is appreciated.
Hi guys, hope you can help me out. Consider the following data in Splunk:

{
  attrs: {
    account: 85859303
    version: 1.3848
  }
  line: {
    application_version: 1.94949303
    message: Event with key 84js9393: {"entity": {"customer_id": "K123456", "order_id": "Sjd49493-93nd-9494-jdjd-mskaldjfhfhh", "collection_id": "djdis939-9398-9488-j939-md839md93000", "issuer_id": null}}
    thread: springfield
    timestamp: 2021-12-21 19:30:52,123
  }
}

I would like to extract the order_id and use it in my search: order_id=Sjd49493-93nd-9494-jdjd-mskaldjfhfhh

Hope someone can help or point me in the right direction. Cheers! Matthew
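Since the JSON fragment is embedded inside the "message" string rather than being a standalone field, a regex is the simplest extraction. A minimal Python sketch of that regex (the sample message is from the question):

```python
# Sketch: pull order_id out of the embedded JSON fragment with a regex,
# since the surrounding "message" value is not clean standalone JSON.
import re

def extract_order_id(message):
    m = re.search(r'"order_id":\s*"([^"]+)"', message)
    return m.group(1) if m else None
```

The SPL equivalent of this regex would be a `rex` on _raw capturing into an order_id field, after which `| search order_id=Sjd49493-93nd-9494-jdjd-mskaldjfhfhh` works as desired.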
In Java, I am trying to call a curl command that runs a Splunk search to get the contents of a lookup file. I've used https://docs.splunk.com/Documentation/Splunk/8.0.3/RESTTUT/RESTsearches as my starting point. Too bad they don't show how to use Java like they do for curl and Python.

>>>>> The curl command works fine outside of Java:

curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=" | inputlookup hosts-info"

>>>>> Here is the Java program:

import java.io.IOException;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.stream.Collectors;

public class tstcurl {
    public static void main(String[] args) {
        String command = "curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=\" | inputlookup hosts-info\"";
        try {
            System.out.println("Creating curl command: [" + command + "]");
            Process process = Runtime.getRuntime().exec(command);
            String result = new BufferedReader(new InputStreamReader(process.getInputStream())).lines().collect(Collectors.joining("\n"));
            System.out.println(result);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

>>>>> Output of 'java -jar tst-curl.jar':

Creating curl command: [curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=" | inputlookup hosts-info"]
<?xml version="1.0" encoding="UTF-8"?>
<response>
<messages>
<msg type="ERROR">Error in 'SearchParser': Missing a search command before '"'. Error at position '0' of search query '"'.</msg>
</messages>
</response>

>>>>> Help please. I've done the following:
- Looked in /opt/splunk/var/log/splunkd.log for any other messages related to this. I've turned on debug and no other messages related to this issue are in splunkd.log.
- Searched Google for anybody else having this issue.
- Looked into getting the Splunk SDK, but that seems like extra effort just to read the lookup file.
If anybody has made this work, please share your solution.
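A likely culprit here (offered as a hypothesis, not a confirmed diagnosis): `Runtime.exec(String)` tokenizes the command on whitespace without understanding shell quoting, so the quoted `search=" | inputlookup hosts-info"` argument is split into pieces before curl ever sees it, which would produce exactly this SearchParser error. The Python sketch below demonstrates the difference between naive whitespace splitting and shell-style tokenization on a shortened version of the command:

```python
# Sketch of the quoting problem: whitespace splitting (roughly what
# Runtime.exec(String) does) breaks the quoted search argument apart,
# while shell-style tokenization keeps it as one -d value.
import shlex

command = 'curl -k https://1.2.3.4:8089/services/search/jobs/export -d search=" | inputlookup hosts-info"'

naive = command.split()        # how Runtime.exec(String) tokenizes
proper = shlex.split(command)  # shell-style tokenization
```

The Java-side fix suggested by this is to pass the command as an argument array, e.g. `Runtime.getRuntime().exec(new String[]{"curl", "-u", "admin:password", "-k", url, "-d", "output_mode=csv", "-d", "search= | inputlookup hosts-info"})` or the equivalent `ProcessBuilder`, so each argument stays intact without shell quotes.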
I am running the machine agent on Linux, installed using rpm. How can I configure the MySQL extension? The extension available on Git is not well documented.
Hi there, I'm trying to do a search that looks at the latest status of a given actionid every day, to build a kind of day-by-day backlog (even if no action was done on a given day). To be clearer, let's say we have the events:

12/01/2021 actionid=actionid1 status=start
12/02/2021 actionid=actionid2 status=start
12/06/2021 actionid=actionid1 status=sent
12/08/2021 actionid=actionid2 status=sent
12/09/2021 actionid=actionid1 status=done
12/10/2021 actionid=actionid2 status=done

The status of a given action doesn't change on its own, and I have no event until it changes. The result that I want is something like this (representing the backlog of actionids by status, day by day):

           start sent done
12/01/2021   1    0    0
12/02/2021   2    0    0
12/03/2021   2    0    0
12/04/2021   2    0    0
12/05/2021   2    0    0
12/06/2021   1    1    0
12/07/2021   1    1    0
12/08/2021   0    2    0
12/09/2021   0    1    1
12/10/2021   0    0    2

I hope my request is clear enough. Any help would be great. Best regards, Francois
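To pin down the intended semantics, here is a Python sketch that reproduces the expected table: walk the days in order, carry each action's last known status forward, and count statuses per day. The events are the sample from the question (with shortened action ids).

```python
# Sketch of the day-by-day backlog: carry each action's last known
# status forward and count statuses per day.
from datetime import date, timedelta

events = {
    ("a1", date(2021, 12, 1)): "start",
    ("a2", date(2021, 12, 2)): "start",
    ("a1", date(2021, 12, 6)): "sent",
    ("a2", date(2021, 12, 8)): "sent",
    ("a1", date(2021, 12, 9)): "done",
    ("a2", date(2021, 12, 10)): "done",
}

def backlog(events, start, end):
    current, out = {}, {}  # current: actionid -> latest status seen so far
    day = start
    while day <= end:
        for (aid, d), status in events.items():
            if d == day:
                current[aid] = status
        counts = {"start": 0, "sent": 0, "done": 0}
        for status in current.values():
            counts[status] += 1
        out[day] = counts
        day += timedelta(days=1)
    return out
```

In SPL, this carry-forward shape is typically built by making the time axis continuous (e.g. one row per action per day) and then filling each action's status down with `filldown` or `streamstats latest(...)` before counting.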
Has Splunk for Blue Coat ProxySG v3.0.7 been replaced by the Splunk Add-on for Symantec Blue Coat ProxySG? We currently have the former installed and must replace it, since it has been archived and is no longer supported.
Hello, I have some strange behavior in one of my panels, which contains a clustered map. Practically, I have two input panels: one that permits the selection of a country, and a second that permits the selection of a city. Depending on the city selected, I have two tokens that save the corresponding lat and lng of the city, and then in the XML source I pass these two tokens to center the map. The result is blank, but if, while editing and before saving, I go to the source tab, the visualization works. Then if I refresh the page, the map doesn't appear any more. Here is the code that I'm using:

<search>
  <query>| inputlookup Worldcities.csv
    | search city = $Area$
    | dedup city</query>
  <done>
    <set token="lat">$result.lat$</set>
    <set token="lng">$result.lng$</set>
    <set token="zoom">7</set>
  </done>
</search>
<fieldset submitButton="false">
  <input type="dropdown" token="country">
    <label>Country</label>
    <fieldForLabel>name</fieldForLabel>
    <fieldForValue>name</fieldForValue>
    <search>
      <query>| inputlookup Country.csv | search name= Spain OR name = Italy</query>
      <earliest>-15m</earliest>
      <latest>now</latest>
    </search>
    <default>Italy</default>
  </input>
  <input type="dropdown" token="Area" searchWhenChanged="true">
    <label>Area</label>
    <fieldForLabel>city</fieldForLabel>
    <fieldForValue>city</fieldForValue>
    <search>
      <query>| inputlookup Worldcities.csv | search country = $country$ | dedup city</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
    <default>Milan</default>
  </input>
</fieldset>
<map>
  <search>
    <query>(query that gives me the values needed) | geostats latfield=Latitude longfield=Longitude globallimit=0 maxzoomlevel=13 translatetoxy=false count by test</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
    <refresh>30s</refresh>
    <refreshType>delay</refreshType>
  </search>
  <option name="drilldown">none</option>
  <option name="mapping.map.center">($lat$,$lng$)</option>
  <option name="mapping.map.zoom">7</option>
  <option name="mapping.type">marker</option>
  <option name="refresh.display">progressbar</option>
</map>

Is something wrong with the code, or is it just a bug? Thank you
Hello, I have 2 lookups: L0011, which contains all (known) products with the Log4Shell vulnerability, and L0012, with all the products and assets that I have in house. I would like to join these 2 lookups to end up with all the vulnerable products that I have, and the assets for each product. But so far the join is not working. I have used the join and lookup commands, and I have also added wildcards in the lookup definition, but it's not working either (the results are not exhaustive; I get very few matches). The main issue is that the names of the products don't match identically (even with wildcards). Do you have any idea how I could do the matching between my two lookups? Do not hesitate to ask if I need to clarify more. Thanks a lot in advance.
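When exact and wildcard joins miss slightly different spellings, one option is fuzzy string matching outside of SPL (for example in a scripted lookup or a preprocessing step). A minimal Python sketch using the standard library's difflib; the 0.8 threshold is an arbitrary starting point to tune, and the product names are illustrative:

```python
# Sketch: fuzzy-match product names between two lookups with difflib,
# pairing names whose similarity ratio clears a tunable threshold.
from difflib import SequenceMatcher

def fuzzy_matches(vuln_products, owned_products, threshold=0.8):
    pairs = []
    for v in vuln_products:
        for o in owned_products:
            score = SequenceMatcher(None, v.lower(), o.lower()).ratio()
            if score >= threshold:
                pairs.append((v, o, round(score, 2)))
    return pairs
```

The output of such a pass could be written back to a third lookup mapping L0011 names to L0012 names, which then allows a normal exact join; manual review of the matched pairs is advisable, since fuzzy matching produces both false positives and false negatives.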
Hi Splunkers, We are using Splunk Enterprise 8.2.2.1 and we frequently see the IOWait health indicator go yellow or red. Although it returns to green by itself after some time, it does not feel good to see Splunk health yellow or red, so I am looking for a permanent solution. Please find below the message I am getting: "Sum of 3 highest per-cpu iowaits reached yellow threshold of 7" Unhealthy instances: indexcr02.xyx.abc... Thanks.