All Topics


I have some data with a field called "priority", whose value ranges from P1 to P5. This search query:

... | stats count as Quantity by priority

produces a table that looks something like this:

priority   Quantity
P2         1
P3         1
P4         6
P5         3

As you can see, there are no data entries with a priority of "P1". However, I would like to include it as a row in the table and show a quantity of "0". Ideally I would want to include all five priority levels for any dataset, even when they are empty. Can anyone help and let me know how I can do this? Is there a way to specify which values to count?
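One common approach (a minimal sketch; the P1-P5 list is taken from the question, so adjust it to your data) is to append a zero-count row for every expected value and then merge:

... | stats count as Quantity by priority
| append
    [| makeresults
     | eval priority=split("P1,P2,P3,P4,P5", ",")
     | mvexpand priority
     | eval Quantity=0
     | fields priority Quantity]
| stats sum(Quantity) as Quantity by priority

The appended rows contribute 0 to the sum, so priorities with real events keep their counts and missing ones show as 0.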
Hello Splunk Community! My team and I have been stuck trying to get the Splunk Add-on for JMX working for us. We've installed the add-on on a heavy forwarder and are trying to connect to the local JMX server URL. After doing so, we get the following in our jmx.log:

2021-12-22 17:03:19,883 - com.splunk.modinput.ModularInput -5318 [Thread-2] INFO [] - Failed connection with service:jmx:rmi://hostname/jndi/rmi://hostname:8578/hostname/8577/jmxrmi, connecting with service:jmx:rmi://hostname/jndi/JMXConnector .
2021-12-22 17:03:19,884 - com.splunk.modinput.ModularInput -5319 [Thread-2] ERROR [] - Exception@checkConnector, e= java.io.IOException: Failed to retrieve RMIServer stub: javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:369) ~[?:1.8.0_275]
    at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:270) ~[?:1.8.0_275]
    at javax.management.remote.JMXConnectorFactory.connect(JMXConnectorFactory.java:229) ~[?:1.8.0_275]
    at com.splunk.jmx.ServerTask.connect(Unknown Source) ~[jmxmodinput.jar:?]
    at com.splunk.jmx.ServerTask.checkConnector(Unknown Source) [jmxmodinput.jar:?]
    at com.splunk.jmx.Scheduler.run(Unknown Source) [jmxmodinput.jar:?]
Caused by: javax.naming.NoInitialContextException: Need to specify class name in environment or system property, or as an applet parameter, or in an application resource file: java.naming.factory.initial
    at javax.naming.spi.NamingManager.getInitialContext(NamingManager.java:662) ~[?:1.8.0_275]
    at javax.naming.InitialContext.getDefaultInitCtx(InitialContext.java:313) ~[?:1.8.0_275]
    at javax.naming.InitialContext.getURLOrDefaultInitCtx(InitialContext.java:350) ~[?:1.8.0_275]
    at javax.naming.InitialContext.lookup(InitialContext.java:417) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.findRMIServerJNDI(RMIConnector.java:1955) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.findRMIServer(RMIConnector.java:1922) ~[?:1.8.0_275]
    at javax.management.remote.rmi.RMIConnector.connect(RMIConnector.java:287) ~[?:1.8.0_275]
    ... 5 more

Our jmx_servers.conf file is configured as follows:

[default]

[jms4]
account_name = username
destinationapp = Splunk_TA_jmx
jmx_url = service:jmx:rmi://hostname/jndi/rmi://hostname:8578/hostname/8577/jmxrmi
protocol = url
account_password = password

We confirmed that the URL works, because we were able to reach it with the following code (note that it uses ports 9878/9877 rather than the 8578/8577 in the conf):

class Scratch {
    public static void main(String[] args) throws Exception {
        final JMXServiceURL jmxUrl = new JMXServiceURL("service:jmx:rmi://hostname/jndi/rmi://hostname:9878/hostname/9877/jmxrmi");
        final Map<String, String[]> props = new HashMap<>();
        props.put("jmx.remote.credentials", new String[]{"username", "password"});
        final JMXConnector jmxConnector = JMXConnectorFactory.connect(jmxUrl, props);
        final MBeanServerConnection mbsc = jmxConnector.getMBeanServerConnection();
        Arrays.asList(mbsc.getDomains()).forEach(System.out::println);
    }
}

Any assistance is appreciated.
Hi guys, hope you can help me out. Consider the following data in Splunk:

{
   attrs: {
     account: 85859303
     version: 1.3848
   }
   line: {
     application_version: 1.94949303
     message: Event with key 84js9393: {"entity": {"customer_id": "K123456", "order_id": "Sjd49493-93nd-9494-jdjd-mskaldjfhfhh", "collection_id": "djdis939-9398-9488-j939-md839md93000", "issuer_id": null}}
     thread: springfield
     timestamp: 2021-12-21 19:30:52,123
   }
}

I would like to extract the order_id and use it in my search:

order_id=Sjd49493-93nd-9494-jdjd-mskaldjfhfhh

Hope someone can help or point me in the right direction. Cheers!
Matthew
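A sketch of one way to do this with rex (assuming the usual JSON extraction gives you a line.message field; adjust the field name to whatever your events actually show):

... | rex field=line.message "\"order_id\":\s*\"(?<order_id>[^\"]+)\""
    | search order_id="Sjd49493-93nd-9494-jdjd-mskaldjfhfhh"

This captures the value between the quotes after "order_id": into a field you can then filter on; running rex against _raw instead would work the same way.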
In Java, I am trying to call a curl command that runs a Splunk search to get the contents of a lookup file. I've used https://docs.splunk.com/Documentation/Splunk/8.0.3/RESTTUT/RESTsearches as my starting point. Too bad they don't show how to use Java like they do for curl and Python.

>>>>> The curl command works fine outside of Java:

curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=" | inputlookup hosts-info"

>>>>> Here is the Java program:

import java.io.IOException;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.stream.Collectors;

public class tstcurl {
    public static void main(String[] args) {
        // Note: exec(String) tokenizes the command on whitespace, so the quoted
        // search argument is not passed to curl as a single argument.
        String command = "curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=\" | inputlookup hosts-info\"";
        try {
            System.out.println("Creating curl command: [" + command + "]");
            Process process = Runtime.getRuntime().exec(command);
            String result = new BufferedReader(new InputStreamReader(process.getInputStream()))
                    .lines().collect(Collectors.joining("\n"));
            System.out.println(result);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

>>>>> Output of 'java -jar tst-curl.jar':

Creating curl command: [curl -u admin:password -k https://1.2.3.4:8089/services/search/jobs/export -d output_mode=csv -d search=" | inputlookup hosts-info"]
<?xml version="1.0" encoding="UTF-8"?>
<response>
  <messages>
    <msg type="ERROR">Error in 'SearchParser': Missing a search command before '"'. Error at position '0' of search query '"'.</msg>
  </messages>
</response>

>>>>> Help please

I've done the following:
- Looked in /opt/splunk/var/log/splunkd.log for any other messages related to this. I've turned on debug, and no other messages related to this issue are in splunkd.log.
- Searched Google for anybody else having this issue.
- Looked into getting the Splunk SDK, but it seems like extra effort just to read the lookup file.

If anybody has made this work, please share your solution.
I am running the machine agent on Linux, installed using rpm. How can I configure the MySQL extension? The extension available on GitHub is not well documented.
Hi there, I'm trying to write a search that looks at the latest status of a given actionid every day, to build a kind of day-by-day backlog (even if no action was done that day). To be clearer, let's say we have these events:

12/01/2021 actionid=actionid1 status=start
12/02/2021 actionid=actionid2 status=start
12/06/2021 actionid=actionid1 status=sent
12/08/2021 actionid=actionid2 status=sent
12/09/2021 actionid=actionid1 status=done
12/10/2021 actionid=actionid2 status=done

The status of a given action stays the same until it changes, but I have no events in between. The result I want is something like this (representing the backlog of actionids by status, day by day):

             start   sent   done
12/01/2021   1       0      0
12/02/2021   2       0      0
12/03/2021   2       0      0
12/04/2021   2       0      0
12/05/2021   2       0      0
12/06/2021   1       1      0
12/07/2021   1       1      0
12/08/2021   0       2      0
12/09/2021   0       1      1
12/10/2021   0       0      2

I hope my request is clear enough. Any help would be great.

Best regards,
Francois
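A sketch of one approach (untested, and assuming the field values are exactly as shown): carry each actionid's latest status forward day by day with filldown, then count per day.

... | timechart span=1d limit=0 latest(status) by actionid
| filldown
| untable _time actionid status
| chart count over _time by status

timechart produces one column per actionid holding its latest status each day (null on quiet days), filldown propagates the last known status, untable turns the columns back into rows, and the final chart counts actionids per status per day.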
Has Splunk for Blue Coat ProxySG v3.0.7 been replaced by the Splunk Add-on for Symantec Blue Coat ProxySG? We currently have the former installed and must replace it, because it has been archived and is no longer supported.
Hello, I have some strange behavior in one of my panels, which contains a clustered map. Practically, I have two input panels: one that allows selecting a country, and a second that allows selecting a city. Depending on the city selected, two tokens save the corresponding lat and lng of the city, and in the XML source I pass these two tokens to center the map. Practically, the result is blank, but if, during editing and before saving, I go to the source tab, the visualization works. Then, if I refresh the page, the map doesn't appear anymore. Here is the code that I'm using:

<search>
  <query>| inputlookup Worldcities.csv
    | search city = $Area$
    | dedup city</query>
  <done>
    <set token="lat">$result.lat$</set>
    <set token="lng">$result.lng$</set>
    <set token="zoom">7</set>
  </done>
</search>
<fieldset submitButton="false">
  <input type="dropdown" token="country">
    <label>Country</label>
    <fieldForLabel>name</fieldForLabel>
    <fieldForValue>name</fieldForValue>
    <search>
      <query>| inputlookup Country.csv | search name= Spain OR name = Italy</query>
      <earliest>-15m</earliest>
      <latest>now</latest>
    </search>
    <default>Italy</default>
  </input>
  <input type="dropdown" token="Area" searchWhenChanged="true">
    <label>Area</label>
    <fieldForLabel>city</fieldForLabel>
    <fieldForValue>city</fieldForValue>
    <search>
      <query>| inputlookup Worldcities.csv | search country = $country$ | dedup city</query>
      <earliest>-24h@h</earliest>
      <latest>now</latest>
    </search>
    <default>Milan</default>
  </input>
</fieldset>
<map>
  <search>
    <query>(query that gives me the value needed) | geostats latfield=Latitude longfield=Longitude globallimit=0 maxzoomlevel=13 translatetoxy=false count by test</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
    <refresh>30s</refresh>
    <refreshType>delay</refreshType>
  </search>
  <option name="drilldown">none</option>
  <option name="mapping.map.center">($lat$,$lng$)</option>
  <option name="mapping.map.zoom">7</option>
  <option name="mapping.type">marker</option>
  <option name="refresh.display">progressbar</option>
</map>

Is something wrong with the code, or is it just a bug? Thank you.
Hello, I have 2 lookups: L0011, which contains all (known) products with the Log4Shell vulnerability, and L0012, with all the products and assets that I have in house. I would like to join these 2 lookups to end up with all the vulnerable products that I have, and the assets for each product. But so far the join is not working. I have used the join and lookup commands, and I have also added wildcards in the lookup definition, but neither works (the results are not exhaustive; I get very few matches). The main issue is that the product names don't match exactly (even with wildcards). Do you guys have any idea how I could match my two lookups? Do not hesitate to ask if I need to clarify further. Thanks a lot in advance.
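Exact joins miss rows whenever the names differ in case, whitespace, or extra words, so a sketch of one possible first step is to normalize a match key on both sides before joining (the field names product_name and asset are assumptions; untested):

| inputlookup L0012
| eval match_key=lower(trim(product_name))
| join type=inner match_key
    [| inputlookup L0011
     | eval match_key=lower(trim(product_name))]
| table product_name asset

If the names still differ after normalization (vendor prefixes, version suffixes), you would need fuzzier keys, e.g. stripping version numbers and punctuation with replace() before comparing.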
Hi Splunkers, we are using Splunk Enterprise 8.2.2.1 and we frequently see the IOWait health indicator go yellow or red. Although it returns to green by itself after some time, it does not feel good to see Splunk's health yellow or red, so I am looking for a permanent solution. This is the message I am getting:

"Sum of 3 highest per-cpu iowaits reached yellow threshold of 7"
Unhealthy instances: indexcr02.xyx.abc...

Thanks.
Dear Community,

Given:
- events, each with a start_time and an end_time
- a time range [BEGIN, END]

I want to output the following statistic: for each time t in [BEGIN, END], at 5-minute intervals, count how many events satisfy start_time < t < end_time.

I am looking at concurrency and timechart, but can't wrap my head around them. Any help would be very appreciated!
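The concurrency command covers most of this. A sketch, assuming start_time and end_time are epoch seconds (convert them with strptime() first if they are strings; untested):

... | eval duration = end_time - start_time
| eval _time = start_time
| concurrency duration=duration output=concurrent
| timechart span=5m max(concurrent)

concurrency computes, for each event, how many events overlap its start; timechart then reduces that to one value per 5-minute bucket. concurrency is sensitive to event ordering, so you may need a | sort 0 _time (or -_time) before it. Note this samples concurrency at event start times rather than at every instant, which is usually close enough.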
A Windows server has a universal forwarder installed and we receive its data, but sometimes the forwarder shuts down. Looking at the internal logs, there seems to be an error in perfmon.exe. What information should I look at, and can you recommend any related data?
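As a starting point, a sketch of a search over the forwarder's internal logs around the shutdowns (the host value is a placeholder):

index=_internal host=<forwarder_hostname> source=*splunkd.log* (log_level=ERROR OR log_level=WARN) perfmon

The Windows Application event log on the server is also worth checking, since process crashes are usually recorded there.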
Hi @gcusello, I am curious to know why I am able to see the HTTP Event Collector under Data Inputs on my indexer, while there is no HTTP Event Collector on the search head.

Indexer: (screenshot)

Search Head: (screenshot)

Regards,
Rahul Gupta
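To compare what each instance actually has configured, a sketch using the REST endpoint for HEC inputs (run from the search head, assuming the indexer is attached as a search peer):

| rest /services/data/inputs/http
| table splunk_server title disabled

This lists the HEC tokens per instance, which can help confirm whether the difference is in the configuration or just in what the UI shows.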
This search result will always return 3 rows, and I want to display all of them, but in trellis:

- The first row is the memory utilization for CIC-1.
- The second row is the memory utilization for CIC-2.
- The third row is the memory utilization for CIC-3.

How can I get trellis to split the display based on rows? Do I need to add a new column "Name" and insert CIC-1, CIC-2, CIC-3 into the respective rows?
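Trellis splits on a field value or an aggregation, so adding a label column is a reasonable approach. A sketch that labels the three rows in order (this assumes the rows always come back in CIC-1/2/3 order, and mem_used is a placeholder for your value field):

... | streamstats count as row
| eval Name = "CIC-" . row
| fields Name mem_used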
Our application's log entries are in JSON, and I need to search for certain strings found in the field called message. I have no problem finding them with a regular search:

... AND (message="Application is closing." OR message="successfully started")

However, when I try to define a transaction with the seemingly same search criteria:

... | transaction source startsWith="message=\"Application is closing.\"" endsWith="message=\"successfully started\""

I get zero results... Am I escaping the quotes incorrectly or making some other syntax error?
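A sketch of an alternative that sidesteps the quote escaping entirely: startsWith and endsWith also accept eval() expressions, so you can compare the field directly (untested):

... | transaction source
      startsWith=eval(message=="Application is closing.")
      endsWith=eval(message=="successfully started")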
Hello. The server cannot be accessed directly. Can I set up a deployment client on a remote forwarder? Port 8089 is open.
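If the question is whether the forwarder can still phone home: deployment clients initiate the connection to the deployment server over the management port, so an open 8089 from the forwarder's side should be enough. A minimal sketch of deploymentclient.conf on the forwarder (the hostname is a placeholder):

# $SPLUNK_HOME/etc/system/local/deploymentclient.conf
[deployment-client]

[target-broker:deploymentServer]
targetUri = deployserver.example.com:8089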
Need help with a solution for errors saying "unrecoverable in the server.....Python 3.x...." when downloading 60,000-100,000 search results in ES, please. Thanks in advance.
Is there a need to keep the _internal index logs past a certain time period? My _internaldb is pretty large at 218 GB total (db - 31, cold - 112, frozen - 75). You can see my current settings below. We have about 140 forwarders reporting to this indexer. Should I just remove the path to frozen and let the buckets get deleted? Does anyone ever thaw internal logs? If so, what for?

[_internal]
homePath = $SPLUNK_DB\_internaldb\db
coldPath = $SPLUNK_DB\_internaldb\colddb
thawedPath = $SPLUNK_DB\_internaldb\thaweddb
coldToFrozenDir = $SPLUNK_DB\_internaldb\frozendb
frozenTimePeriodInSecs = 5184000
tstatsHomePath = volume:_splunk_summaries\_internaldb\datamodel_summary
maxConcurrentOptimizes = 6
maxWarmDBCount = 60
maxHotSpanSecs = 86400
maxHotBuckets = 8
maxDataSize = auto
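For reference: when neither coldToFrozenDir nor coldToFrozenScript is set, buckets older than frozenTimePeriodInSecs are deleted at freeze time instead of archived. A sketch of the stanza with archiving dropped (assuming the archives are not needed):

[_internal]
homePath = $SPLUNK_DB\_internaldb\db
coldPath = $SPLUNK_DB\_internaldb\colddb
thawedPath = $SPLUNK_DB\_internaldb\thaweddb
# coldToFrozenDir removed: expired buckets are now deleted rather than archived
frozenTimePeriodInSecs = 5184000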
I am looking for a good alert manager add-on for ES, to ingest MS Azure AD alert data into ES. There are two of them on Splunkbase, the Azure Sentinel add-on for Splunk and the Alert Manager Add-on, but it says 0 installs for both. Has anyone here used one that is good for my needs? Thank you in advance.
I have two tables.

Table 1:

EmailX    Doc    DateChecked  Name
a@a.com   Doc 1  1/1/2021     a
a@a.com   Doc 2  1/15/2021    a
a@a.com   Doc 3  1/30/2021    b

Table 2:

EmailY    DateLogin
a@a.com   12/10/2022
a@a.com   11/10/2022
a@a.com   1/15/2021
a@a.com   1/25/2021

I want to join them on EmailX = EmailY, and then, for each row, get the most recent DateLogin that is before its DateChecked. I am hoping not to have to use join, as my second table has more than 50k records. The results should look like this:

EmailX    Doc    DateChecked  Name  RecentDateLogin
a@a.com   Doc 1  1/1/2021     a     -
a@a.com   Doc 2  1/15/2021    a     1/15/2021
a@a.com   Doc 3  1/30/2021    b     1/25/2021

If I had to write SQL, it would be something like the following (untested, but you get the idea):

SELECT t1.EmailX, t1.Doc, t1.DateChecked, t1.Name, MAX(t2.DateLogin) AS RecentDateLogin
FROM table1 AS t1
LEFT JOIN table2 AS t2
  ON t1.EmailX = t2.EmailY AND t1.DateChecked > t2.DateLogin
GROUP BY t1.EmailX, t1.Doc, t1.DateChecked, t1.Name

Thanks.
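A join-free sketch using event ordering (the lookup names and the %m/%d/%Y date format are assumptions; untested): interleave logins and checks per email sorted by time, with logins first on ties, and carry the last seen login forward.

| inputlookup logins.csv
| rename EmailY as Email
| eval type="login", _time=strptime(DateLogin, "%m/%d/%Y")
| append
    [| inputlookup docs.csv
     | rename EmailX as Email
     | eval type="check", _time=strptime(DateChecked, "%m/%d/%Y")]
| eval LoginDate=if(type="login", DateLogin, null())
| sort 0 Email, _time, -type
| streamstats last(LoginDate) as RecentDateLogin by Email
| where type="check"
| table Email Doc DateChecked Name RecentDateLogin

Sorting -type puts "login" before "check" on equal timestamps, which is what makes the 1/15/2021 login count for the 1/15/2021 check; rows with no earlier login come back with an empty RecentDateLogin.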