All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi All,

I am trying out Splunk Enterprise for MySQL database monitoring, using a trial version of Splunk. I have installed the Splunk DB Connect application along with the MySQL driver mysql-connector-j-8.0.31.jar, and I can see that the driver is installed. I have also configured a MySQL DB connection. But when I execute Health Check from the monitoring console for the Splunk DB Connect app, I get the errors "One or more defined connections require the corresponding JDBC driver." and "Driver version is invalid, connection: MySQLDB, connection_type: mysql." The MySQL DB version is 8.0.23, and it is an AWS Aurora RDS instance. I previously tried the "Splunk DBX Add-on for MySQL JDBC" to install the driver and had the same issue; the driver bundled with that add-on was older than the one our application uses to connect to the database, so I installed the newer driver directly. Also, when trying to execute a query I get the error "Error in 'dbxquery' command: External search command exited unexpectedly with non-zero error code 1."

Please help to resolve this error.

Thanks,
Arun
The splunkd service tries to start after a Windows Server reboot but stops suddenly. Microsoft has confirmed there is no issue with the Service Control Manager. This Splunk service stoppage is found only on servers after the patching reboot.
I'm trying to parse saved searches that contain a bunch of eval statements with this sort of logic:

| eval var=case( a,b, c,d, e,f)
| eval var2=case( match(x, "z|y|z"), 1, match(x, "a|b|c"), 2)
| eval...

I have the search string from the REST API response and am trying to extract all the LHS=RHS statements with:

| rex field=search max_match=0 "(?s)\|\s*eval (?<field>\w+)=(?<data>.*?)"

That captures all the fields in <field> nicely, i.e. var and var2 (in this example, because of the non-greedy ?), but I am struggling with capturing <data>: the data is multi-line, and if I don't use the non-greedy ? then I only get ONE field returned and data is the remainder of the search string, i.e. greedy (.*). I can't use [^|]* (effectively much the same), as the eval statements may contain pipes (|), so I want to extract up to the next \n|\s?eval. I've been banging around with regex101 but just can't figure out the syntax to get this to work. Any ideas?
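One way to terminate a non-greedy capture is a lookahead for the next "| eval" (or end of input), so each <data> capture runs up to the following eval statement. A minimal Python sketch of that regex, using a made-up two-statement search string:

```python
import re

# Hypothetical saved-search string with two multi-argument eval statements.
search = '''| eval var=case( a,b, c,d, e,f)
| eval var2=case( match(x, "z|y|z"), 1, match(x, "a|b|c"), 2)'''

# Non-greedy body, stopped by a lookahead for the next "| eval" or end of input.
pattern = r'(?s)\|\s*eval\s+(?P<field>\w+)=(?P<data>.*?)(?=\n\|\s*eval|\Z)'

for m in re.finditer(pattern, search):
    print(m.group('field'), '=>', m.group('data').strip())
```

The same idea should carry over to the SPL rex command by appending the lookahead `(?=\n\|\s*eval|\Z)` after the non-greedy `(?<data>.*?)`; treat that as a sketch to verify against your actual saved-search strings.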
I have a certain number of events (generated every 5 minutes) for a set of websites, their user base, and their country. The goal is to find the number of distinct users per hour/day/month for each website per country during the last 6 months. At the end it will look something like this:

Over the last 6 months:
Country1 - Website1 - 12 users/hour (or day, month)
Country1 - Website2 - 2 users/hour (or day, month)
Country3 - Website1 - 10 users/hour (or day, month)
Country2 - Website3 - 8 users/hour (or day, month)

And what would be the most appropriate chart to visualize the outcome? I have come up with this line, but I'm not sure it gives what I want (the hourly average):

index... | chart count(user) as no_users by location website span=1h
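Note that count(user) counts events, while "distinct users" is a distinct count (SPL's dc()), which de-duplicates repeat visits inside each bucket. The set-based tally it performs looks like this sketch (sample data made up):

```python
from collections import defaultdict

# Hypothetical events: (hour_bucket, country, website, user).
events = [
    (0, "Country1", "Website1", "alice"),
    (0, "Country1", "Website1", "alice"),   # same user again in hour 0
    (0, "Country1", "Website1", "bob"),
    (1, "Country1", "Website1", "alice"),
]

# Distinct users per (country, website, hour) -- what dc(user) computes.
buckets = defaultdict(set)
for hour, country, website, user in events:
    buckets[(country, website, hour)].add(user)

for key, users in sorted(buckets.items()):
    print(key, len(users))
```

In SPL a starting point might be something like `... | timechart span=1h dc(user) by website` (timechart only splits by one field, so country and website may need to be combined with an eval first); treat that as a sketch to adapt.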
Hi experts,

I'm trying to extract multivalue output from a multiline JSON field through props and transforms. How can I best achieve this for the sample data below (the my_mvdata field)? I can write a regex in props.conf with a \\t delimiter, but I only get the first line. How do I use multivalue add (MV_ADD) and do it through transforms?

{
something: false
somethingelse: true
blah: blah:
my_mvdata:
server1 count1 country1 code1 message1
server2 count1 country1 code1 message2
server3 count1 country1 code1 message3
server4 count1 country1 code1 message4
blah: blah:
}
My search:

| makeresults earliest=-2h | timechart count as aantal span=1m

returns a list of zeros, but for the last/current minute it returns "1". I only want zeros back, so I can combine this search with a timechart. After combining these searches (makeresults and timechart) there should be no "no values found" message anymore. What do I have to change to get only zeros as the result of my makeresults search?
The first parameter is expectedBw and the other one is observedBw. The expectedBw remains constant. We have to show with a line graph how expectedBw is being achieved with respect to observedBw. There need to be two lines, one for expectedBw and one for observedBw. The x-axis should have time and the y-axis should have bandwidth, i.e. at different times, how far is observedBw running ahead of or behind expectedBw.
Hi,

I am using the REST API to pull data from Splunk with output_mode=json. The data that is returned is a mix of strings and JSON objects, and I am trying to work out a way for the API to return the entire data set as JSON.

For example, the curl command:

curl -k -u 'user1' https://splunk-server:8089/servicesNS/admin/search/search/jobs/export -d 'preview=false' -d 'output_mode=json' -d 'search=|savedsearch syslog_stats latest="-2d@d" earliest="-3d@d" span=1' | jq .

returns the following. Note how the result is JSON, but devices is an array of strings, not JSON:

{
  "preview": false,
  "offset": 0,
  "lastrow": true,
  "result": {
    "MsgType": "LINK-3-UPDOWN",
    "devices": [
      "{\"device\":\"1.1.1.1\",\"events\":12,\"deviceId\":null}",
      "{\"device\":\"2.2.2.2\",\"events\":128,\"deviceId\":1}",
      "{\"device\":\"3.3.3.3\",\"events\":217,\"deviceId\":2}"
    ],
    "total": "357"
  }
}

The query:

| tstats count as events where index=X-syslog Severity<=4 earliest=-3d@d latest=-2d@d by _time, Severity, MsgType Device span=1d
| search MsgType="LINK-3-UPDOWN"
| eval devices=json_object("device", Device, "events", events, "deviceId", deviceId)
| fields - Device events _time Filter UUID Regex deviceId addressDeviceId
| table MsgType devices

Query result in the UI:

MsgType: LINK-3-UPDOWN
devices: {"device":"1.1.1.1","events":12,"deviceId":null}
         {"device":"2.2.2.2","events":128,"deviceId":null}
         {"device":"3.3.3.3","events":217,"deviceId":null}
total: 357

As can be seen, in the UI the device is in JSON format (using json_object), but in the curl result it is a string in JSON format. Is there a way for the query to return the whole result as a JSON object, not a mix of JSON and strings? I have also tried tojson in a number of different ways, but with no success.

Desired result, where devices is a JSON object and not treated as a string as above:
{
  "preview": false,
  "offset": 0,
  "lastrow": true,
  "result": {
    "MsgType": "LINK-3-UPDOWN",
    "devices": [
      {"device":"1.1.1.1","events":12,"deviceId":null},
      {"device":"2.2.2.2","events":128,"deviceId":1},
      {"device":"3.3.3.3","events":217,"deviceId":2}
    ],
    "total": "357"
  }
}

I can post-process the strings into JSON, but I would rather get JSON from Splunk directly. Thanks!
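If the export endpoint keeps returning the array elements as encoded strings, the post-processing mentioned above is a one-liner per element. A minimal sketch, using the example row from the question:

```python
import json

# Example export row from the question (devices are JSON-encoded strings).
row = {
    "MsgType": "LINK-3-UPDOWN",
    "devices": [
        '{"device":"1.1.1.1","events":12,"deviceId":null}',
        '{"device":"2.2.2.2","events":128,"deviceId":1}',
    ],
    "total": "357",
}

# Decode each string element into a real object.
row["devices"] = [json.loads(d) for d in row["devices"]]
print(row["devices"][0]["device"])  # -> 1.1.1.1
```

This keeps the rest of the export payload untouched and only re-parses the fields you know are stringified JSON.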
Today: index=sold Product=Acer, Product=iphone
Last week: index=sold Product=Samsung, Product=iphone

Query used:

index=sold earliest=-0d@d latest=now | stats count as Today by Product | appendcols [search index=sold earliest=-7d@d latest=-6d@d | stats count as Lastweeksameday by Product]

Because the Samsung product was not sold "Today", it does not show up in the output at all, even though it was sold "Last week". Ideally it should show as 0 for "Today" and 1 for "Last week" in the output. Could someone please help?
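The underlying issue is that appendcols pastes columns together by row position rather than joining on Product, so a product missing from either period drops out. What you want is the union of products with missing counts filled as zero, as this small Python sketch of the merge logic shows (counts taken from the example above):

```python
# Counts per product for each period, as in the question's example.
today = {"Acer": 1, "iphone": 1}
last_week = {"Samsung": 1, "iphone": 1}

# Union of products, with missing entries filled as 0 -- the behaviour
# appendcols (positional column paste) does not give you.
merged = {p: (today.get(p, 0), last_week.get(p, 0))
          for p in sorted(set(today) | set(last_week))}
print(merged)
```

In SPL, a common approach is one search spanning both time ranges with an eval that labels each event's period, followed by a single stats by Product, with fillnull to turn missing counts into 0; treat the exact labeling eval as something to adapt to your time ranges.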
Does anyone know how to make use of a second y-axis on a line graph or column chart, or where it might be documented?
Greetings,

I'm running Splunk Enterprise on a Windows Server (requirement driven). The Windows Server and Splunk have FIPS mode enabled (another requirement). The Splunk process (splunkd.exe) is causing the Windows server to generate an excessive number of 6417 events ("The FIPS mode crypto selftests succeeded") in the local Windows Security Log, creating excessive noise in the logs (4,500/hr) and eating up HDD space. Any indication why this happens, and/or steps I can take to limit it, short of turning off FIPS?
Please help with this requirement: average response time with a 10% additional buffer (a single number), using the eval option.
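The arithmetic itself is just the average multiplied by 1.1, as this sketch shows (the sample response times are made up):

```python
# Hypothetical response times in ms.
times = [120, 250, 90, 300, 180]

avg = sum(times) / len(times)   # plain average: 188.0
buffered = avg * 1.1            # 10% additional buffer
print(round(buffered, 2))
```

In SPL this would roughly be `... | stats avg(response_time) as avg_rt | eval buffered=round(avg_rt*1.1, 2)`, where `response_time` is a placeholder for whatever your actual duration field is called.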
In my query, I am trying to combine the output from one index and sourcetype with the output of another index and sourcetype. I have looked at the documentation, came across subsearches, and have attempted to use the search command, but I am not getting any results, leading me to believe I am definitely doing it wrong. Please see my example below.

index=A sourcetype=cat ProjectOwner="person" dest_owner="person" [search sourcetype=FW destp=1111 action=denied | table host] | srcdns srcip
Hi All,

I don't have much experience with Splunk. My JSON payload looks like the one shown below; the msg.details array can have any number of key/value pairs in any order.

{
  "appName": "TestApp",
  "eventType": "Response",
  "msg": {
    "transId": "Trans1234",
    "status": "Success",
    "client": "clientXyz",
    "responseTime": 1650,
    "details": [
      { "keyName": "rtt", "keyValue": 2778 },
      { "keyName": "trace", "keyValue": 97007839130680 }
    ],
    "url": "/v1/test"
  }
}

I am trying to write a query to form a table as shown below. I am interested in displaying only the keyValue of keyName:trace in the table. Any help is appreciated. Thanks.

index=* appName="TestApp" msg.url="/v1/test" | table msg.transId, msg.status, msg.details[keyName="trace"].keyValue

msg.transId | msg.status | msg.details[keyName="trace"].keyValue
Trans1234 | Success | 97007839130680
Trans7890 | ERROR | 29411645500355
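The selection you're after is "the keyValue of the details entry whose keyName is trace". Outside SPL, that logic is a simple filtered lookup, as this sketch shows (payload taken from the example above):

```python
import json

payload = json.loads('''{
  "appName": "TestApp",
  "msg": {
    "transId": "Trans1234",
    "status": "Success",
    "details": [
      {"keyName": "rtt",   "keyValue": 2778},
      {"keyName": "trace", "keyValue": 97007839130680}
    ],
    "url": "/v1/test"
  }
}''')

# Pick the keyValue of the entry whose keyName is "trace",
# regardless of where it sits in the details array.
trace = next(d["keyValue"] for d in payload["msg"]["details"]
             if d["keyName"] == "trace")
print(payload["msg"]["transId"], payload["msg"]["status"], trace)
```

In SPL, the equivalent is typically built by extracting the parallel `keyName`/`keyValue` multivalue fields (e.g. with spath) and then filtering with mv functions; the exact commands depend on how the event is indexed, so treat this as the logic to reproduce rather than a ready-made query.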
Hi Splunkers,

I'm trying to get the Splunk interface up, but with no luck. When I restart Splunk, it gets stuck at the last step:

Waiting for web server at https://127.0.0.1:8000 to be available...................................

I then noticed that when I run the following command:

netstat -an | grep 8000

the output is:

tcp 13 0 0.0.0.0:8000 0.0.0.0:* LISTEN
tcp 518 0 10.242.13.20:8000 10.201.184.6:63670 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63648 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63717 CLOSE_WAIT
tcp 155 0 127.0.0.1:8000 127.0.0.1:51036 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63743 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63742 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:64055 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63669 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63649 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63730 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63718 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:63731 CLOSE_WAIT
tcp 518 0 10.242.13.20:8000 10.201.184.6:64056 CLOSE_WAIT

After some troubleshooting I tried setting "connection_host" in inputs.conf to "none", but with no luck. Then I tried increasing the default values for the web threadpool in web.conf, also with no luck. I even tried restarting the server itself, but with no luck.

Any thoughts that can help me?
Hi, I am struggling with the following task. I have a lookup file containing all the configured DHCP scopes in the following format:

ScopeId SubnetMask Name State StartRange EndRange LeaseDuration

In dhcp.log I have the IP address for a client, and I need the ScopeId and the LeaseDuration for each client. My idea is to check whether the given IP address is within StartRange and EndRange and get the ScopeId and LeaseDuration. My problem is I don't have a clue how to do so. Any ideas?

Thanks,
Alex
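The range test itself is a numeric comparison: convert the client address and both range bounds to integers, then check start <= client <= end. A Python sketch of that lookup (the scope rows are hypothetical):

```python
import ipaddress

# Hypothetical lookup rows: (ScopeId, StartRange, EndRange, LeaseDuration).
scopes = [
    ("10.0.1.0", "10.0.1.10", "10.0.1.200", "8.00:00:00"),
    ("10.0.2.0", "10.0.2.10", "10.0.2.200", "1.00:00:00"),
]

def find_scope(ip):
    """Return (ScopeId, LeaseDuration) for the range containing ip, else None."""
    addr = int(ipaddress.ip_address(ip))
    for scope_id, start, end, lease in scopes:
        if int(ipaddress.ip_address(start)) <= addr <= int(ipaddress.ip_address(end)):
            return scope_id, lease
    return None

print(find_scope("10.0.2.42"))
```

In Splunk, CIDR-based lookups can use a lookup definition with CIDR match_type, but explicit start/end ranges generally need the integer-comparison approach above (e.g. converting each octet with eval before comparing); verify against your lookup's actual field formats.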
Hi team, in my environment, let's say I have 1000 forwarders; to set the deployment poll I cannot go into each and every forwarder and run ./splunk set deploy-poll <DS IP>:8089. In this scenario, how do we set it? Please clarify: is there any automatic script? Thanks in advance.
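Rather than running the CLI on every host, the usual approach is to ship a deploymentclient.conf with the forwarder install (baked into the install image, or pushed by whatever configuration-management tooling you already use, such as Ansible or SCCM). A minimal sketch, with a placeholder server address:

```
# $SPLUNK_HOME/etc/system/local/deploymentclient.conf
# (commonly packaged inside a base app instead of system/local)

[deployment-client]

[target-broker:deploymentServer]
# Placeholder -- replace with your deployment server's host and management port.
targetUri = ds.example.com:8089
```

The CLI command and this file set the same thing; the file just lends itself to being distributed once to all 1000 forwarders.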
Hello,

I am looking to narrow down my search; I only want to search for events that happen outside of a specific time range, namely outside of 08:00 to 17:00. Any help would be appreciated.

Kind regards
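The filter is "hour before 8, or hour 17 and later", as this small sketch shows (the timestamps are made up):

```python
from datetime import datetime

# Hypothetical event timestamps.
events = [
    datetime(2023, 5, 1, 7, 59),   # before 08:00 -> keep
    datetime(2023, 5, 1, 12, 0),   # inside 08:00-17:00 -> drop
    datetime(2023, 5, 1, 17, 30),  # at/after 17:00 -> keep
]

# Keep only events outside 08:00-17:00.
outside = [e for e in events if e.hour < 8 or e.hour >= 17]
print(outside)
```

In SPL, a common equivalent is filtering on the default datetime field, e.g. `... date_hour<8 OR date_hour>=17`; note that date_hour reflects the event's own timestamp and isn't populated for every sourcetype, so adjust to your data (an eval on strftime(_time, "%H") is an alternative).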
Hi Community,

I am trying to generate a timechart by month with the following query:

index=xyz Question="zzz" NOT "Could not get results" NOT "No Deployment Results Found" NOT "No Matching Deployments Found" NOT "Unable to load PatchLib"
| sort Computer_Name, patchmeantime
| stats max(patchmeantime) as MaxAge by Computer_Name
| stats avg(MaxAge) as MTTP
| timechart span=1mon avg(MTTP)

But nothing is showing up, so I am pretty sure I am missing something critical or super simple here, but I'm not sure what it is. Any help will be really appreciated.
So I have a field named "domain" that has values of single domains (A, B, C) and combinations of domains with two different delimiters:

A
B
C
A/B
A/C
A, B
C, D

I can successfully split the values by either "," or "/" with eval new_field1=split(domain,","), but if I do another one after it with eval new_field1=split(domain,"/") or eval new_field2=split(new_field1,"/"), it doesn't work. Is there a way to split by both "," and "/"?
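The second split fails because it runs on an already-multivalue field; the simpler route is a single split on either delimiter. The intended behaviour, sketched in Python with the values from above:

```python
import re

domains = ["A", "A/B", "A, B", "C, D"]

# Split on either comma or slash in one pass, trimming whitespace.
split_values = [[v.strip() for v in re.split(r"[,/]", d)] for d in domains]
print(split_values)
```

In SPL, one common trick is to normalise the delimiters first and split once, e.g. `| eval new_field=split(replace(domain, "/", ","), ",")`; verify that none of your domain values legitimately contain either character before normalising.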