All Topics



Hi Team, I have a dashboard with form inputs; once the user enters them, around 5-6 panels load. Is there a way to send the dashboard results by email? The "Schedule PDF Delivery" option is disabled on the dashboard for me. How can we schedule email delivery for this type of dashboard? Thanks in advance.
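[Editor's sketch, not from the post: one common workaround is to schedule each panel's underlying search as a report with the form-input values hard-coded, and email the results with SPL's sendemail command. The recipient address and the fixed token values are assumptions:

<panel search with the form-input values filled in>
| sendemail to="team@example.com" subject="Dashboard panel results" sendresults=true inline=true

This sidesteps the disabled PDF delivery, at the cost of one scheduled report per panel.]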
Hi, in the past, if a user developed a scheduled report whose results are used by other users and dashboards, we would normally change the ownership to "nobody" so that if the person ever left, the report would keep running. We don't like using service accounts, as our security team is very much against them. But in Splunk Cloud, if we reassign the saved search to nobody, it seems to apply the cron schedule based on UTC rather than the timezone of the previous user (in our case, New Zealand). Is there some way we can set the timezone of the nobody user? We are reluctant to have to specify cron schedules in UTC when everything we do is in NZ time. Any suggestions? Thanks
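[Editor's sketch: on a deployment where you can edit files, a user's timezone (including nobody's) can in principle be set in user-prefs.conf; whether the scheduler honors the owner's tz for cron interpretation, and whether Splunk Cloud lets you set this yourself or requires a support ticket, are assumptions worth confirming:

# $SPLUNK_HOME/etc/users/nobody/user-prefs/local/user-prefs.conf
[general]
tz = Pacific/Auckland
]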
I've got a JSON array I ingest, and I want to extract certain fields from it to save into a lookup table. Here's an example of the JSON:

{
  "Response": {
    "results": [
      {
        "memberType": 2,
        "isOnline": false,
        "lastOnlineStatusChange": "1657176499",
        "groupId": "1234567",
        "destinyUserInfo": {
          "LastSeenDisplayName": "UserName1",
          "LastSeenDisplayNameType": 1,
          "iconPath": "/img/theme/bungienet/icons/xboxLiveLogo.png",
          "crossSaveOverride": 1,
          "applicableMembershipTypes": [2, 1],
          "isPublic": false,
          "membershipType": 1,
          "membershipId": "1234567890123456789",
          "displayName": "UserName1",
          "bungieGlobalDisplayName": "UserName1",
          "bungieGlobalDisplayNameCode": 9999
        },
        "bungieNetUserInfo": {
          "supplementalDisplayName": "UserName1#9999",
          "iconPath": "/img/profile/avatars/default_avatar.gif",
          "crossSaveOverride": 0,
          "isPublic": false,
          "membershipType": 254,
          "membershipId": "12345678",
          "displayName": "UserName1",
          "bungieGlobalDisplayName": "UserName1",
          "bungieGlobalDisplayNameCode": 9999
        },
        "joinDate": "2021-10-27T20:56:48Z"
      },
      {
        "memberType": 2,
        "isOnline": false,
        "lastOnlineStatusChange": "1657390180",
        "groupId": "1234567",
        "destinyUserInfo": {
          "LastSeenDisplayName": "UserName2",
          "LastSeenDisplayNameType": 1,
          "iconPath": "/img/theme/bungienet/icons/xboxLiveLogo.png",
          "crossSaveOverride": 1,
          "applicableMembershipTypes": [2, 3, 1],
          "isPublic": false,
          "membershipType": 1,
          "membershipId": "4611686018431599324",
          "displayName": "UserName2",
          "bungieGlobalDisplayName": "UserName2",
          "bungieGlobalDisplayNameCode": 8888
        },
        "bungieNetUserInfo": {
          "supplementalDisplayName": "UserName2#8888",
          "iconPath": "/img/profile/avatars/HaloRingcopy.gif",
          "crossSaveOverride": 0,
          "isPublic": false,
          "membershipType": 254,
          "membershipId": "1990219",
          "displayName": "UserName2",
          "bungieGlobalDisplayName": "UserName2",
          "bungieGlobalDisplayNameCode": 8888
        },
        "joinDate": "2020-04-07T15:07:21Z"
      }
    ],
    "totalResults": 2,
    "hasMore": true,
    "query": {
      "itemsPerPage": 2,
      "currentPage": 1
    },
    "useTotalResults": true
  },
  "ErrorCode": 1,
  "ThrottleSeconds": 0,
  "ErrorStatus": "Success",
  "Message": "Ok",
  "MessageData": {}
}

(I truncated the results array to 2; there are normally many more.) I want to write to a lookup table like this:

_time               | membershipId        | joinDate
2022-07-17 16:20:28 | 1234567890123456789 | 2021-10-27T20:56:48Z
2022-07-17 16:20:28 | 9876543210123456789 | 2020-04-07T15:07:21Z

I can get something close into a table with:

index=myindex
| rename Response.results{}.destinyUserInfo.membershipId as membershipId Response.results{}.joinDate as JoinDate
| table _time ID JoinDate

but saving that to a lookup table makes membershipId and joinDate into multivalue fields and stores all of the values accordingly. I need them separate. Help?
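[Editor's sketch of one common approach: expand each element of the results array into its own row with mvexpand, then pull the two fields out of each element with the spath eval function. The lookup filename is hypothetical:

index=myindex
| spath path=Response.results{} output=result
| mvexpand result
| eval membershipId=spath(result, "destinyUserInfo.membershipId"), joinDate=spath(result, "joinDate")
| table _time membershipId joinDate
| outputlookup member_join.csv

Because mvexpand runs before the evals, each row carries exactly one membershipId/joinDate pair, so outputlookup writes single-valued columns.]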
In the Overview tab, it shows 25 TB of total ingest volume. This is incorrect; we should be at ~4 TB. This is important for our licensing and storage levels.
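[Editor's sketch: one way to cross-check the Overview number is to total the license usage log directly (run against the license manager's _internal data; the 30-day window is arbitrary):

index=_internal source=*license_usage.log* type=Usage earliest=-30d@d
| stats sum(b) as bytes by idx
| eval GB=round(bytes/1024/1024/1024, 2)
| sort - GB

If this sums to ~4 TB, the Overview tab may be counting something other than licensed ingest, e.g. summed across pools or a longer period.]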
Hi All, I want to filter out URLs that contain an IP. One way is to write a regex, extract the IP into another field, and then filter on that field, but I also want to save runtime: I don't have a fixed index, so I need to search all indexes that have a url field containing an IP address, and I'd rather apply the filter in the search itself instead of extracting IPs first. Raw url formats are: http://1.1.1.1/, 1.1.1.1:443, http://1.1.1.1/xyc.co. What I've tried so far:

index=*
| fields url
| where match(url, "^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}")

It works, but it only matches urls like 1.1.1.1:443, not the other formats. Any idea what I can do? Note: I don't want to extract IPs into a new field first and then filter on it (new field=*); that works, but the query takes longer, because it first fetches all the urls, then applies the regex, and then filters.
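[Editor's sketch: the ^ anchor is why only the bare 1.1.1.1:443 form matched; the other formats have the IP after "//". Allowing either start-of-string or "//" before the IP covers all three sample formats:

index=* url=*
| where match(url, "(?:^|//)(\d{1,3}\.){3}\d{1,3}([:/]|$)")

The url=* in the base search keeps the filtering as early as possible, which is the runtime concern the post raises.]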
Is there a form element that accepts only a single datetime value? The user will enter two fields: field 1 as a single datetime and field 2 as a duration like "1h". Based on these two inputs, I need to calculate and set time values as below.

For example:
User input field 1 [TimeTokenMid] --> 07/12/2022:18:00:00
User input field 2 [Duration] --> "1h"

I need to calculate and set the time ranges below using the relative_time function:
Token_1 = 07/12/2022:17:00:00 [TimeTokenMid - Duration, via relative_time]
Token_2 = 07/12/2022:19:00:00 [TimeTokenMid + Duration, via relative_time]

Currently I am taking field 1 [TimeTokenMid] as text and doing the calculation, because the current timepicker control does not allow setting a single custom time value; it only has options for selecting a range (last 60 mins, between x and y, etc.). In my use case, I want to take a single time value as input and calculate the time ranges (+/-) dynamically based on the duration input. Could you please suggest a solution?
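[Editor's sketch: there is no single-datetime picker in Simple XML, so the text-input approach is reasonable; the derived tokens can be computed in the input's <change> block. This assumes eval tokens can reference the input's $value$ and the $Duration$ token, as in the next post's example; "." is eval's string concatenation:

<input type="text" token="TimeTokenMid" searchWhenChanged="false">
  <label>Mid time (%m/%d/%Y:%T)</label>
  <change>
    <eval token="Token_1">strftime(relative_time(strptime($value$, "%m/%d/%Y:%T"), "-".$Duration$), "%m/%d/%Y:%T")</eval>
    <eval token="Token_2">strftime(relative_time(strptime($value$, "%m/%d/%Y:%T"), "+".$Duration$), "%m/%d/%Y:%T")</eval>
  </change>
</input>
]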
Hi Team, can someone who has already enabled the [config_change_audit] attribute below (on version 8.2.4 or 8.2.7) help me? The caution below is mentioned in the documentation:

* CAUTION: This setting is experimental and is related to a feature that is still under development. Using the setting might increase resource usage.

What is the experimental feature? What risk does it pose to the environment, apart from resource usage? And if it is enabled in an environment, what will resource utilization look like?

[config_change_audit]
disabled = <boolean>
* Whether or not splunkd writes configuration changes to the configuration change log at $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* If set to "false", configuration changes are captured in $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* If set to "true", configuration changes are not captured in $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* Default: true

mode = [auto|track-only]
* Set to "auto" or "track-only" to get a log of .conf file changes under $SPLUNK_HOME/etc/system, $SPLUNK_HOME/etc/apps, $SPLUNK_HOME/etc/users, $SPLUNK_HOME/etc/slave-apps, or changes to $SPLUNK_HOME/etc/instance.cfg.
* The values "auto" and "track-only" are identical in their effects. Set mode to "auto" to auto-enroll this deployment into all the latest features.
* CAUTION: This setting is experimental and is related to a feature that is still under development. Using the setting might increase resource usage.
* Default: auto
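[Editor's sketch: based only on the spec quoted above, enabling the change audit deliberately on a non-production instance would look like this (which .conf file owns the stanza is version-dependent; server.conf is an assumption worth checking against your version's docs):

[config_change_audit]
disabled = false
mode = track-only

Per the quoted spec, "track-only" currently behaves the same as "auto" but avoids auto-enrolling into future feature changes; the experimental caution and the extra resource usage still apply.]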
Hi team, as per my requirement, on changing a particular form element [Token1], a set of other tokens [Token2, Token3, etc.] needs to be recalculated and set first; then, on clicking submit, all my panels should load using the recalculated tokens. I have added the tokens to be evaluated within the <change> tag under the <input> of Token1, but when I change the form element's value, the other tokens are still not recalculated. Could you please help with this? Sample:

<input type="text" token="timeTokenMid" searchWhenChanged="false">
  <label>Start Time</label>
  <change>
    <eval token="formatted_token">strptime($timeTokenMid$,"%m/%d/%Y:%T")</eval>
    <eval token="timeTokenSt">relative_time($formatted_token$,"-1h")</eval>
    <eval token="timeTokenSt_datetime">strftime($timeTokenSt$,"%m/%d/%Y:%T")</eval>
    <eval token="timeTokenEnd">relative_time($formatted_token$,$obtDuration$)</eval>
    <eval token="timeTokenEnd_datetime">strftime($timeTokenEnd$,"%m/%d/%Y:%T")</eval>
  </change>
</input>
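[Editor's sketch of one pattern worth trying (an assumption about evaluation order, not a confirmed fix): evals inside a single <change> block do not always see tokens set by earlier evals in the same block, so recompute from the input's $value$ in each eval rather than chaining through $formatted_token$:

<change>
  <eval token="timeTokenSt">relative_time(strptime($value$,"%m/%d/%Y:%T"), "-1h")</eval>
  <eval token="timeTokenSt_datetime">strftime(relative_time(strptime($value$,"%m/%d/%Y:%T"), "-1h"), "%m/%d/%Y:%T")</eval>
  <eval token="timeTokenEnd">relative_time(strptime($value$,"%m/%d/%Y:%T"), $obtDuration$)</eval>
  <eval token="timeTokenEnd_datetime">strftime(relative_time(strptime($value$,"%m/%d/%Y:%T"), $obtDuration$), "%m/%d/%Y:%T")</eval>
</change>

Note that searchWhenChanged="false" only defers panel reloads; the <change> evals should still fire when the value changes.]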
Hi, novice Splunker here. My search only extracts the first 10-digit number, and my data contains at least four 10-digit numbers. I need help making a list of numbers from the following example text.

Example data in field2:

1111111111,Standard,R9801,
2222222222,Standard,S9801,
3333333333,Standard,T9801,
4444444444,Standard,U9801,
5555555555,Standard,V9801,
For order and enquiries contact 111xxxxxx and mention the 15 digit ID "ACOOLXXXXXX"

| rex field=Data "(?<Numbers>\d{10,})"
| table Numbers

The output of the above is only the first number in the Numbers field: 1111111111. Thanks.
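[Editor's sketch: rex returns only the first match by default; max_match=0 removes the limit and makes Numbers a multivalue field holding every match. Using the field named in the sample (field2; the original rex referenced Data):

| rex field=field2 max_match=0 "(?<!\d)(?<Numbers>\d{10})(?!\d)"
| table Numbers

The lookarounds pin the match to exactly ten-digit runs, so a longer digit string would not be partially captured the way \d{10,} might capture it whole.]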
I am attempting to use the Splunkbase timeline visualization. I am trying to create a timeline where the Y axis is the months (Jan - Dec) and the X axis is the days of the month (1-31). I need to display the date a patch comes out, the days we are testing the patch, and when we roll it out to the company. I have never done a timeline in Splunk and could use some help. Thanks to the community in advance! If anyone has suggestions on other ways I could visualize this without using timeline, that would be great as well.
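[Editor's sketch, under the assumption (worth verifying against the viz's docs) that the timeline visualization expects rows of time, label, and an optional duration in milliseconds: keep the patch phases in a lookup and compute those three columns. patch_phases.csv with phase/start/end columns is hypothetical:

| inputlookup patch_phases.csv
| eval _time=strptime(start, "%Y-%m-%d")
| eval duration=(strptime(end, "%Y-%m-%d") - _time) * 1000
| table _time phase duration

For months on the Y axis, strftime(_time, "%b") could serve as the label instead of phase, though a Gantt-style chart may fit the release/test/rollout phases more naturally than a month-by-day grid.]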
Hi, I want to connect to InfluxDB via Splunk DB Connect, but I can't find InfluxDB in the connection types. Does any JDBC or ODBC driver exist for InfluxDB?

FYI, I found this repo: https://github.com/influxdata/influxdb-java

Any ideas? Thanks
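[Editor's note and sketch: the linked influxdb-java repo is a plain Java client, not a JDBC driver, so it will not plug into DB Connect directly. If a JDBC driver jar for InfluxDB can be found, a custom connection type can in principle be registered; every InfluxDB-specific value below is a placeholder, not verified:

# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_connection_types.conf
[influxdb]
displayName = InfluxDB
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = <driver class from the jar>
jdbcUrlFormat = jdbc:influxdb://<host>:<port>/<database>

with the jar dropped into the app's drivers directory.]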
Hi, when I log in with the admin account, Splunk DB Connect works correctly, but when LDAP users log in, they get this error:

Cannot communicate with task server, please check your settings.
<?xml version="1.0" encoding="UTF-8"?>
<response>
  <messages>
    <msg type="WARN">insufficient permission to access this resource</msg>
  </messages>
</response>

Splunk DB Connect version: 3.9.0
Splunk version: 8.5

Any ideas? Thanks
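[Editor's sketch: the "insufficient permission" warning usually points at missing DB Connect capabilities rather than LDAP itself; DB Connect ships db_connect_admin and db_connect_user roles. Assuming your LDAP group maps to a role named ldap_users, inheriting the app role would look like:

# authorize.conf
[role_ldap_users]
importRoles = db_connect_user

The same inheritance can be set in the UI under Settings > Roles.]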
Hi, I am unable to access the Splunk web interface. I have taken one EC2 instance from AWS (Red Hat Enterprise Linux 7 with High Availability) and installed Splunk in /opt.

[root@ip-172-31-82-137 splunk]# netstat -an | grep 8000
tcp 0 0 0.0.0.0:8000 0.0.0.0:* LISTEN

[root@ip-172-31-82-137 splunk]# sestatus
SELinux status: disabled

[root@ip-172-31-82-137 splunk]# sudo ./splunk restart
Stopping splunkd...
Shutting down. Please wait, as this may take a few minutes.
.. [ OK ]
Stopping splunk helpers... [ OK ]
Done.

Splunk> Take the sh out of IT.

Checking prerequisites...
Checking http port [8000]: open
Checking mgmt port [8089]: open
Checking appserver port [127.0.0.1:8065]: open
Checking kvstore port [8191]: open
Checking configuration... Done.
Checking critical directories... Done
Checking indexes...
Validated: _audit _configtracker _internal _introspection _metrics _metrics_rollup _telemetry _thefishbucket history main summary
Done
Checking filesystem compatibility... Done
Checking conf files for problems... Done
Checking default conf files for edits...
Validating installed files against hashes from '/opt/splunk/splunk-9.0.0-6818ac46f2ec-linux-2.6-x86_64-manifest'
All installed files intact.
Done
All preliminary checks passed.

Starting splunk server daemon (splunkd)...
PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security
Done [ OK ]

Waiting for web server at http://127.0.0.1:8000 to be available.............. Done

If you get stuck, we're here to help. Look for answers here: http://docs.splunk.com

The Splunk web interface is at http://ip-172-31-82-137.ec2.internal:8000

I see this error under the web_service log:

2022-07-16 06:55:45,844 INFO [62d260ef597f205fe25d10] root:733 - CONFIG: error_page.default (method): <bound method ErrorController.handle_error of <splunk.appserver.mrsparkle.controllers.error.ErrorController object at 0x7f205e9e6090>>
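[Editor's sketch: splunkd is listening on 0.0.0.0:8000 and the web_service line quoted is INFO-level, so on EC2 the usual suspect is the security group or an OS firewall rather than Splunk itself. A quick check from your workstation (the public hostname below is hypothetical):

curl -v http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com:8000

If that times out, add an inbound rule for TCP 8000 from your IP to the instance's security group, and check firewalld/iptables on the host. Note that the ip-172-31-82-137.ec2.internal name in the banner resolves only inside the VPC.]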
I recently installed the Tenable Add-On for Splunk, but whenever I try to access the add-on, I am faced with a hanging loading screen. Do I need to change file (input, etc.) settings somewhere? When I install the Tenable App for Splunk, it loads perfectly fine, but I need the Tenable Add-On for Splunk to work in order to use the Tenable App for Splunk. Any help or advice would be truly appreciated! Thank you
I recently installed the Splunk Add-on for Microsoft Security on Splunk Cloud and configured it to connect via API to an app registered in Azure. The data is still not loading in the Security section of the Microsoft 365 App for Splunk. I checked in Azure, and Incident.Read permissions are enabled on the app. The Splunk documentation says that I should go to Add-on > Inputs and click Create New Input to complete the configuration. When I go to the Inputs page I get the message: "Failed to load Inputs Page. This is normal on Splunk search heads as they do not require an Input page. Check your installation or return to the configuration page. Error: Request failed with status code 500". I am not sure how I can fix this, as I have no other place to put the endpoints as inputs.
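[Editor's sketch: a generic way to get past the opaque 500 is to pull the add-on's own errors from the internal index; the free-text term is a guess at the add-on's internal naming, so adjust it to whatever appears in your app directory name:

index=_internal log_level=ERROR *microsoft*
| sort - _time

In Splunk Cloud, data-collecting inputs typically need to run on an input-capable node (e.g. the IDM/Victoria inputs experience) rather than the search head the error message refers to.]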
I want to monitor the running status of critical services on a Windows server, and trigger a ticket if any critical service is stopped for X minutes. Can this be implemented as a normal alert, or does it need an episode to be created? A normal alert has the drawback of triggering again and again if the service is down for a long time.
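[Editor's sketch: assuming the Windows add-on's WinHostMon service input is collecting (field names per that add-on, worth double-checking in your data), a scheduled search can require the service to have been non-running for the whole window:

index=windows sourcetype=WinHostMon Type=Service earliest=-15m
| stats latest(State) as latest_state, values(State) as states by host, DisplayName
| where latest_state!="Running" AND isnull(mvfind(states, "Running"))

Scheduling this every 15 minutes with alert throttling (suppress by host + DisplayName for, say, a few hours) avoids the re-triggering drawback without needing ITSI episodes.]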
I have a query comprised of two subqueries (joins). Its output is exactly as expected. When I try to push that data to a summary index, only the fields from the original query make it; all fields and event data generated by the subqueries are missing. When I run the query interactively (including '| collect index=summary' as the last line) everything expected is in the output; it is just not making it to the summary index.

index=blah_blah <followed by a search>
| join [<search string1> [<search string 2>]]
| fields _time IP DNS NETBIOS TRACKING_METHOD OS TAGS QID TITLE TYPE SEVERITY STATUS LAST_SCAN_DATETIME LAST_FOUND_DATETIME LAST_FIXED_DATETIME PUBLISHED_DATETIME THREAT_INTEL_VALUES THREAT_INTEL_IDS CVSS_V3_BASE VENDOR_REFERENCE RESULTS
| collect index=summary

The output is fully populated, yet the summary index is missing several fields (and the associated data). Note: the missing fields in the summary index are all from the subsearches/join.
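[Editor's sketch of two things worth ruling out (generic collect behavior, not specific to this query): collect writes each result as _raw text and simply omits fields that are null in a given row, so joined fields that come back null in the scheduled run vanish. Forcing a value before collect makes the loss visible or fixes it:

... | fields _time IP DNS NETBIOS ... RESULTS
| fillnull value="n/a"
| collect index=summary

Separately, compare the scheduled run with the interactive run: if the scheduled search executes as a different owner or app context, the permissions and knowledge objects behind the subsearches may differ, which would empty the join in exactly this pattern.]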
Hoping someone can help me get past the last hurdle. I'm trying to create a custom function that dynamically calls other custom functions. I've got the part that generates the list of desired functions, and I understand how to pass the datapath into the dynamically selected custom function. I want to pass the results out to a filter object, but they seem to be coming out as a single variable, not an array. What am I missing?

def rule_check(action=None, success=None, container=None, results=None, handle=None, filtered_artifacts=None, filtered_results=None, custom_function=None, **kwargs):

    phantom.debug('rule_check() called')

    custom_function_results_data_1 = phantom.collect2(container=container, datapath=['build:custom_function_result.data.data_packets.*.packet'], action_results=results)
    custom_function_results_data_2 = phantom.collect2(container=container, datapath=['get_funcs:custom_function_result.data.found_functions.*.function_path'], action_results=results)

    custom_function_results_item_1_0 = [item[0] for item in custom_function_results_data_1]
    custom_function_results_item_2_0 = [item[0] for item in custom_function_results_data_2]

    rule_check__data = None

    ################################################################################
    ## Custom Code Start
    ################################################################################

    # Build one parameter dict per data packet
    parameters = []
    for item0 in custom_function_results_data_1:
        parameters.append({
            'data_w_fields': item0[0],
        })

    # Invoke each dynamically discovered custom function
    for func in custom_function_results_item_2_0:
        a = phantom.custom_function(custom_function=func, parameters=parameters, name='rule_check')

    ################################################################################
    ## Custom Code End
    ################################################################################

    phantom.save_run_data(key='rule_check:data', value=json.dumps(rule_check__data))

    filter_1(container=container)

    return
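[Editor's sketch of a guess at the culprit (based on general SOAR playbook behavior, not verified against this environment): phantom.custom_function() runs asynchronously, and every call above reuses the name 'rule_check', so downstream blocks see one result set instead of one per function. Giving each call a unique name and aggregating in a callback may produce the expected array; rule_check_cb and the name list are hypothetical:

    for idx, func in enumerate(custom_function_results_item_2_0):
        # unique, addressable name per invocation
        phantom.custom_function(custom_function=func, parameters=parameters,
                                name='rule_check_{}'.format(idx), callback=rule_check_cb)

def rule_check_cb(action=None, success=None, container=None, results=None, handle=None, **kwargs):
    # gather every invocation's output into one array, then save and pass on
    all_results = []
    for name in ['rule_check_0', 'rule_check_1']:  # enumerate the names generated above
        for row in phantom.collect2(container=container,
                                    datapath=['{}:custom_function_result.data'.format(name)]):
            all_results.append(row[0])
    phantom.save_run_data(key='rule_check:data', value=json.dumps(all_results))
    filter_1(container=container)
]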
I am trying to import data by reading a file, but I keep getting the message below in the internal logs:

INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='/usr/local/ios/var/logs/PN_Usage_iujj_Jun28.22.10.56.csv'
07-15-2022 11:37:42.256 -0400 INFO WatchedFile - File too small to check seekcrc, probably truncated. Will re-read entire file='/usr/local/ios/var/logs/PN_Usage_iuhg_Jun28.22.16.16.csv'.

inputs.conf:

[monitor:///usr/local/ios/var/logs/PN_Usage_*.csv]
index = xyz
sourcetype = ios:pn:usage
#crcSalt = vmr
initCrcLength = 10000

props.conf:

[ios:pn:usage]
CHARSET = UTF-8
LINE_BREAKER = ([\r\n]+)\"\d+\-\d+\-\d+\_\d+\:\d+
MAX_TIMESTAMP_LOOKAHEAD = 17
NO_BINARY_CHECK = null
SHOULD_LINEMERGE = false
disabled = false
pulldown_type = true
TIME_FORMAT = %Y-%m-%d_%H:%M
TIME_PREFIX = \"

Sample events:

2022-07-14_15:35, PO@abc, InOctets, 4541070, OutOctets, 12763951, Total MB used, 2.163127625
2022-07-14_15:35, BE@abc, InOctets, 75945647, OutOctets, 650376983, Total MB used, 90.79032875

Are there any other settings I need to include or remove? Thanks in advance.
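[Editor's note and sketch: the WatchedFile line is INFO, not an error. With initCrcLength = 10000, any file shorter than 10,000 bytes cannot be fingerprinted, so Splunk re-reads it from the start each time it changes. If these CSVs are small, the usual fix (an assumption about your file sizes) is to drop initCrcLength toward its 256-byte minimum and salt the CRC with the path for the similarly named files; <SOURCE> is literal conf syntax, not a placeholder:

[monitor:///usr/local/ios/var/logs/PN_Usage_*.csv]
index = xyz
sourcetype = ios:pn:usage
crcSalt = <SOURCE>
initCrcLength = 256
]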
Let's say I have a multivalue fieldA and a fieldB. I know you can do something like "| where field=value" in a search, or put it in the first part of the search arguments, but is it possible to use all of the returned values of fieldA as the search values for fieldB?
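[Editor's sketch: a subsearch can do this by expanding fieldA's values, renaming to fieldB, and letting format turn them into an OR'ed filter. The index names are hypothetical:

index=target_idx
    [ search index=source_idx
      | mvexpand fieldA
      | dedup fieldA
      | rename fieldA as fieldB
      | fields fieldB
      | format ]

The subsearch output becomes (fieldB="v1") OR (fieldB="v2") OR ..., applied before the main search fetches events.]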