All Topics

I have JavaScript code whose job is to check checkboxes when a token value changes. This renders the checkbox:

```javascript
var CustomRangeRenderer = TableView.BaseCellRenderer.extend({
    canRender: function(cell) {
        return _(['Check for Prediction']).contains(cell.field);
    },
    render: function($td, cell) {
        var a = $('<div>')
            .attr({"id": "chk-sourcetype" + cell.value, "value": cell.value})
            .addClass('checkbox')
            .click(function() {
                if ($(this).attr('class') === "checkbox") {
                    selected_values_array.push($(this).attr('value'));
                    $(this).removeClass();
                    $(this).addClass("checkbox checked");
                } else {
                    $(this).removeClass();
                    $(this).addClass("checkbox");
                    var i = selected_values_array.indexOf($(this).attr('value'));
                    if (i != -1) {
                        selected_values_array.splice(i, 1);
                    }
                    // Change the value of a token $mytoken$
                }
                console.log(selected_values_array);
            })
            .appendTo($td);
    }
});
```

This is my on-change condition:

```javascript
tokens.on("change:mytoken2", function() {
    tokenVal = tokens.get("mytoken2");
    console.log("mytoken2onchange: " + tokenVal);
    if (tokenVal == 'value changed') {
        console.log('inside on:change...');
        var tableIDs = ["myTable"];
        for (i = 0; i < tableIDs.length; i++) {
            var sh = mvc.Components.get(tableIDs[i]);
            if (typeof(sh) != "undefined") {
                sh.getVisualization(function(tableView) {
                    // Add custom cell renderer and force re-render
                    tableView.table.addCellRenderer(new CustomRangeRenderer());
                    tableView.table.render();
                });
            }
        }
    }
});
```

This is the default state: Screenshot 1. On clicking a button, the token value changes, and as a result, wherever the 'Selected' column has the value 'Yes', the checkbox corresponding to that row must get enabled: Screenshot 2.

Please feel free to ask questions if anything is unclear.
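The click handler above keeps a list of selected values in sync with the checkbox state via push/indexOf/splice. As a Splunk-agnostic sketch of that same toggle logic (names are illustrative, not from any Splunk API), a set with a toggle helper expresses it compactly:

```python
# Minimal sketch of the selection-toggling logic in the click handler:
# add a value when it is newly checked, remove it when unchecked.
def toggle(selected, value):
    """Add value if absent, remove it if present; returns the set."""
    if value in selected:
        selected.discard(value)
    else:
        selected.add(value)
    return selected

selected = set()
toggle(selected, "sourcetypeA")  # checked
toggle(selected, "sourcetypeB")  # checked
toggle(selected, "sourcetypeA")  # unchecked again
```

The indexOf/splice pair in the JavaScript is the list equivalent of `discard` here.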
We swapped firewalls running version 8.1.15 and GlobalProtect 4.1.12 for a replacement running version 9.1.4 and GP 4.1.12. Prior to the swap, Add-on 6.3.0 was gathering data and the dashboards displayed it correctly. After upgrading the Add-on and App to 6.3.1, I'm still getting nothing in my GlobalProtect dashboard: I can see data from before the 9.1.4 upgrade, but nothing after it. I've checked the forwarding configuration and it is fine: BSD format and no timestamps appended. Not sure what I'm missing. Data models are at 100% and, other than GlobalProtect, seem to be OK.
I have the test index ready and it is receiving other API-related script outputs. However, when I try to set up a CSV input to the same index from a single universal forwarder, no results come back when I search for the CSV data. The path for my inputs.conf is SplunkUniversalForwarder\etc\system\local. I do not have any props.conf or outputs.conf alongside that inputs.conf; could that be a reason? I am more suspicious of not specifying a "current time" timestamp in props.conf, but I do not know how to accomplish that. My current inputs.conf:

```ini
[monitor://C:\Users\testuser\Desktop\Splunk_test.csv]
index = test
sourcetype = csv
interval = 300
```

I am new to monitoring files, so assistance on the matter would be very appreciated. Regards,
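If the goal is simply to stamp each CSV row with the time of indexing rather than a timestamp parsed from the file, props.conf has a setting for exactly that: `DATETIME_CONFIG = CURRENT`. A sketch, not a definitive configuration — the stanza name matches the sourcetype in the inputs.conf above, and with `INDEXED_EXTRACTIONS` the universal forwarder parses structured files itself, so this props.conf would go on the forwarder (otherwise timestamp settings apply where parsing happens, i.e. the indexer or heavy forwarder):

```ini
[csv]
INDEXED_EXTRACTIONS = csv
DATETIME_CONFIG = CURRENT
```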
Need to plot a time chart for a given mac with the average of Mem0, Mem1, CPU0, CPU1 and CPU2.

```
(index=metrics OR index=hc_trials OR index=hc_prod) (HCTELEM OR HCJUNK) uptime>1800
| fields + payload version deviceid mac uptime
| eval payload=replace(payload, "\"\"", "\"")
| spath input=payload output=Mem0 path=Mem{0}
| spath input=payload output=Mem1 path=Mem{1}
| spath input=payload output=CPU0 path=CPU{0}
| spath input=payload output=CPU1 path=CPU{1}
| spath input=payload output=CPU2 path=CPU{2}
| table mac Mem0 Mem1 CPU0 CPU1 CPU2
```

@thambisetty
Hello, do you know how I can set HttpOnly and Secure to true on the login cookie? Our security team has requested it. This happens with Splunk local account login. I already have web.conf correctly set to true for httponly and secure. Thanks a lot!! Regards, Daniel
Hi, I am trying to write a query that picks the maximum TPS count of each host (three hosts) and the time when that maximum count was reported. There may be many instances where the count is the same at more than one time; in that case I would choose the latest time. The query below picks the maximum count per host, but I am unable to output the time at which it was maximum.

```
index=xyz
| timechart span=1s cont=false count BY host_name
| untable _time host_name count
| stats max(count) as count BY host_name
```

So I need help including _time in my output (as I said, the count might be the same at different times; in that case I need the latest time written to the output along with the host and maximum count). OUTPUT:
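The tie-break the post describes — maximum count, and the latest time when counts tie — is exactly lexicographic comparison on the pair (count, time). A plain-Python sketch of that rule (row layout is illustrative, not the actual search output):

```python
# Per host, keep the (count, time) pair that is maximal by count, and by
# latest time when counts tie: tuple comparison gives both at once.
def max_with_latest(rows):
    """rows: iterable of (host, epoch_time, count); returns {host: (count, time)}."""
    best = {}
    for host, t, c in rows:
        if host not in best or (c, t) > best[host]:
            best[host] = (c, t)
    return best

rows = [("h1", 10, 5), ("h1", 20, 5), ("h1", 15, 3), ("h2", 5, 7)]
result = max_with_latest(rows)
# h1 hit count 5 at times 10 and 20, so the latest (20) wins.
```

In SPL, one common equivalent of this tie-break is sorting by count then _time descending and taking the first row per host (e.g. with dedup), which applies the same tuple ordering.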
Hello everyone, I'm new to your community, thank you for the welcome. I need to display a map with several fields for each data point. I had done this for display with a single field (it works!):

```
| inputlookup data.csv
| search agence_rattachement="*" AND code_client_groupe=* AND nom_site=* AND id_departement=*
| lookup villes_france.csv nom_reel AS ville_site OUTPUTNEW longitude_dgr,latitude_dgr
| lookup data-2.csv nom_site_rattachement AS nom_site OUTPUTNEW nombre_compresseur, numero_centrale
| geostats latfield=latitude_dgr longfield=longitude_dgr count by nom_site
```

But with a multitude of fields, it no longer works :/ I have the data (see picture) but the map contains no points:

```
| inputlookup data.csv
| search agence_rattachement="*" AND code_client_groupe=* AND nom_site=* AND id_departement=*
| lookup villes_france.csv nom_reel AS ville_site OUTPUTNEW longitude_dgr,latitude_dgr
| lookup data_2.csv nom_site_rattachement AS nom_site OUTPUTNEW nombre_compresseur, numero_centrale
| geostats latfield=latitude_dgr longfield=longitude_dgr translatetoxy=false count by nom_site
```

Thank you!!!! Have a good day.
There are two buttons on my dashboard. On clicking the Submit button, the other button should be disabled. Can anyone help with this?
Hi Splunkers! We have installed the SailPoint Adaptive Response app on a heavy forwarder, but when setting up the inputs we are faced with a certificate verify error:

```
2020-09-18 10:01:47,091 ERROR pid=31820 tid=MainThread file=base_modinput.py:log_error:307 | Get error when collecting events.
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/modinput_wrapper/base_modinput.py", line 127, in stream_events
    self.collect_events(ew)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/sailpoint_identitynow_auditevent.py", line 72, in collect_events
    input_module.collect_events(self, ew)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/input_module_sailpoint_identitynow_auditevent.py", line 225, in collect_events
    token_response = helper.send_http_request(token_url, "POST", parameters=tokenparams, payload=None, headers=None, cookies=None, verify=True, cert=None, timeout=None, use_proxy=use_proxy)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/modinput_wrapper/base_modinput.py", line 476, in send_http_request
    proxy_uri=self._get_proxy_uri() if use_proxy else None)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/splunk_aoblib/rest_helper.py", line 43, in send_http_request
    return self.http_session.request(method, url, **requests_args)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/requests/sessions.py", line 488, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/requests/sessions.py", line 609, in send
    r = adapter.send(request, **kwargs)
  File "/opt/splunk/etc/apps/TA-sailpoint-identitynow-auditevent-add-on/bin/ta_sailpoint_identitynow_auditevent_add_on/requests/adapters.py", line 497, in send
    raise SSLError(e, request=request)
SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:742)
```

Does anyone know where or how to specify the certificate location or keystores in the SailPoint Adaptive Response app? I believe the verification is failing because the certificate in use is issued by an internal root CA. We have installed the root certificate in the OS keystore but still no luck. Any assistance would be appreciated!
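One detail worth noting from the traceback: the add-on calls a bundled copy of requests with `verify=True`, and requests verifies against the CA bundle it ships with, not the OS keystore — which would explain why installing the root cert system-wide did not help. A generic stdlib sketch of trusting a custom CA (the PEM path is hypothetical, and this is not the app's own configuration mechanism):

```python
import ssl

# Hypothetical location of the internal root CA, PEM-encoded.
CA_BUNDLE = "/opt/splunk/etc/auth/internal_root_ca.pem"

def make_context(cafile=None):
    """TLS context trusting `cafile` if given, else the platform defaults."""
    return ssl.create_default_context(cafile=cafile)

# For requests-based code, the usual knob is the environment variable:
#   export REQUESTS_CA_BUNDLE=/opt/splunk/etc/auth/internal_root_ca.pem
ctx = make_context()  # platform defaults; pass cafile=CA_BUNDLE to pin the CA
```

requests honours the `REQUESTS_CA_BUNDLE` environment variable at runtime, so exporting it in the environment the modular input runs under is one low-touch thing to try.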
I need some help with parsing Forcepoint CASB CEF logs in Splunk. The data does not parse the epoch timestamps and all comes in as one event. I need to break these up into individual events and also parse the epoch timestamp in the format "%Y-%m-%d %H:%M:%S" on ingestion into Splunk.

SAMPLE DATA

```
CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|63575192763|Activity|0|act=Monitor app= cat=Normal Activity cs1= destinationServiceName=Office365 deviceExternalId= deviceFacility=true deviceProcessName=loadBalancers dhost= dpriv=User dst=0.0.0.0 duser=14dd43c6-a792-4c07-a33b-c5e561a129de dvc=10.1.2.12 dvchost=somedomain end=1600408516000 externalId=0 fsize=-1 msg=//United States/Unknown outcome=Success proto= reason=modify request= requestClientApplication=Unknown/Unknown/"" rt=1600408516000 sourceServiceName=Unmanaged src=someIP start=1600408516000 suser= cs2= cs3= cs5=false cs6= dproc=Unknown flexString1=mc_rg-int-qa-eus02_aks-int01-qa-eus02_eastus2,kubernetes-internal cs4=14dd43c6-a792-4c07-a33b-c5e561a129de flexString2= AD.ThreatRadarCategory= AD.TORNetworks= AD.MaliciousIPs= AD.AnonymousProxies= AD.IPChain=someIP AD.IPOrigin=External AD.samAccountName=14dd43c6-a792-4c07-a33b-c5e561a129de
CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|63575216734|Activity|0|act=Monitor app= cat=Normal Activity cs1= destinationServiceName=Office365 deviceExternalId= deviceFacility=false deviceProcessName= dhost= dpriv=User dst=0.0.0.0 duser=someuser_849fcb2777e9@somedomain.onmicrosoft.com dvc=10.1.2.12 dvchost=somedomain end=1600406172000 externalId=0 fsize=-1 msg=//United States/Unknown outcome=Success proto= reason=login request= requestClientApplication=Unknown/Unknown/"" rt=1600406172000 sourceServiceName=Unmanaged src=someIP start=1600406172000 suser= cs2= cs3= cs5=false cs6= dproc=Unknown flexString1= cs4=someaccount9@somedomain.onmicrosoft.com flexString2= AD.ThreatRadarCategory= AD.TORNetworks= AD.MaliciousIPs= AD.AnonymousProxies= AD.IPChain=someIP AD.IPOrigin=Internal AD.samAccountName=someaccount9@somedomain.onmicrosoft.com
CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|63575216736|Activity|0|act=Monitor app= cat=Normal Activity cs1= destinationServiceName=Office365 deviceExternalId= deviceFacility=true deviceProcessName= dhost= dpriv=User dst=0.0.0.0 duser=someaccount@somedomain.onmicrosoft.com dvc=10.1.2.12 dvchost=somedomain end=1600405713000 externalId=0 fsize=-1 msg=//United States/Unknown outcome=Success proto= reason=login request= requestClientApplication=Unknown/Unknown/"" rt=1600405713000 sourceServiceName=Unmanaged src=someIP start=1600405713000 suser= cs2= cs3= cs5=false cs6= dproc=Unknown flexString1= cs4=someaccount@somedomain.onmicrosoft.com flexString2= AD.ThreatRadarCategory= AD.TORNetworks= AD.MaliciousIPs= AD.AnonymousProxies= AD.IPChain=someIP AD.IPOrigin=External AD.samAccountName=someaccount@somedomain.onmicrosoft.com
CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|63575216738|Activity|0|act=Monitor app= cat=Normal Activity cs1= destinationServiceName=Office365 deviceExternalId= deviceFacility=false deviceProcessName= dhost= dpriv=User dst=0.0.0.0 duser=someaccount@somedomain.com dvc=10.1.2.12 dvchost=somedomain end=1600405674000 externalId=0 fsize=-1 msg=/1225/United States/Unknown outcome=Success proto= reason=login request= requestClientApplication=Unknown/Unknown/"" rt=1600405674000 sourceServiceName=Unmanaged src=someIP start=1600405674000 suser= cs2= cs3= cs5=false cs6= dproc=Unknown flexString1= cs4=someaccount flexString2= AD.ThreatRadarCategory= AD.TORNetworks= AD.MaliciousIPs= AD.AnonymousProxies= AD.IPChain=someIP AD.IPOrigin=Internal AD.samAccountName=someaccount
CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|63575216735|Activity|0|act=Monitor app= cat=Normal Activity cs1= destinationServiceName=Office365 deviceExternalId= deviceFacility=false deviceProcessName= dhost= dpriv=User dst=0.0.0.0 duser=someaccount9@somedomain.onmicrosoft.com dvc=10.1.2.12 dvchost=somedomain end=1600406165000 externalId=0 fsize=-1 msg=//United States/Unknown outcome=Success proto= reason=login request= requestClientApplication=Unknown/Unknown/"" rt=1600406165000 sourceServiceName=Unmanaged src=someIP start=1600406165000 suser= cs2= cs3= cs5=false cs6= dproc=Unknown flexString1= cs4=someaccount@somedomain.onmicrosoft.com flexString2= AD.ThreatRadarCategory= AD.TORNetworks= AD.MaliciousIPs= AD.AnonymousProxies= AD.IPChain=someIP AD.IPOrigin=Internal AD.samAccountName=someaccount@somedomain.onmicrosoft.com
```

Assume that the sourcetype is called "cefevents" for this example.

PROPS.CONF

```ini
[cefevents]
LINE_BREAKER = CEF:0.+[\r\n]?
MAX_TIMESTAMP_LOOKAHEAD = 600
NO_BINARY_CHECK = true
KV_MODE = none
SHOULD_LINEMERGE = false
TIME_FORMAT = %s%Q
TIME_PREFIX = \s(start|end|rt)\=
```

The regex looks OK on regex101 (https://regex101.com/r/hf7ZJs/1) but doesn't work on this data. I have also attempted using the props.conf from https://splunkbase.splunk.com/app/487/, but that does not help either.
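One likely issue with the posted props.conf: Splunk's `LINE_BREAKER` must contain a capturing group marking the text to discard between events, and `CEF:0.+[\r\n]?` has none, which would explain why the stream never splits; a commonly used pattern is an empty group immediately before the header, e.g. `LINE_BREAKER = ()CEF:0\|`. The intended behaviour — break on each `CEF:0|` header, then render the millisecond epoch in `rt=` as `%Y-%m-%d %H:%M:%S` — can be sketched outside Splunk like this (abbreviated records, not the real parsing pipeline):

```python
import re
from datetime import datetime, timezone

# Two abbreviated records concatenated with no newline between them,
# mimicking how the data currently arrives as one event.
sample = ("CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|1|Activity|0|act=Monitor rt=1600408516000 "
          "CEF:0|Forcepoint CASB|Cloud Service Monitoring|1.0|2|Activity|0|act=Monitor rt=1600406172000 ")

# Break on each CEF header (what LINE_BREAKER should achieve) ...
events = [e for e in re.split(r"(?=CEF:0\|)", sample) if e]

# ... then convert the millisecond epoch in rt= (TIME_FORMAT %s%Q) to a
# human-readable timestamp; UTC is assumed here.
stamps = []
for e in events:
    m = re.search(r"\brt=(\d+)", e)
    ts = datetime.fromtimestamp(int(m.group(1)) / 1000, tz=timezone.utc)
    stamps.append(ts.strftime("%Y-%m-%d %H:%M:%S"))
```

In Splunk itself the timestamp conversion stays in props.conf (`TIME_PREFIX`/`TIME_FORMAT` as posted); only the missing capture group in `LINE_BREAKER` should need fixing.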
Hi, I am trying to replace a hard-coded epoch value in the where clause of a datamodel search with a variable, but it is not working. Please help, as I have tried different options without success:

```
from datamodel=Qualys_prod_ext.Qualys_prod where (nodename = Qualys_prod) Qualys_prod.QID=* Qualys_prod.IP=* Qualys_prod.owner="SRE-DIS-ECO-FEA" Qualys_prod.managed=* Qualys_prod.sev="*" Qualys_prod.LAST_FOUND_DATETIME_EPOCH <1600411282 AND Qualys_prod.LAST_FOUND_DATETIME_EPOCH > 1596808800 groupby Qualys_prod.IP, Qualys_prod.signature, Qualys_prod.owner, Qualys_prod.QID, Qualys_prod.CVSS_CUSTOM, Qualys_prod.FIRST_FOUND_DATETIME
| search Qualys_prod.STATUS=* NOT Qualys_prod.STATUS=FIXED
```

I want to change `Qualys_prod.LAST_FOUND_DATETIME_EPOCH < 1600411282` to `Qualys_prod.LAST_FOUND_DATETIME_EPOCH < epochtime`, where epochtime is a variable, but I get a where-clause error. I defined the variable with `| eval epochtime=now()`, but that didn't help.
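Independent of the SPL syntax question, the two literals are just epoch bounds for a look-back window (the pasted values are roughly 42 days apart), so they can be computed rather than hard-coded. A plain-Python sketch of that arithmetic (the 42-day window is an assumption for illustration; in SPL the same integers come from `now()` or `relative_time()` in an `eval`):

```python
import time

# Upper and lower epoch-second bounds for an N-day look-back ending "now".
def epoch_window(days_back, now=None):
    now = int(now if now is not None else time.time())
    return now - days_back * 86400, now

# Pinning "now" to the post's upper bound just to show the shape of the result.
start, end = epoch_window(42, now=1600411282)
# start/end can then stand in for the hard-coded literals in the where clause.
```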
In my query, Support_tier is a field with three-digit numbers, e.g. 102, 103, 901, 802, 908, 103, 107, and HR_Country is America. My requirement is for the Assignee field to be set as follows: if Support_tier ends in 01 or 02 and HR_Country is America, then Assignee should be 'Earl Vieuten'; apart from this, all records where HR_Country is America should have Assignee set to 'Terri'.
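The stated rule is a two-branch conditional on the tier suffix and the country. A minimal sketch, assuming Support_tier arrives as a string like "102" (the function name is illustrative):

```python
# First matching branch wins, mirroring how a case() expression evaluates.
def assignee(support_tier, hr_country):
    if hr_country == "America" and support_tier.endswith(("01", "02")):
        return "Earl Vieuten"
    if hr_country == "America":
        return "Terri"
    return None  # the rule says nothing about other countries
```

In SPL this maps naturally to an `eval` with `case()`, e.g. matching the suffix with a regex like `0[12]$` for the first branch and falling through to the plain `HR_Country="America"` test for the second.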
Hi, I tried the query below to fit my model:

```
sourcetype=files command="*cmd.exe*" earliest=-90d@d latest=-1d@d
| stats count values(file_path) values(user_name) values(action) by device_name, command
| fit DensityFunction count by "device_name,command,user_name" into mymodel threashold=0.05 dist=norm
```

I am getting the following error:

```
Error in 'fit' command: Error while initializing algorithm "DensityFunction": Algorithm "DensityFunction" cannot be loaded
```

I have tried the LocalOutlierFactor algorithm too but am getting the same error. Please suggest.
Hi, I am not able to set up the email server settings, and the Splunk Smart Export app (https://splunkbase.splunk.com/app/4030/) always returns me to the same page to fill in the email settings. Is there any other app I can use to send PDFs or screenshots? I am looking for a customized PDF, not the regular scheduled PDF of a dashboard.
I have two source files, as below, with different dates:
1. 17092020wlslog
2. 18092020wlslog
Now I want to merge these two files so they are processed under the wlslog file name. Before coming to Splunk, these two files
Hi, I am trying to obtain user-lockout events (4740) while performing a join with failed-password events (4625, logon type 2), and obtain a count of the failed logons on the same day to display with each lockout event. E.g.:

User = ForgetfulFrank, Event = 4740, Failed Password Attempts = 5
User = PasswordPete, Event = 4740, Failed Password Attempts = 2

I am using the following query:

```
index=wineventlog sourcetype=WinEventLog source="WinEventLog:Security" EventCode=4740 AND user="stuarj*"
| eval User = user
| eval eventTime=strftime(_time, "%Y%m%d")
| eval source_account = mvindex(Account_Name,0)
| eval target_account = mvindex(Account_Name,1)
| join User, eventTime
    [search index=wineventlog sourcetype=WinEventLog source="WinEventLog:Security" EventCode=4625 AND Logon_Type=2
    | eval UserList=split(user,"@")
    | eval User=mvindex(UserList,0)
    | eval eventTime=strftime(_time, "%Y%m%d")
    | eventstats count(RecordNumber) AS "Failed Password Attempts"
    | fields UserList, User, eventTime, BPT, "Failed Password Attempts"]
| rename dest_nt_domain AS "Domain", ComputerName AS "Actioned Server", Caller_Computer_Name AS "Lockout source", source_account AS "Admin Account", target_account AS "Account Locked", EventCode AS "Event Code", status AS "Status", "EventCodeDescription" AS "Description", _time AS "Time"
| convert timeformat="%d/%m/%Y %H:%M:%S" ctime("Time")
| table Domain, "Actioned Server", "Lockout source", "Admin Account", "Account Locked", "Failed Password Attempts", "Event Code", Status, Description, Time
| sort Time
```

Currently, the value of Failed Password Attempts is the total number of records returned by the joined query, not the count specific to the user the join is performed on. I have tried adding a where clause to restrict results to the user from the first query, e.g. `| where User=user`, but this has not helped. How do I obtain the number of records from the second query specific to the fields being joined on? This is easily done in SQL, but Splunk seems to behave differently.
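The likely culprit is the `eventstats count(RecordNumber)` in the subsearch: with no `by` clause it computes one total over all 4625 events, so every joined row receives the same figure; grouping it by the join key (`eventstats count ... by User, eventTime` in SPL) keeps one count per user per day. A minimal sketch of the difference (sample names taken from the post):

```python
from collections import Counter

# Simulated 4625 events: one entry per failed logon, keyed by user.
failed_logons = ["ForgetfulFrank"] * 5 + ["PasswordPete"] * 2

# An ungrouped count is a single number shared by every row after the join ...
global_count = len(failed_logons)

# ... while a count grouped by the join key gives one figure per user,
# which is what the desired output shows.
per_user = Counter(failed_logons)
```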
Is there a need to upgrade the forwarders as well during the Splunk version upgrade process?
Hello. I am trying to test a sourcetype using "oneshot". Although we were able to add raw data using "oneshot" the first time, we are not seeing any subsequent updates. The command we are using is:

```
splunk add oneshot /tmp/<filename>.txt -index <indexname> -sourcetype <sourcetypename>
```

What are the best approaches to troubleshooting "oneshot"? Regards, Max
Hi All, I want to customize the navigation menu in my custom app to display five different reports in a drop-down. I followed the documentation at https://dev.splunk.com/enterprise/docs/developapps/createapps/addnavsplunkapp/ and was able to display dashboards, but reports are not getting displayed. I tried using `<saved name="report1" />` and `<saved source="reports" match="xyz"/>`, but no luck. I am looking for help on how to display reports as a drop-down in the navigation menu.
If I have the add-on installed on my heavy forwarder and search heads, is there any need to install it on my indexers as well?