All Posts

If all else fails, it's always useful to check the job log and look at the lispy search. It might not solve the problem, but it can give valuable insight.
Peace be upon you. I am now running correlation searches, but I do not have data to fully test them. I want to activate them in order to protect the company from any attack. I have the MITRE ATT&CK Compliance Security Content, but I do not know where to start or how to organize my approach. I would appreciate any advice.
Well, Splunk can be a bit inconsistent sometimes about using quotes. But when you're referencing something as an argument to a function (or an rvalue in an assignment), double quotes mean that Splunk treats it as a literal string. So

| eval new_value="Posted Transaction Date"

would yield the literal string, not the field contents. (The same applies to strptime arguments.) But yes, in other places it can be a bit unobvious which form to use at any given point.
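To illustrate the difference, here is a minimal sketch using the field name from the example above (the new field names are just for illustration):

| eval as_literal="Posted Transaction Date"
| eval as_field_value='Posted Transaction Date'

The first line assigns the literal text; the second uses single quotes, which tell eval to reference the field named Posted Transaction Date and copy its contents.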
Hi Joe, there is a command documentation in default/searchbnf.conf:

[curl-command]
syntax = CURL [choice:URI=<uri> OR URIFIELD=<urifield>] [optional: METHOD=<GET|PATCH|POST|PUT|DELETE> VERIFYSSL=<TRUE|FALSE> DATAFIELD=<field_name> DATA=<data> HEADERFIELD=<json_header_field_name> HEADERS=<json_header> USER=<user> PASS=<password> DEBUG=<true|false> SPLUNKAUTH=<true|false> SPLUNKPASSWDNAME=<username_in_passwordsconf> SPLUNKPASSWDCONTEXT=<appcontext (optional)> TIMEOUT=<float>]

The equivalent of -k is VERIFYSSL=FALSE, and headers are passed as a JSON object, e.g.:

headers="{\"content-type\":\"application/json\"}"

best regards,
Andreas
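Putting that together for the curl command line from the question, a sketch based on the syntax above (the data payload is left as the original placeholder; adjust the JSON escaping to your actual data):

| curl URI="https://abc.com/X1" METHOD=POST VERIFYSSL=FALSE HEADERS="{\"Content-Type\": \"application/json\", \"UUID\": \"e42eed31-65bb-4283-ad05-33f18da75513\"}" DATA="{ lots of data }"

Both headers go into the single HEADERS argument as one JSON object, and VERIFYSSL=FALSE takes the place of -k/--insecure.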
Hi @chimuru84,
sorry, I misunderstood your requirement! Let me check my understanding: you want to know the users who connected to the third-party authentication in the last hour and who didn't make another connection in the last year, but did connect before that, is that correct?
First of all: how far back do you want to run your check, two years?
Then, when you say "authentication at the moment", do you mean the last hour, or something else?
With the above hypothesis, please try this:

index=...... earliest=-2y latest=-h [ search index=...... earliest=-h latest=now | dedup id | fields id ]
| eval Period=if(_time>now()-31536000, "last Year", "Previous Year")
| stats dc(Period) AS Period_count values(Period) AS Period BY id
| where Period_count=1 AND Period="Previous Year"
| table id

In this way, you get the users connected in the last hour whose most recent connection (excluding the last hour) was more than one year ago. If you need a different condition, you can adapt this approach.
Ciao.
Giuseppe
Hi @gcusello! I think I didn't ask the question correctly. I want to make a query that returns the users who had a third-party authentication (at the moment), and the last time they passed the authentication was 365 days ago.
It looks like the script is failing for the data being used. Depending on what the script actually does, you could try different time periods, e.g. hourly through the 5th, to see if you can backfill your summary index that way. Alternatively, try running the search that the script uses (without the collect command) to see if there are any errors.
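If the search itself runs cleanly, a minimal sketch of a manual backfill for a single hour could look like this (the time window and index name are placeholders, and <your summary search> stands for the search the script runs):

<your summary search> earliest="07/05/2024:00:00:00" latest="07/05/2024:01:00:00"
| collect index=your_summary_index

Run it hour by hour over the gap, checking the results of each run before moving on.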
We are using Splunk Cloud in our enterprise and, as part of an automation project, we want a programmatic way of running Splunk searches. Based on the Splunk website we found that there is a node module, splunk-sdk (https://www.npmjs.com/package/splunk-sdk), with which we can access Splunk, even though the module does not explicitly mention anything about Splunk Cloud. Following is the code we attempted, but it is failing to connect. We would like to know if any special configuration needs to be done in order to achieve the connection.

let splunkjs = require('splunk-sdk');
let service = new splunkjs.Service({username: "myusername", password: "***"});

async function myFunction() {
    try {
        await service.login();
        console.log("Login was successful");
        let jobs = await service.jobs().fetch();
        let jobList = jobs.list();
        for (let i = 0; i < jobList.length; i++) {
            console.log("Job " + i + ": " + jobList[i].sid);
        }
    } catch (err) {
        console.log(err);
    }
}

myFunction();

Following is the error we are getting. Please help in understanding and resolving this issue if anyone has encountered the same.
Do you have the Victoria or Classic experience? You could check it from Support -> About or something similar in the SCP GUI.
Just as an FYI, I find that a chart is typically most readable when the max value is around 3/4 of the Y-scale, so you might want to do something like this:

<eval token="chartmax">ceiling($result.max_duration$*4/300)*100</eval>
While kvdelim and pairdelim accept a string of characters as arguments, they will match against only one in each set. It essentially means, "one of these characters is between each pair". Try this workaround:

| rex mode=sed "s/ _: / = /g"
| extract kv pairdelim=" " kvdelim="="
Hi team, new user here. I was going through https://docs.splunk.com/Documentation/SplunkCloud/9.2.2403/Admin/ConfigureIPAllowList. I have the sc_admin role, I have also enabled token authentication, and my Splunk Cloud version is greater than 8.2.2201. I wanted to add certain IP addresses to the allow list; however, I don't see the option to add an IP address (screenshot attached).
Thank you for your help. Sorry, I am not very familiar at all with coding. I am struggling through the Python script, but making progress. So the first API call I need to make is:

"https://ServerName/nitro/v1/config/lbvserver"

This gives me a JSON return with about 50 name values like:

[
    {
        "name": "server1",
        "insertvserveripport": "OFF"
    },
    {
        "name": "server2",
        "insertvserveripport": "OFF"
    }
]

I need to make a secondary API call that would use these names in the API URL, like:

https://su-ns-vpx-int-1.siteone.com/nitro/v1/config/lbvserver_servicegroup_binding/(name)

I can't figure out how to capture the name values from the first API call in order for the second API call to be executed for each variable. : (

I hope that makes sense. Here's the actual code I am using:

import os
import sys
import time
import datetime
import json

'''
IMPORTANT
Edit only the validate_input and collect_events functions. Do not edit any other part in this file.
This file is generated only once when creating the modular input.
'''

'''
# For advanced users, if you want to create single instance mod input, uncomment this method.
def use_single_instance_mode():
    return True
'''

def validate_input(helper, definition):
    """Implement your own validation logic to validate the input stanza configurations"""
    # This example accesses the modular input variable
    # global_account = definition.parameters.get('global_account', None)
    pass

def collect_events(helper, ew):
    # Note, for single instance mod input, args will be returned as a dict.
    # For multi instance mod input, args will be returned as a single value.
    opt_global_account = helper.get_arg('global_account')
    global_userdefined_global_var = helper.get_global_setting("userdefined_global_var")

    # The following examples send rest requests to some endpoint.
    url = "https://servername.com/nitro/v1/config/lbvserver"
    headers = {"authorization": "Basic bnNyb2dghdg90O0c2NsciEh"}
    final_result = []
    response = helper.send_http_request(url, 'GET', parameters=None, payload=None, headers=headers, cookies=None, verify=True, cert=None, timeout=None, use_proxy=True)

    # get response body as json. If the body text is not a json string, raise a ValueError
    r_json = response.json()

    for name in r_json["lbvserver"]:
        state = helper.get_check_point(name["name"])
        if state is None:
            final_result.append(name)
            helper.save_check_point(name["name"], "Indexed")
            helper.delete_check_point(name["name"])

    # get response status code
    r_status = response.status_code
    if r_status != 200:
        # check the response status, if the status is not sucessful, raise requests.HTTPError
        response.raise_for_status()

    # To create a splunk event
    event = helper.new_event(json.dumps(final_result), time=None, host=None, index=None, source=None, sourcetype=None, done=True, unbroken=True)
    ew.write_event(event)

Thanks again for responding, very much appreciated.

Thanks,
Tom
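If it helps, here is a minimal sketch of how the two calls could be chained inside collect_events, reusing the same helper.send_http_request signature and headers as the code above (the base URL, the variable names, and the choice to index one event per vserver are assumptions to adapt to your environment):

base_url = "https://servername.com/nitro/v1/config"

# First call: list all load-balancing vservers
response = helper.send_http_request(base_url + "/lbvserver", 'GET', parameters=None, payload=None, headers=headers, cookies=None, verify=True, cert=None, timeout=None, use_proxy=True)
response.raise_for_status()

for vserver in response.json()["lbvserver"]:
    vserver_name = vserver["name"]
    # Second call: look up the service group bindings for this vserver name
    binding_url = base_url + "/lbvserver_servicegroup_binding/" + vserver_name
    binding_response = helper.send_http_request(binding_url, 'GET', parameters=None, payload=None, headers=headers, cookies=None, verify=True, cert=None, timeout=None, use_proxy=True)
    binding_response.raise_for_status()
    # Write one Splunk event per vserver containing its binding details
    event = helper.new_event(json.dumps(binding_response.json()), time=None, host=None, index=None, source=None, sourcetype=None, done=True, unbroken=True)
    ew.write_event(event)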
This is going through the Internet and from different places, so we don't have a FW blocking our traffic at the moment.
Thanks Giuseppe! We're going to try that.
I have reviewed the curl command syntax in the details section of the Add-on download page but was not able to discern how to pass the following to the "| curl" command:
1) How can I pass the equivalent of '-k' or '--insecure'?
2) How do I pass 2 headers in the same command line?
From the Linux prompt, my command looks like this:

curl -X POST -H "Content-Type: application/json" -H "UUID: e42eed31-65bb-4283-ad05-33f18da75513" -k "https://abc.com/X1" -d "{ lots of data }"
I am trying to extract fields for this custom data but am unable to parse it:

| extract kv pairdelim="  " kvdelim=" _: "

Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123569 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing

Any help would be appreciated. Can someone please help me with parsing this, either from search or using props?
Splunk has default definitions for access_combined, access_combined_wcookie, access_common and apache_error. You can look at those (.../system/default/props.conf and transforms.conf). You could use them as a base for your own sourcetype definitions.
My laptop runs Splunk 9.2.2, so the version is not causing a problem. As to the asterisk in the field name, this has come up in previous discussions several times. Each time, it turns out that the name of the field contains some invisible characters, like trailing white space. You mentioned your raw events are JSON. That makes this type of problem less likely. But still, check the original JSON documents as well as the ingested JSON events.
Hello! If I'm understanding you right (and assuming you're using requests), you should be able to assign your API call to a variable and then call variable.content to access the results of the call. The below, when I tested it, returned a dictionary of the results. From there, you should be able to parse out the variable that you're reassigning. Example:

req = requests.post(url, json=payload, headers=headers)
data = req.content