All Posts


Hi team, new user here. I was going through https://docs.splunk.com/Documentation/SplunkCloud/9.2.2403/Admin/ConfigureIPAllowList. I have the sc_admin role, I have also enabled token authentication, and my Splunk Cloud version is greater than 8.2.2201. I wanted to add certain IP addresses to the allow list; however, I don't see the option to add IP addresses (screenshot attached).
Thank you for your help. Sorry, I am not very familiar with coding at all. I am struggling through the Python script, but making progress. So the first API call I need to make is:

https://ServerName/nitro/v1/config/lbvserver

This gives me a JSON return with about 50 name values, like:

[
  { "name": "server1", "insertvserveripport": "OFF" },
  { "name": "server2", "insertvserveripport": "OFF" },
  ...
]

I need to make a secondary API call that uses these names in the API URL, like:

https://su-ns-vpx-int-1.siteone.com/nitro/v1/config/lbvserver_servicegroup_binding/(name)

I can't figure out how to capture the name values from the first API call so that the second API call can be executed for each of them. :( I hope that makes sense. Here's the actual code I am using:

import os
import sys
import time
import datetime
import json

'''
IMPORTANT
Edit only the validate_input and collect_events functions.
Do not edit any other part in this file.
This file is generated only once when creating the modular input.
'''

'''
# For advanced users, if you want to create single instance mod input, uncomment this method.
def use_single_instance_mode():
    return True
'''

def validate_input(helper, definition):
    """Implement your own validation logic to validate the input stanza configurations"""
    # This example accesses the modular input variable
    # global_account = definition.parameters.get('global_account', None)
    pass

def collect_events(helper, ew):
    # Note: for single instance mod input, args will be returned as a dict.
    # For multi instance mod input, args will be returned as a single value.
    opt_global_account = helper.get_arg('global_account')
    global_userdefined_global_var = helper.get_global_setting("userdefined_global_var")

    # The following example sends a REST request to some endpoint.
    url = "https://servername.com/nitro/v1/config/lbvserver"
    headers = {"authorization": "Basic bnNyb2dghdg90O0c2NsciEh"}
    final_result = []
    response = helper.send_http_request(url, 'GET', parameters=None, payload=None,
                                        headers=headers, cookies=None, verify=True,
                                        cert=None, timeout=None, use_proxy=True)

    # Get the response body as JSON. If the body text is not a JSON string, this raises a ValueError.
    r_json = response.json()
    for name in r_json["lbvserver"]:
        state = helper.get_check_point(name["name"])
        if state is None:
            final_result.append(name)
            helper.save_check_point(name["name"], "Indexed")
            helper.delete_check_point(name["name"])

    # Get the response status code.
    r_status = response.status_code
    if r_status != 200:
        # If the status is not successful, raise requests.HTTPError.
        response.raise_for_status()

    # Create a Splunk event.
    event = helper.new_event(json.dumps(final_result), time=None, host=None, index=None,
                             source=None, sourcetype=None, done=True, unbroken=True)
    ew.write_event(event)

Thanks again for responding, very much appreciated. Thanks, Tom
This traffic goes over the Internet and comes from different places, so we don't have a firewall blocking it at the moment.
Thanks, Giuseppe! We're going to try that.
I have reviewed the curl command syntax in the Details section of the Add-on download page but was not able to discern how to pass the following to the "| curl" command:
1) How can I pass the equivalent of -k or --insecure?
2) How do I pass 2 headers in the same command line?
From the Linux prompt, my command looks like this:

curl -X POST -H "Content-Type: application/json" -H "UUID: e42eed31-65bb-4283-ad05-33f18da75513" -k "https://abc.com/X1" -d "{ lots of data }"
I am trying to extract fields from this custom data but am unable to parse it with:

| extract kv pairdelim="  " kvdelim=" _: "

Sample data (one record per line):

Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123569 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 123456789 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 12345 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: UAT _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: NFT2 _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123570 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate _: data _: testing
Log _: Alert _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123571 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicable Tax Rate Info _: data _: testing
Log _: Inform _: 2024-07-28T15:00:00-06:01 _: 1234 _: duckcreek.medpro.com _: QA _: EDW Policy _: 1.2 _: 600123568 _: Quote Intake from Outlook _: intake_quote _: CreateCalculation.cpp _: CalculateTaxRate _: true _: Identify Applicatble Tax Rate _: data _: testing

Any help would be appreciated. Can someone please help me with parsing this, either from search or via props?
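Since the values are positional rather than key/value pairs, extract has no key names to work with. One search-time approach is to split each record on the delimiter and assign positional names (a sketch; the field names here are guesses from the sample, so rename them to suit):

| eval parts=split(_raw, " _: ")
| eval severity=mvindex(parts, 1), event_time=mvindex(parts, 2), id=mvindex(parts, 3), host_name=mvindex(parts, 4), environment=mvindex(parts, 5), application=mvindex(parts, 6)

If several records arrive in one event, you may also need to break them apart first, for example with LINE_BREAKER in props.conf keyed on the "Log _: " prefix.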
Splunk has default definitions for access_combined, access_combined_wcookie, access_common and apache_error. You can look at those (.../system/default/props.conf and transforms.conf). You could use these as a base for your own sourcetype definitions.
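For example, a custom sourcetype stanza in a local props.conf could reuse the shipped access-extractions transform (a sketch; the stanza name and timestamp settings are assumptions to adapt to your data):

[my_custom_access]
REPORT-access = access-extractions
TIME_PREFIX = \[
TIME_FORMAT = %d/%b/%Y:%H:%M:%S %z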
My laptop runs Splunk 9.2.2, so the version is not causing the problem. As to the asterisk in the field name, this has come up in previous discussions several times. Each time, it turns out that the name of the field contains some invisible characters, such as trailing whitespace. You mentioned your raw events are JSON; that makes this type of problem less likely. But still, check the original JSON documents as well as the ingested JSON events.
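One quick way to spot invisible characters in field names is to compare each name with its length, e.g. (a sketch to append to your search):

| fieldsummary
| eval name_length=len(field)
| table field name_length

A name that displays ten characters but reports a longer length contains hidden characters.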
Hello! If I'm understanding you right (and assuming you're using requests), you should be able to assign your API call to a variable and then call .json() on it to access the results of the call. When I tested this, it returned a dictionary of the results. From there, you should be able to parse out the values that you're reassigning. Example:

req = requests.post(url, json=payload, headers=headers)
data = req.json()  # parsed response body as a dictionary
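Building on that, here is a minimal sketch of chaining the two calls from the question (using requests; the host and credentials are placeholders from the post, and inside the modular input the same loop would go through helper.send_http_request):

import requests

BASE = "https://servername.com/nitro/v1/config"    # placeholder host from the post
HEADERS = {"authorization": "Basic <your-token>"}  # placeholder credentials

# First call: list all load-balancing vservers.
resp = requests.get(BASE + "/lbvserver", headers=HEADERS)
resp.raise_for_status()

# Second call: one request per "name" captured from the first response.
for vserver in resp.json()["lbvserver"]:
    name = vserver["name"]
    binding = requests.get(BASE + "/lbvserver_servicegroup_binding/" + name, headers=HEADERS)
    binding.raise_for_status()
    print(name, binding.json())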
Hi, there are a couple of workarounds for this. Here is one which uses loadjob: https://community.splunk.com/t5/Other-Usage/Why-export-button-is-grayed-out-in-some-of-my-panels/m-p/647806/highlight/true#M437 Another option is to just use outputlookup + inputlookup. I don't believe there will be any real fix for this, as Splunk has announced end of support for classic dashboards and is putting all its effort into Dashboard Studio. I expect the reason export is not working when you use a base search and utilize it in another panel is that there is only this one SID, which contains only the base search, not the post-process searches in the panels. For that reason, there is no output that can be exported from those other panels. They have probably decided that, as you can just run that query/panel in another window and export from there, it's not worth the cost to fix. r. Ismo
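For the outputlookup + inputlookup option, the idea is to write the base search results to a lookup and read them back in the panel you want to export from (my_results.csv is a placeholder name):

<your base search> | outputlookup my_results.csv

and then, in the exportable panel:

| inputlookup my_results.csv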
Hi @PickleRick, thank you for your reply. I set the date format at the time of uploading the CSV to %d/%m/%Y. By doing this, the _time values are no longer all the same and actually come from the Posted Transactions Date field. I figured it out in the end by realizing that a scatter chart is not suitable for date values on the X axis, so I changed it to a line chart and used this search query:

index=main source="Transaction File.csv"
| fields "Posted Transactions Date" _time "Debit Amount" "Spending Type"
| timechart sum("Debit Amount")

Regarding your comment on quotes, I have noticed that when comparing my searches to searches I have seen elsewhere. However, I seem to need double quotation marks around field names. For example:

source="Transaction File.csv" 'Debit Amount'="*" (returns no events)
source="Transaction File.csv" Debit Amount="*" (returns no events)

However:

source="Transaction File.csv" "Debit Amount"="*" (returns all events)
Thanks for the idea about the limits. I checked them, and they don't look like the cause in this case, though we have run into that before, where events with too much data didn't get indexed. Also, that looks like a very useful presentation.
I can't find any evidence that this is a simple data issue. We have many customers using queries like this, and it was only noticed a couple of weeks ago. One guess is that the behavior changed when we upgraded from Splunk 8.x to 9.x a couple of months ago. To further illustrate:

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857")
Result: 6618

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857" OR serviceType="unmanaged")
Result: 0

| tstats count where index="my_index" eventOrigin="api" (accountId="8674756857" OR serviceType="unmanaged" OR noSuchField="noSuchValue*")
Result: 6618

So adding a bogus OR term with an asterisk in the value returns the correct result, but without it the result is 0. I can't imagine this is correct behavior, and we have submitted a support request to Splunk.
For timechart to work you need to have a reasonable _time field. I suppose that you ingested the whole CSV file at once and didn't properly parse the time from your data, so your _time will be the same for all events, right? The way around it would be to overwrite the _time field with the contents of your Posted Transactions Date field:

index=<yourindex> ...
| eval _time=strptime('Posted Transactions Date',"%proper%time%format")
| timechart values('Debit Amount')

The key thing here is specifying the proper time format for the strptime command. Also be cautious about using proper quotes.
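For example, if the dates in the CSV look like 28/07/2024, the concrete version of that eval would be:

| eval _time=strptime('Posted Transactions Date',"%d/%m/%Y")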
That doesn't seem right. Those fields should not be empty, so something must be overwriting them at search time (or your indexes are damaged, but let's assume they aren't). Try:

| tstats count where index=dfini by source sourcetype host

That should show you what the indexed fields are. You have to check your search-time definitions to see what overwrites those values.
Hi All, HTTP Event Collector logs are coming into Splunk, but the host, source, and sourcetype are not showing in Splunk. Please find the screenshot below. Please help me.
Hi All, I am trying to create a scatter chart or similar in Dashboard Studio to show debit transaction amounts over time. A query like this works well in Search, but translates poorly to the dashboard:

source="Transaction File.csv" "Debit Amount"="*"
| stats values("Debit Amount") BY "Posted Transactions Date"

I am aware I likely need to convert the date from string format to date format within my search, something to the effect of:

| eval Date = strptime("Posted Transactions Date","%d/%m/%y")

But I am struggling to get the final result. I have also played around with using the _time field instead of the Posted Transactions Date field, and with timecharts, without success, which I think is likely also a formatting issue. E.g.:

source="Transaction File.csv"
| timechart values("Debit Amount")

As there are multiple debit amount values per day in some cases, I would ideally like a second, similar panel that sums these debits per day instead of showing them as individual values, whilst also removing one outlier debit amount value of 7000. I am struggling a bit with the search(es) required to get my desired dashboard results. Any help would be appreciated, thank you!
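A sketch of what the daily-sum panel search might look like, assuming the dates are in %d/%m/%Y format and using single quotes for field names inside eval/where:

source="Transaction File.csv" "Debit Amount"="*"
| eval _time=strptime('Posted Transactions Date',"%d/%m/%Y")
| where 'Debit Amount' != 7000
| timechart span=1d sum('Debit Amount') AS "Daily Debits"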
Exchange is a relatively big solution, so depending on what you want to ingest, the answer can vary. If you want just the message tracking logs, you can easily ingest them using a monitor input. I've never dealt with SMTP or OWA logs, so I can't tell you how those work, but I suppose they should also be relatively easy to read. The problem might be in parsing the data, of course. QRadar is simply different, so don't bring it in for comparison.
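For the message tracking logs, a monitor stanza in inputs.conf could look like this (a sketch; the path matches a default Exchange 2013+ installation, and the sourcetype and index names are placeholders to adjust):

[monitor://C:\Program Files\Microsoft\Exchange Server\V15\TransportRoles\Logs\MessageTracking\*.log]
sourcetype = msexchange:messagetracking
index = exchange
disabled = 0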
You didn't say what you have tried so far. Maybe you have some small, easily fixable mistake in your configs, or maybe your approach is completely wrong. Show us what you've got.
It's not extracting all of the data.