All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello, I wrote a script that returns a list of certificates in Excel form, and then I display only the certificates that are about to expire, with their remaining validity in days. However, I would like to display "No certificate about to expire" when no values are found, rather than the message shown below. Do you know if this is possible?

index="index_pki" sourcetype="splunk_csv" AND (Template=FVE_ServerWeb OR Template=1.3.6.1.4.1.311.21.8.4247237.15172642.2378160.7384375.2155270.77.16524867.13256529 OR Template=FVE_ServerWeb_2Years)
| fields ReqID CN Template Validity NotAfter NotBefore San Tumbprint Requester_Name
| dedup ReqID CN
| where Validity < 30
| sort Validity
| table ReqID CN Template Validity NotAfter NotBefore San Tumbprint Requester_Name

Thank you. Regards, Miguel
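One possible approach, sketched under the assumption that an empty result set should be replaced by a single placeholder row: appendpipe runs a subpipeline over the results produced so far, so counting them lets the search inject the message only when nothing matched. The choice of CN as the column to carry the text is arbitrary.

index="index_pki" sourcetype="splunk_csv" AND (Template=FVE_ServerWeb OR Template=1.3.6.1.4.1.311.21.8.4247237.15172642.2378160.7384375.2155270.77.16524867.13256529 OR Template=FVE_ServerWeb_2Years)
| dedup ReqID CN
| where Validity < 30
| sort Validity
| table ReqID CN Template Validity NotAfter NotBefore San Tumbprint Requester_Name
| appendpipe
    [ stats count
    | where count=0
    | eval CN="No certificate about to expire"
    | fields - count ]

When at least one certificate matches, the where count=0 clause discards the appended row, so the table is unchanged; when nothing matches, a single row with the placeholder text is returned instead of the default no-results message.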
I want to present a dashboard that shows the flow of a bill (in a shop) from the end user to an ERP system. There are five different steps the bill flows through. I want a single dashboard where a user can see the status of every step pictorially: the status of each step (success/failure) and the status of the servers involved in that step. Is there a way to represent this?
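Whatever visualization is chosen, each panel of such a dashboard usually sits on top of a search that reduces the raw events to one status per bill and step. A rough sketch of such a base search, where the index, sourcetype, and field names (bill_id, step, status) are all invented for illustration:

index=bill_flow sourcetype=erp_bill_events
| stats latest(status) AS step_status BY bill_id step
| xyseries bill_id step step_status

The result is one row per bill and one column per step, which can drive a table panel with color formatting, one single-value panel per step, or a custom flow-diagram visualization from Splunkbase.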
I have a date field in "%m/%d/%Y" format. I need to find the week number of this date and then find the same week number in the previous month. For example: my_date = 07/13/2020 is a day in the 3rd week of July 2020 (07/12/2020 to 07/18/2020), so I need to get the values for the 3rd week of June 2020 (06/14/2020 to 06/20/2020). @All, please help; a quick response is appreciated. @niketnilay @lspringer @carasso
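A sketch of one way to compute this with eval and relative_time, assuming weeks run Sunday through Saturday and that "week 1" is the week containing the 1st of the month (which matches the example); note that the fixed 604800-second arithmetic can drift by an hour across DST changes:

| eval my_time=strptime(my_date, "%m/%d/%Y")
| eval week_start=relative_time(my_time, "@w0")
| eval first_week_start=relative_time(relative_time(my_time, "@mon"), "@w0")
| eval week_of_month=floor((week_start - first_week_start) / 604800) + 1
| eval prev_first_week_start=relative_time(relative_time(my_time, "-1mon@mon"), "@w0")
| eval prev_week_start=prev_first_week_start + (week_of_month - 1) * 604800
| eval prev_week_range=strftime(prev_week_start, "%m/%d/%Y") . " to " . strftime(prev_week_start + 6*86400, "%m/%d/%Y")

For my_date = 07/13/2020 this gives week_of_month = 3 and prev_week_range = "06/14/2020 to 06/20/2020"; prev_week_start and prev_week_start + 7*86400 can then be used as earliest/latest bounds for a second search over that week of the previous month.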
Basically, I'd like to have a ticket created whenever an alert is triggered or when a Submit button is pressed. The Remedy platform already has an API for this, and I tested a POST request (using Postman) with the required arguments; a ticket was created successfully. The alert I have created works fine (e.g. the email is sent), but the question is: where and how can I wire those ticketing "arguments" into my Splunk script/alert? I've read about webhooks but am not sure how to progress further. Below is a sample of the POST body I sent to the Remedy endpoint using Postman.

{
  "Customer": "na\\johndoe",
  "Contact": "",
  "Summary": "Group Removed",
  "Notes": "The Group ABC.LG was removed from Administrator",
  "Priority": "High",
  "Work_Order_Type": "General",
  "Status": "Assigned",
  "Service": "Applications - Shared Platforms",
  "Assignee": "Tom Baker",
  "Group_Assign": "Analysis and Reporting"
}

Any step-by-step guidance would be much appreciated.
Hi, I have a list of tasks with their start and end dates in an index, and a list of holidays in a lookup file. I want to find the difference between the start and end dates, excluding any days that are holidays falling between the start and end dates.

Task  Start      End        Days
A     5-Jan-20   5-Feb-20
B     15-Jan-20  5-Mar-20
C     1-Apr-20   30-Apr-20

Holidays: 1-Jan-20, 27-Jan-20, 10-Apr-20, 11-Apr-20, 12-Apr-20, 13-Apr-20, ...

I am currently doing this with appendcols and mv commands, which makes it very slow. I am looking for a more direct/easier/performance-optimized solution:

| appendcols [ inputlookup Holidays.csv | table Holidays ]
| reverse
| eventstats list(Holidays) as Holidays delim="|"
| nomv Holidays
| eval between=mvrange(start, end, "1d"), date_diff=strftime(between, "%d/%m/%y %H:%M:%S")
| mvexpand date_diff
| where NOT match(date_diff, Holidays)
| stats list(date_diff) AS date_diff BY tasks
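One hedged alternative is to expand each task into its individual days and check each day against the lookup directly, which avoids appendcols and the regex built by nomv. This is only a sketch: it assumes the holiday column in Holidays.csv is named Holidays, that dates use the %d-%b-%y format shown above, and that the task fields are named Task, Start, and End.

| eval start_t=strptime(Start, "%d-%b-%y"), end_t=strptime(End, "%d-%b-%y")
| eval day=mvrange(start_t, end_t, 86400)
| mvexpand day
| eval day_str=ltrim(strftime(day, "%d-%b-%y"), "0")
| lookup Holidays.csv Holidays AS day_str OUTPUT Holidays AS is_holiday
| where isnull(is_holiday)
| stats count AS Days BY Task Start End

Each task row becomes one row per calendar day, the lookup marks days that appear in the holiday list, and the final stats counts the days that remain; the ltrim strips the leading zero so that single-digit days match entries written like 5-Jan-20.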
Hi Splunk community, I have a set of data under an index. I want to share part, but not all, of the data in this index with another user. Can this be done, and how? Thanks in advance.
Is the Hashicorp Vault App on Splunkbase a free app or is there a fee?
I'm interested in FISMA-compliant threat detection and mitigation software to upgrade network defense for a government defense contract. Where do I look for information? I do not want a demo.
I have a problem with parsing, so I want to change the sourcetype, e.g. index=A sourcetype=A → index=A sourcetype=B. I am using a forwarder and restarted it after changing the sourcetype in inputs.conf. However, the logs still flow into the existing sourcetype. How can I solve this?
Hello, I am getting the following error while inserting an incident into ServiceNow through the Splunk Add-on (the connectivity between Splunk and ServiceNow is established, and I am able to retrieve incidents in Splunk):

command="snowincidentstream", Failed to create ticket. Return code is 400 (Bad Request). One of the possible causes of failure is absence of event management plugin or Splunk Integration plugin on the ServiceNow instance. To fix the issue install the plugin(s) on ServiceNow instance.

Search:

source="cpu_data_updated_1.csv"
| where CPU___Usage >= 47
| eval contact_type="email"
| eval account="splunk_snow_dev"
| eval contact_type="email"
| eval custom_fields="u_affected_user=nobody||u_caller_id=12345"
| eval ci_identifier=host
| eval priority=1
| eval category="Software"
| eval subcategory="database"
| eval short_description="CPU on ". host ." is at ". CPU___Usage
| table account, category, subcategory, short_description, contact_type, custom_fields, ci_identifier, priority
| snowincidentstream

I am getting this even after installing both plugins and following the instructions at https://docs.splunk.com/Documentation/AddOns/released/ServiceNow/ConfigureServiceNowtointegratewithSplunkEnterprise

Regards
Hi, I created a custom search command following the instructions on this page and it was working fine (https://www.splunk.com/en_us/blog/tips-and-tricks/write-your-own-search-language.html), but it suddenly stopped working. I added some file-creation statements to debug; the files are not created, so it looks as if the Python program is not running at all.

Python code (C:\Program Files\Splunk\etc\apps\search\bin):

import pandas as pd
import splunk.Intersplunk

def getShape(text):
    phrase1 = "upload"
    phrase2 = "TrustedInstaller"
    description = []
    if phrase1 in text:
        description.append("Infra")
    elif phrase2 in text:
        description.append("InstallationGroup")
    else:
        description.append("Misc")
    # debug: write a file to prove this function is being called
    Corpus = pd.read_csv(r"E:\corpus_single.csv", encoding='latin-1')
    Corpus.to_csv(r'E:\corpus_func.csv', index=False)
    if len(description) == 0:
        return "normal"
    return "_".join(description)

# get the previous search results
results, unused1, unused2 = splunk.Intersplunk.getOrganizedResults()

# debug: write a file to prove the script is running at all
Corpus = pd.read_csv(r"E:\corpus_single.csv", encoding='latin-1')
Corpus.to_csv(r'E:\corpus_out.csv', index=False)

# for each result, add an 'assignmentgrp' attribute calculated from the raw event text
for result in results:
    result["assignmentgrp"] = getShape(result["Message"])

# output results
splunk.Intersplunk.outputResults(results)

Entry in command.conf (folder: C:\Program Files\Splunk\etc\apps\search\default):

[getgroup]
filename = getgroup.py

Search query:

source="winlog1.txt"
| rex field=_raw "Message: <(?<Message>.*)>"
| dedup Message
| table Message, getgroup

winlog1.txt sample data (around 10 records):

2016-09-28 04:30:31, Info Message: <Ending TrustedInstaller initialization.>
2016-09-28 04:30:31, Info Message: <Starting the TrustedInstaller main loop.>
2016-09-28 04:30:31, Info Message: <TrustedInstaller service starts successfully.>
2016-09-28 04:30:31, Info Message: <Initializing online with Windows opt-in: False.>
This question was asked a year ago and I was wondering if Splunk has any updates on it. The way the XML works right now, using `$job.sid$` in a post-process search gets the SID of the base search rather than that of the post-processed search. Still looking for solutions to this. Original thread: https://community.splunk.com/t5/Dashboards-Visualizations/Is-it-possible-to-loadjob-a-post-processed-search/td-p/442161
I've been working on remediating the vulnerability described at https://www.splunk.com/view/SP-CAAAP3M, "Potential Local Privilege Escalation through instructions to run Splunk as non-root user (SPL-144192)", and the 'fixes' don't seem to work. No matter the tactic (boot-start, init.d, systemd, etc.), the requirements for the vulnerability always seem to be met: 1, 2, and 3a and/or 3b. Is there a way to configure Splunk to start up and run that mitigates the vulnerability?
Let's say I have a search that brings up X, Y, and Z and displays them in a panel. X, Y, and Z contain events with an IP address, phone number, and name. In another panel on the dashboard, I then want to list all of the attributes of X, Y, and Z in the same order as the first panel.
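One hedged way to do this, assuming the second panel can re-run the first panel's search as its starting point and that the entities are identified by a field called name (every field name below is invented): number the rows in the order the first panel produced them, join the detail attributes on, and sort by that row number.

<first panel search>
| streamstats count AS display_order
| join type=left name
    [ search index=main sourcetype=entity_details
    | stats latest(ip_address) AS ip_address latest(phone_number) AS phone_number BY name ]
| sort display_order
| table name ip_address phone_number

In Simple XML the same effect can often be achieved more cheaply with a base search plus post-process searches, so the ordering search only runs once for both panels.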
Hi team, the issue that I am currently experiencing is that WinEventLog data is not being sent to the main index. I am new to Splunk and so far have not been able to figure out the reason. Thoughts?
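Two checks that often narrow this down (sketches only; replace the placeholder host with the forwarder's hostname). First, see whether the events landed in a different index or under a different sourcetype than expected:

| tstats count WHERE index=* (sourcetype=WinEventLog* OR sourcetype=XmlWinEventLog*) BY index sourcetype

Second, look for errors from that forwarder in Splunk's internal logs:

index=_internal source=*splunkd.log* host=<forwarder_host> log_level=ERROR

If the first search shows counts in an unexpected index, the index setting in the forwarder's inputs.conf is the usual suspect; if neither search returns anything for the forwarder, it may not be connecting to the indexer at all.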
I am looking to get the total number of items in an array; in the example below, the data collector would display the number 9. Does anyone know a good getter chain for this? So far everything I have tried hasn't worked. Thanks

[<QueryResultSet>
<Metadata>
<Column name="cvcId" type="java.lang.Long"/>
<Column name="owner" type="java.lang.String"/>
<Column name="groupName" type="java.lang.String"/>
<Column name="opsName" type="java.lang.String"/>
<Column name="cvcType" type="java.lang.String"/>
<Column name="claimSystemId" type="java.lang.Long"/>
<Column name="hccClaimNumber" type="java.lang.String"/>
<Column name="receiptDate" type="java.sql.Date"/>
<Column name="claimType" type="java.lang.String"/>
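Not an answer to the getter-chain question itself, but if this XML ends up indexed in Splunk and the goal is simply to count the Column entries there, spath plus mvcount is one option; a sketch assuming the whole QueryResultSet document arrives as a single event (the index and sourcetype are placeholders):

index=<your_index> sourcetype=<your_sourcetype>
| spath path=QueryResultSet.Metadata.Column{@name} output=column_names
| eval column_count=mvcount(column_names)
| table column_count

For the sample above, column_names becomes a multivalue field with nine entries, so column_count is 9.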
What is the best practice for collecting events in which a user performs a query against the Cloudera / Hadoop ecosystem database, and also the database logon events? Basically, I need to know which user logged into the Hadoop database and which queries were performed. Which of these two applications, "Hadoop Monitoring" or "Splunk Hadoop Connect", meets these expectations? Would there also be the possibility of using the "Splunk DB Connect" app?
Splunkers, I sure hope this is just user error and I am being myopic today! I have a simple macro, collectevents(2), with args = index_parm, testmode_parm and this definition:

| addinfo | collect index=$index_parm$ testmode=$testmode_parm$ source=mysource

These both work:

`collectevents("Indexname",0)`
`collectevents("Indexname","False")`

But this doesn't work:

stuff....
| eval index_parm="Indexname"
| eval testmode_parm=0
`collectevents(index_parm,testmode_parm)`

Whenever I pass a variable I get:

Error in 'SearchProcessor': Invalid option value. Expecting a 'boolean' for option 'testmode'. Instead got 'testmode_parm'.

It only complains about testmode, but it's not passing the index_parm string correctly either. If I don't pass testmode_parm (or remove it), I stop getting an error, but nothing shows up in the index; Ctrl+Shift+E shows index=index_parm. It's like the substitution is just not taking place. Any ideas? Thank you.
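What's described is expected behavior rather than user error: macro arguments are substituted as literal text when the search string is parsed, before any events are read, so a field name like index_parm is inserted verbatim instead of being replaced with its per-event value. If the index name and test mode really must come from fields, one hedged workaround is map, which does substitute field values from the outer results into $...$ tokens of an inner search. The sketch below uses placeholder searches; note the caveat that map replaces the outer results with whatever the inner search returns, so the events to collect must be reproducible inside that inner search:

stuff....
| eval index_parm="Indexname", testmode_parm="false"
| map maxsearches=1 search="search <events to collect> | addinfo | collect index=$index_parm$ testmode=$testmode_parm$ source=mysource"

If the parameters are actually known when the search is written, the simplest path remains what already works: call the macro with literal values, as in the two working examples above.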
We are trying to ingest Kemp LoadMaster WAF/ModSecurity logs into Splunk Cloud via the HTTP Event Collector (HEC). I've already done the setup on the Splunk side and, as a test, I used curl statements successfully. However, in the Kemp LoadMaster WAF settings my options are: Logging Format <JSON>; Enable Remote Logging <checked>; Remote URI <https://mysplunkHF.com:8088/services/collector/event>; Username <splunk>; Password <HEC token>; and, as a final step, Set Remote Parameters to save the config. At this point I should be seeing events in Splunk Cloud, but I'm not. Has anyone tried to get Kemp LoadMaster WAF logs into Splunk Cloud via HEC?
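One way to see, on the Splunk side, whether the LoadMaster's requests are reaching the HEC endpoint at all and being rejected (rather than never arriving) is to look at the HEC handler's internal logging; a sketch, assuming the _internal index of the instance hosting the HEC endpoint is searchable:

index=_internal sourcetype=splunkd component=HttpInputDataHandler
| table _time log_level _raw

No events here at all usually points at a network or TLS problem between the LoadMaster and the endpoint (for example, a certificate the LoadMaster does not trust), while 400/403-style messages point at the token or the payload format.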
My query looks like this:

index=* sourcetype="MYSOURCE"
| table company_id
| dedup company_id
| where company_id != "-"
| lookup companyid_companyname_usercount.csv CompanyId as company_id OUTPUT Name
| table company_id Name
| sort Name

The first part of the query is:

index=* sourcetype="MYSOURCE"
| table company_id
| dedup company_id
| where company_id != "-"

The second part is:

| lookup companyid_companyname.csv CompanyId as company_id OUTPUT Name
| table company_id Name
| sort Name

Requirement: I want to perform the second part of my query only if 'companyid_companyname.csv' exists; otherwise, just return the list of company IDs. I have tried using if in eval, but I am not sure how the "else" part of the query can work. Any thoughts and help are highly appreciated.
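There is no inline SPL construct that skips part of a pipeline depending on whether a file exists; eval's if() only operates on field values. Two hedged alternatives, both sketches using the file name from the question (the owner/app wildcards in the REST path may need adjusting). First, whether the lookup file exists can be checked up front via the lookup-table-files REST endpoint:

| rest /servicesNS/-/-/data/lookup-table-files splunk_server=local
| search title="companyid_companyname.csv"
| stats count AS lookup_exists

Second, if the CSV can be guaranteed to exist, even as a header-only file with just CompanyId,Name, the lookup never fails and missing names can be back-filled so the search degrades to the plain ID list:

index=* sourcetype="MYSOURCE"
| dedup company_id
| where company_id != "-"
| lookup companyid_companyname.csv CompanyId AS company_id OUTPUT Name
| eval Name=coalesce(Name, company_id)
| table company_id Name
| sort Name

The first search returns 1 or 0 and could drive a dashboard token that switches between two panels; the second simply shows the company_id in the Name column whenever the lookup has no match.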