Is there a way to specify the number of packets to send when using the ping lookup? From what I can see, by default it sends only 1 packet.
Under /opt/splunkforwarder/etc/system/local/server.conf , we have used the env variable $INSTANCE_ID : [general] serverName = $INSTANCE_ID We then verified that we got the right result by using the command ./splunk show servername , which showed the correct id we were looking for. However, when we start Splunk using ./splunk start or ./splunk restart , it reports that it started. Right after that, I verify it is running using ./splunk status . This shows us that the service did not in fact start, and it still says splunkd is not running . Why is this? We verified that if we hard-code the value for serverName in server.conf , the service does in fact start. But for some reason, when using the env variable, something is keeping Splunk from running. Any help on this would be great!
How do I get only the value that is before the ms? Remember that this log is multiline, each statement is an event. Ex: 13657, 5469, 6000 2020-06-02 18:01:04,331 INFO ect-1-1rere872 25000 Execution Info +[Job_ExtractICON].......................................................................13657 ms. Invocations 1 2020-06-02 17:48:40,449 INFO ecp-2-14343527 25000 Execution Info +[Job_ExtractICON].................................................................................5469 ms. Invocations 1 2020-06-02 17:45:27,697 INFO ecj-1-16576 25000 Execution Info +[Job_ExtractICON]...........................................................................6000 ms. Invocations 1
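A rex extraction along these lines might pull out the number that precedes "ms" (the index name and the field name duration_ms are just illustrative):

```
index=your_index "Execution Info"
| rex "\.(?<duration_ms>\d+)\s+ms\."
| table _time duration_ms
```

The pattern anchors on the last dot of the filler run before the number, so only the value immediately preceding "ms." is captured, even though the events are multiline.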
Hi, I would like to run a search that gives me the list of hosts with one of three statuses: Normal, Warning, and Critical. Critical means no logs from a host for 30 minutes; Warning means no logs for at least 15 minutes but less than 30 minutes; otherwise the status is Normal. I am facing 2 problems here: if a server has had no logs for 2 days and I run the search today, that host does not show up at all. The other problem is that when I change my time modifier to the issue time, it does not show the exact result. Below is the query used: index = abc* host=efg* |stats latest(_time) as latest by host |eval Status = case (latest <= relative_time(now(),"-15m") AND latest > relative_time(now(),"-30m"),"Warning", latest <= relative_time(now(),"-30m"),"Critical", true(),"Normal") |eval Recent_Updated_Time = strftime(latest,"%c") Kindly suggest
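One possible fix for hosts that disappear entirely: the metadata command returns a recentTime for every host the index has seen, independent of the search time range, so a sketch like the following (untested, and assuming your Splunk version accepts a wildcarded index= on metadata) may keep silent hosts visible:

```
| metadata type=hosts index=abc*
| search host=efg*
| eval Status = case(recentTime <= relative_time(now(),"-30m"), "Critical",
                     recentTime <= relative_time(now(),"-15m"), "Warning",
                     true(), "Normal")
| eval Recent_Updated_Time = strftime(recentTime, "%c")
| table host Status Recent_Updated_Time
```

Note that case() evaluates conditions in order, so checking the most restrictive condition (Critical) first also removes the need for the AND clause in the Warning test.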
Hello, we have an issue: when we set up the alert action, we fill in all the required fields marked with the red *. We trigger the alert and in the logs we see that it requires the content: components. "fields": { "issuetype": { "name": "Task" }, "priority": { "name": "High" }, "summary": "Splunk Alert: Alert", "project": { "key": "id " }, "description": "The alert condition for ' Alert' was triggered." } }, **HTTP Error=400, content={"errorMessages":[],"errors":{"components":"Components is required."}}**" In custom fields we put a component, triggered the alert, and got a different error: }, \"content\": {\"component\": { \"name\": \"Incidents"} } } }" **action_name="jira_service_desk" search_name="CrowdStrike Detection Alert" signature="Unexpected error: Expecting property name enclosed in double quotes: line 15 column 2 (char 257)."** Can you please advise? Thanks!
For a couple of hours I have been trying to solve this: I have one log where the lines do not share a common set of words, so I cannot get what I need in a single search. These two searches each return what I need: index=ise "authentication failed" "Administrator-Login" index=ise "authentication failed" "UserName" Now I want to join these two queries into one and get the results showing which admin logins and which user logins have authentication failures. Thank you.
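The two searches can usually be combined with an OR and then labeled using searchmatch() (the field name login_type is made up for the example):

```
index=ise "authentication failed" ("Administrator-Login" OR "UserName")
| eval login_type = case(searchmatch("Administrator-Login"), "admin",
                         searchmatch("UserName"), "user")
| stats count by login_type
```

This keeps everything in one search and lets you split the results by which marker string each event contained.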
Hello, I need help with Kafka Connect. I am using Kafka _2.12-1.1.1 and Splunk Connect for Kafka version 1.20. It is in distributed mode, but there is only 1 Kafka Connect node. I am having a problem with managing configurations. When I DELETE a connector and then restart Kafka Connect, the old connectors might be reloaded. And sometimes creating a new connector with curl localhost:8083/connectors -X POST -H… will fail (no error message, but the connector is not added successfully). It seems the old connector configurations are stored somewhere, and Kafka Connect sometimes looks for them when restarted. I am thinking that if we can delete the old configurations permanently before creating new connectors, the problem might be solved.
I have one search that checks for entries with duration >= 50000 (responses for requests) source="abc.log" | regex "\"duration\" : ([5-9][0-9]{4}|[0-9]{6,})" The search returns results with JSON format: ... "duration" : 60026, "correlationId" : "be225a47972b95f5", ... I want to show the connected request for that response in the same result set. So I would like to find results where the correlation id matches. Something like: (source="abc.log" | regex "\"duration\" : ([5-9][0-9]{4}|[0-9]{6,})" | eval correlationId) OR (source="abc.log" | correlationId) The second part of the "OR" statement would return the requests that ended up with 50s+ response time. Could anyone help me with the syntax to achieve this? Thanks!
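Rather than an OR of two piped searches (which is not valid SPL), one approach is to search all the events and use eventstats to copy the slow response's duration onto every event that shares its correlationId (this assumes duration and correlationId are extracted as fields):

```
source="abc.log"
| eventstats max(duration) as max_duration by correlationId
| where max_duration >= 50000
```

The numeric comparison replaces the regex, and the request events survive the filter because the where clause tests the copied max_duration rather than each event's own duration.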
I recreated the dashboard using the report search and have the search returning all of the table results. I have an input for the reference number as a text box. The token name is: purchCostReferenceToken I want to limit the table results based on this token. This is the search: <form> <label>Thru Train Dashboard</label> <fieldset submitButton="false" autoRun="true"> <input type="text" token="purchCostReferenceToken" searchWhenChanged="true"> <label>Enter a TMS Reference Number to Filter Table</label> <default>*</default> <initialValue>*</initialValue> </input> </fieldset> <row> <panel> <title>Thru Train Data</title> <table> <search> <query>index=... "<billingMethod>RULE</billingMethod>" "createMessage MsgSource" | xmlkv | rex max_match=0 "\<purchasedCostTripSegment\>(?P<segment>[^\<]+)" |eval Segments = mvrange(1,mvcount(mvindex(segment, 0, 2))+1,1) | rex max_match=0 "\<carrier\>(?P<Carriers>[^\<]+)" | rex max_match=0 "\<billingMethod\>(?P<BillingMethod>[^\<]+)" | rex max_match=0 "<purchasedCostTripSegment>[\s\S]*?<origin>\s*<ns2:numberCode>(?P<Origin>\d+)" | rex max_match=0 "<purchasedCostTripSegment>[\s\S]*?<destination>\s*<ns2:numberCode>(?P<Destination>\d+)" | rex max_match=0 "<purchasedCostTripSegment>[\s\S]*?<stopOff>\s*<ns2:stopOffLocation>\s*<ns2:numberCode>(?P<StopOffLocation>\d+)" | eval Time =_time | convert timeformat="%m-%d-%Y %H:%M:%S" ctime(Time) | table purchCostReference, eventType, Time, Segments, Carriers, BillingMethod, Origin, Destination, StopOffLocation | sort Time</query> <earliest>-30d@d</earliest> <latest>now</latest> </search> <option name="drilldown">none</option> </table> </panel> </row> </form> Where do I add the token to limit the search? I tried adding this to the end of the search before the table command: ... 
| eval Time =_time | convert timeformat="%m-%d-%Y %H:%M:%S" ctime(Time) purchCostReference=$purchCostReferenceToken$ | table purchCostReference, eventType, Time, Segments, Carriers, BillingMethod, Origin, Destination, StopOffLocation | sort Time I get an error...error in convert command: the argument purchCostReference- is invalid I would like to add filters in several of the table columns. The purchCostReference value is an extracted field in the search using xmlkv
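The token belongs in a search (or where) command, not inside convert, which is why convert rejects it as an invalid argument. A sketch of the tail of the query:

```
| eval Time = _time
| convert timeformat="%m-%d-%Y %H:%M:%S" ctime(Time)
| search purchCostReference=$purchCostReferenceToken$
| table purchCostReference, eventType, Time, Segments, Carriers, BillingMethod, Origin, Destination, StopOffLocation
| sort Time
```

With the default token value of *, the filter passes everything; additional column filters can be stacked the same way with more search clauses or extra tokens.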
Hello, I have two questions that are quite confusing to me, can you please explain this to me in layman terms? Field inclusion happens before field extraction and can improve performance. Field exclusion happens after field extraction, only affecting displayed results.
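A small illustration of the difference (field names are placeholders): an inclusion list tells Splunk which fields you want, so it can avoid extracting the rest, which is where the performance gain comes from; an exclusion only hides fields after they have already been extracted:

```
index=web sourcetype=access_combined
| fields status clientip
```

By contrast, `| fields - useragent` would still extract useragent and merely drop it from the displayed results.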
I have integrated Splunk with Demisto. I am trying to run the below search from Demisto: source="squid" clientip="xxx" | where server_ip IN(${DBotAvgScore.Indicator}) | stats count by server_ip DBotAvgScore.Indicator is an array that contains the below values ["204.79.197.200","13.107.18.254","117.18.237.29","13.107.21.200","104.16.133.229","35.241.8.149","52.88.91.154","104.17.211.204","209.197.3.15"] The search gets replaced with the value of the array and fails to run because of '['. I am stuck here. I would appreciate any help.
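SPL's IN operator expects a parenthesized, comma-separated list, so the JSON array brackets are the problem. If the Demisto side can join the values into a quoted, comma-separated string before substitution (for example with a transformer on the context value), the query Splunk receives should look like:

```
source="squid" clientip="xxx"
| where server_ip IN ("204.79.197.200","13.107.18.254","117.18.237.29","13.107.21.200","104.16.133.229","35.241.8.149","52.88.91.154","104.17.211.204","209.197.3.15")
| stats count by server_ip
```

This is a sketch of the target query only; how the array is flattened is up to the Demisto playbook.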
_time is the column that gets moved from last to first, but only within the report's CSV. In the inline results, the search itself, and a CSV exported directly from the search, the columns keep the correct order. How can I correct this for current and future reports?
Hello Folks, I am trying to set up Splunk App for Windows Infrastructure for easier dashboarding and management; however, despite days of research, I am still unable to solve a problem regarding sourcetype . So far, I have installed Splunk Add-on for Microsoft Windows and I am already receiving various data. To show a snippet of my inputs.conf for Splunk Add-on for Microsoft Windows:

```
###### Host monitoring ######
[WinHostMon://Computer]
interval = 600
disabled = 0
index = hostmonitoring
sourcetype=WinHostMon
type = Computer

[WinHostMon://Process]
interval = 600
disabled = 0
index = hostmonitoring
sourcetype=WinHostMon
type = Process
```

I have a lot more configuration, but the concept should be clear: I followed the initial inputs.conf in default and used only the portions I require. If I search for index=hostmonitoring I get data just fine, but I am unable to get any data when I search sourcetype=WinHostMon . The concept is the same for the other sourcetypes, Perfmon , WinHostMon , WinPrintMon , and WinRegMon ; for some odd reason, ONLY WinEventLogs were "searchable". Researching deeper, it seems that even though I set sourcetype={my_input} , props.conf requires a matching stanza or it will not work anyway. On the other hand, I have seen people say that some app authors do not allow customization of sourcetype . I am truly puzzled by this, and I have seen only a few similar queries online, but a proper solution was never shared. https://answers.splunk.com/answers/583743/how-to-enable-sourcetypewinregistry-for-windows-in.html I am truly struggling with this and I hope someone can help me out! Thank you very much for taking the time to read this long message!
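Before editing props.conf, it may be worth checking which sourcetype the events actually arrived with, since an add-on's props/transforms can rename a sourcetype at index time. A quick check over the index that is known to have data:

```
index=hostmonitoring
| stats count by sourcetype, source
```

Whatever sourcetype values this returns are the ones that will be searchable, regardless of what inputs.conf requested.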
A requirement for one of our support teams is to be able to export a PDF of a dashboard using the Schedule PDF Delivery option within Splunk. As an admin, I have temporarily added the list_settings capability to the team's role so that they can perform this action. I am hesitant to grant the team this capability long-term, as I cannot find any documentation that fully explains what the list_settings capability does. Please can you explain the full list of abilities a user is granted with this capability (a better explanation than on the documentation page https://docs.splunk.com/Documentation/Splunk/8.0.4/Security/Rolesandcapabilities). Thank you
I am trying to re-format the x-axis time to read cleaner. Here is my spl: index="servers" source="/var/log/secure" action=failure | timechart count | eval time=_time |table time count | fieldformat time=strftime(time, "%Y%m%d%H%M") How can I get it in a format like %Y-%m-%d %H:%M ?
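Swapping the fieldformat for an eval with the desired strftime pattern is one way (untested sketch):

```
index="servers" source="/var/log/secure" action=failure
| timechart count
| eval time = strftime(_time, "%Y-%m-%d %H:%M")
| table time count
```

Note that this affects tabular output; a chart visualization renders its x-axis labels from _time itself.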
I want to define two separate BTs: * BT-1:  URI ends with:  hendelser/feed  * BT-2:  URI matches regex: hendelser\/[a-f0-9]+   (match for a hexadecimal GUID) My problem is that the regex for BT-2 also consumes the URI for BT-1, which I definitely do not want, meaning that I cannot find a way to define a regex that excludes the string "feed" when matching a hexadecimal GUID. I know some engines support regex conditionals, which AppDynamics apparently does not. After spending too much time on this, I wonder if some of you are more experienced than I am and could guide me on this matter?
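Since "feed" happens to consist entirely of hex characters (f, e, e, d), it legitimately matches [a-f0-9]+, which is why BT-2 consumes it. A negative lookahead, rather than a conditional, may be enough, assuming AppDynamics' regex engine supports lookaheads (most Java-based engines do):

```
hendelser\/(?!feed$)[a-f0-9]+$
```

The $ anchors assume the URI ends right after the segment; drop or adjust them if a trailing slash or query string can follow.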
Hello There, I have recently started using the DB Connect App and I want to create a Batch Input with a frequency of every 120 minutes. I have set this up and it is now loading data into the index every 120 minutes, but is there any way to keep only the latest data in the index and remove the old data added by previous runs of this batch? So I want to overwrite the data in the index after every batch run rather than append to it. Any help would be greatly appreciated. Thank you. Madhav
Hi Team, I am new to Splunk and trying to explore its features and capabilities. I just set up my Splunk Enterprise instance and want to install ITSI on it. Could anyone tell me how to do that? I tried installing it from Splunkbase, but it says to contact sales for this application. I tried reaching them but got no response. Is there any other way to set up ITSI, or can only the sales team do that? I also found some modules related to ITSI on Splunkbase, like the ITSI Module for APM. Please guide me through that as well. Thank you in advance.
I want to show the percentage of data for a certain month and, alongside it, the figure for the previous month, in a single value visualization, so that we can compare the overall percentage for the current month to the previous month in a single value. Can anyone give me the query? For example, the single value visualization would look like this: 99.9996% -- current month percent data 99.666% -- previous month percent data
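One common pattern is appendcols with explicit time ranges. A rough sketch, where the index, the success condition, and the rounding are placeholders for whatever actually defines the "percent data":

```
index=your_index earliest=@mon latest=now
| stats count(eval(status="success")) as good, count as total
| eval current_pct = round(good/total*100, 4)
| appendcols
    [ search index=your_index earliest=-1mon@mon latest=@mon
      | stats count(eval(status="success")) as good, count as total
      | eval previous_pct = round(good/total*100, 3) ]
| table current_pct, previous_pct
```

Each resulting field can then drive its own single value panel, or both can be shown together depending on the visualization options used.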
After updating to ES App version 5.3.1, the extreme search commands no longer exist. An error message is shown that the command is not found. e.g. Search: Access - Authentication Failures By Source - Context Gen Unknown search command 'xsupdateddcontext'.