All Topics

We are using Splunk Light version 8.0.0 but have recently discovered that Splunk seems to stop logging for a few days once a new month starts. I've attached the splunkd.log entries from 00:00.

09-30-2020 00:00:00.027 +0300 INFO LMStackMgr - finished rollover, new lastRolloverTime=1601413200
09-30-2020 00:00:27.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:00:35.446 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:00:48.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:00:59.027 +0300 INFO LMSlaveInfo - Detected that masterTimeFromSlave(Tue Sep 29 23:59:58 2020) < lastRolloverTime(Wed Sep 30 00:00:00 2020), meaning that the master has already rolled over. Ignore slave persisted usage.
09-30-2020 00:00:59.554 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
09-30-2020 00:01:09.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:01:30.030 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:01:35.480 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:01:51.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:01:59.299 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
09-30-2020 00:02:12.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:02:33.031 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:02:35.492 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:02:35.504 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:02:54.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:03:15.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:03:35.504 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:03:36.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:03:57.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:03:58.786 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
09-30-2020 00:04:19.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:04:35.516 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:04:40.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:05:02.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:05:23.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:05:35.528 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:05:44.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:06:05.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:06:27.026 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:06:28.314 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
09-30-2020 00:06:35.540 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:06:48.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:07:18.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:07:28.094 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
09-30-2020 00:07:35.551 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:07:40.031 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:08:02.028 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:08:24.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:08:35.563 +0300 INFO HttpPubSubConnection - Running phone uri=/services/broker/phonehome/connection_10.214.120.71_8089_10.214.120.71_radian01_CD4B2B7B-B69D-4097-B528-0F7B136F6DF1
09-30-2020 00:08:45.033 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:09:07.027 +0300 INFO LMStackMgr - license_warnings_update_interval=auto has reached the minimum threshold 10. Will not reduce license_warnings_update_interval beyond this value
09-30-2020 00:09:27.789 +0300 WARN TcpOutputProc - Cooked connection to ip=10.210.50.94:9997 timed out
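For anyone triaging something similar: the quickest way to confirm where splunkd.log actually goes quiet is to parse the leading timestamps and flag silent stretches. A minimal Python sketch (the 5-minute threshold is an arbitrary assumption; adjust to taste):

```python
from datetime import datetime, timedelta

# Timestamp layout at the start of each splunkd.log line,
# e.g. "09-30-2020 00:00:59.554" (the +0300 offset is ignored for simplicity).
TS_FORMAT = "%m-%d-%Y %H:%M:%S.%f"

def find_gaps(lines, min_gap=timedelta(minutes=5)):
    """Return (previous, current) timestamp pairs separated by more than min_gap."""
    gaps, prev = [], None
    for line in lines:
        parts = line.split()
        if len(parts) < 2:
            continue
        try:
            ts = datetime.strptime(parts[0] + " " + parts[1], TS_FORMAT)
        except ValueError:
            continue  # not a timestamped log line
        if prev is not None and ts - prev > min_gap:
            gaps.append((prev, ts))
        prev = ts
    return gaps

sample = [
    "09-30-2020 00:00:59.554 +0300 WARN TcpOutputProc - Cooked connection timed out",
    "09-30-2020 00:01:09.028 +0300 INFO LMStackMgr - ...",
    "09-30-2020 06:00:00.000 +0300 INFO LMStackMgr - ...",  # roughly 6 hours later
]
print(find_gaps(sample))
```

Run against the real file, any gap of days around month rollover will show up immediately.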
I have struggled with getting Splunk to recognize timestamps in events from a UDP input. I have tried for many hours to figure this out, trying out suggestions found on the forum. Hopefully someone here can point me in the right direction. My events come in like this, where Splunk has added an extra timestamp in exactly the same format as the one at the start of the message, so the event message starts with two identical timestamps as seen below (the blacked-out part is the IP address). Clearly Splunk does not recognize this timestamp and puts on its own with the time of indexing. I have tried to remove the ":" at the end of the timestamp with no success, and I have also edited props.conf:

[syslog_multiline]
BREAK_ONLY_BEFORE_DATE =
DATETIME_CONFIG =
LINE_BREAKER =
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Operating System
disabled = false
pulldown_type = 1
TIME_FORMAT = %b %d %H:%M:%S
TIME_PREFIX =
MAX_TIMESTAMP_LOOKAHEAD = 17
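For what it's worth, the duplicated-prefix symptom can be reproduced and stripped outside Splunk, which also shows exactly what TIME_FORMAT = %b %d %H:%M:%S has to match. A small Python sketch (the sample event and the trailing-colon detail are assumptions based on the description above):

```python
import re
from datetime import datetime

# Syslog-style timestamp, e.g. "Oct  2 14:03:11" -- the shape that
# TIME_FORMAT = %b %d %H:%M:%S describes (15 chars, hence a lookahead near 17).
TS_RE = r"[A-Z][a-z]{2}\s+\d{1,2} \d{2}:\d{2}:\d{2}"

def dedupe_timestamp(event):
    """If an event starts with the same timestamp twice, drop the first copy."""
    m = re.match(rf"({TS_RE}):?\s+({TS_RE})", event)
    if m and m.group(1) == m.group(2):
        return event[m.end(1):].lstrip(": ")
    return event

evt = "Oct  2 14:03:11: Oct  2 14:03:11 10.0.0.1 %LINK-3-UPDOWN: Interface up"
clean = dedupe_timestamp(evt)
print(clean)

# The remaining prefix parses cleanly with the same format string:
print(datetime.strptime(clean[:15], "%b %d %H:%M:%S").replace(year=2020))
```

If the regex above matches your real events, the same pattern is a candidate for a SEDCMD or TIME_PREFIX expression in props.conf.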
Dear all, I'm in the process of grouping hosts by location, and I would like it to be based on the hostname. The goal is to limit users so they see only the logs they're supposed to have access to. I managed to add a tag to an event type, and then I discovered it is possible to add metadata to events. Is it possible to segregate access this way too? What is the best practice for this? Thanks in advance.
I'm trying to import a CSV file using the monitor function. The imported CSV file will be updated (overwritten). Since the header line is included in duplicate, I put in the setting HEADER_FIELD_LINE_NUMBER = 1. Initially it works, but when I reboot it doesn't. Does anyone know the cause, or what the issue is?
Hello, I'm new to Splunk and need a short hint concerning the following question: I have some firewall logs in Splunk and would like to search the destination (DST) field for specific servers. I uploaded a Server-2.csv, and "| inputlookup Server-2.csv" shows the content of the file correctly. A manual search like "index=firewall DST=8.8.8.8" works fine. From my point of view, "index=firewall [ | inputlookup Server-2.csv | table DST ]" should search for every entry in the CSV file, but I get no error and no result. There should be a result, because 8.8.8.8 is the first entry in the CSV. Is the table entry the wrong syntax? Sorry if this question is too simple, but I would really appreciate some hints. Thx André
Hello Everyone! I have output in the format below and would like to filter the duplicate ids whose fieldA value is zero, then calculate fieldA in place of the zeros.

Current Output:
id      fieldA
101     54236.0100
101     4526441.012
102     0
103     4479652.2456
103     0
104     476526
105     0
105     0

Filtered Output:
id      fieldA
101     54236.0100
101     4526441.012
102     0
103     4479652.2456
103     0
104     476526
105     0

Required Output:
id      fieldA
101     54236.0100
101     4526441.012
102     fieldB-fieldC
103     4479652.2456
103     fieldB-fieldC
104     476526
105     fieldB-fieldC

I have tried various stats commands but was not able to achieve this. Can you please help me on how to get the required output? Thank you in advance.
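The transformation being asked for (drop duplicate zero rows per id, then substitute fieldB - fieldC for the remaining zeros) can be sketched outside SPL to pin down the logic. A Python illustration; the row values besides the ids are hypothetical, and fieldB/fieldC are assumed to exist on each row:

```python
def fill_zeros(rows):
    """Drop duplicate (id, fieldA) rows, then replace fieldA == 0
    with fieldB - fieldC computed from the same row."""
    seen, out = set(), []
    for r in rows:
        key = (r["id"], r["fieldA"])
        if key in seen:
            continue  # duplicate id with the same zero value -> filtered out
        seen.add(key)
        if r["fieldA"] == 0:
            r = {**r, "fieldA": r["fieldB"] - r["fieldC"]}
        out.append(r)
    return out

rows = [
    {"id": 101, "fieldA": 54236.01, "fieldB": 0.0, "fieldC": 0.0},
    {"id": 102, "fieldA": 0, "fieldB": 500.0, "fieldC": 120.0},
    {"id": 105, "fieldA": 0, "fieldB": 80.0, "fieldC": 30.0},
    {"id": 105, "fieldA": 0, "fieldB": 80.0, "fieldC": 30.0},  # duplicate zero row
]
print(fill_zeros(rows))
```

The SPL equivalent would likely be a dedup on id and fieldA followed by an eval with if(fieldA==0, fieldB-fieldC, fieldA), but that is an untested suggestion.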
@chrisyounger  Another 'feature' I noticed in drilldown is that unless you sort by path, the drilldown path token is set to the first element it finds matching the path node selected, i.e. if the data is

seg1/seg2/seg3
seg1/seg2
seg1

and you click on the seg2 node, the path element is set to seg1/seg2/seg3. If you sort the path, then all works. Might be worth mentioning this in the docs - or is it easy enough to fix?
I installed Splunk on one of my indexes; when I'm connecting to Deploy-poll I'm just getting the Splunk general terms. Why?
I have two Splunk servers and run the following command:

| makeresults
| fields - _time
| collect index=temp addtime=f

In this case the _time field then appears in the indexed data as 2020-10-01 HH:MM:SS, as expected; _time is taken from the info_search_time field, which is UTC. However, if I do this:

| makeresults
| collect index=temp addtime=t

then the raw event data looks like this on BOTH servers: 10/01/2020 HH:MM:SS ..... i.e. MM/DD/YYYY. But on one server the ingested event is dated 2020-10-01, so it is parsing correctly as MM/DD/YYYY, while on the other the event is dated 2020-01-10, which is parsing the data as DD/MM/YYYY, which is not correct. I don't believe the collect command provides control to format the added raw time field as ISO 8601; if it does, how? But can anyone say what controls the parsing format used to ingest this collected data?
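The MM/DD vs DD/MM ambiguity described above is easy to reproduce in isolation: the same raw string yields two different dates depending on the assumed format, which is why an unambiguous rendering such as ISO 8601 sidesteps the problem entirely. A minimal Python illustration:

```python
from datetime import datetime

raw = "10/01/2020 08:30:00"  # ambiguous: October 1st or January 10th?

us = datetime.strptime(raw, "%m/%d/%Y %H:%M:%S")  # month-first parse
eu = datetime.strptime(raw, "%d/%m/%Y %H:%M:%S")  # day-first parse

print(us.date())  # the month-first reading
print(eu.date())  # the day-first reading

# An ISO-8601 rendering is unambiguous regardless of parser locale:
print(us.strftime("%Y-%m-%dT%H:%M:%S"))
```

Whatever the two indexers are doing differently, this is the underlying failure mode: a slash-separated date carries no information about which field is the month.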
I have a query saved to display all hard disk errors for hosts. We generate this data daily from the query and send it to the respective assignee every Wednesday. My requirement is to prepare a visualization of 30 days of data from this scheduled search. Please help me with the query.
Hi, I have a server with Splunk Enterprise installed to monitor a directory containing <sample-filename>.gz files. Each file is of the format below, and I need to create a sourcetype that can:
1. Ignore lines starting with //
2. Map the values in [ ] to a standard header

----------------------------------------------------
[1599249608,75972,"sample@user.ca",638744076,1,861,337,3,"9","http",80,388951746,"http://abc.com",0,"","","","empty","Sample Filtering","","ctldl.windowsupdate.com","GET",21,3,126]
// random info here
// something something random
-------------------------------------------------------

I have tried various strategies but failed. Looking for your help.
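As a cross-check on the parsing rules being asked for (skip // lines, map the bracketed values onto a header), here is how the logic looks outside Splunk. A Python sketch; the HEADER names are purely hypothetical placeholders, since the real column meanings would have to come from whatever produces these files:

```python
import csv

# Hypothetical column names for the 25 bracketed fields.
HEADER = ["epoch", "duration", "user", "session_id", "f1", "f2", "f3", "f4",
          "code", "proto", "port", "bytes", "url", "f5", "s1", "s2", "s3",
          "state", "policy", "s4", "host", "method", "n1", "n2", "n3"]

def parse_records(text):
    """Skip // comment lines and map each [ ... ] record onto HEADER."""
    rows = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("//"):
            continue  # comment or blank line
        if line.startswith("[") and line.endswith("]"):
            # The body is CSV with quoted strings; csv handles embedded commas.
            fields = next(csv.reader([line[1:-1]]))
            rows.append(dict(zip(HEADER, fields)))
    return rows

sample = """\
[1599249608,75972,"sample@user.ca",638744076,1,861,337,3,"9","http",80,388951746,"http://abc.com",0,"","","","empty","Sample Filtering","","ctldl.windowsupdate.com","GET",21,3,126]
// random info here
// something something random
"""
recs = parse_records(sample)
print(recs[0]["user"], recs[0]["host"])
```

Inside Splunk the same intent would map to comment filtering at ingest plus a delimiter-based field extraction, but the sketch above is only meant to make the target behavior concrete.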
I am writing scripts for dashboards and reports in Splunk's search, but I need to make some modifications to their HTML tags. Where can I make those changes?
Hello guys, I'm having issues solving this one. I have a generated datamodel of our network traffic (internal), and I need to filter out the IPs from this traffic flow which are NOT present in this datamodel. I've played with several answers here and tried to combine them, but I'm not getting the right output.

DM name = Network_Traffic
SubDM = All_Traffic
Does anyone know where the last checkpoint value or file for a DB input via DB Connect 1 is stored? Thanks.
Is there a Splunk document that shows how to upgrade Splunk DB Connect 1 to DB Connect 3? Thanks.
Hello, I'm very new to Splunk. I have a task to query an external bug system and display the results in Splunk using a pattern search. I wrote a Splunk external lookup script in Python, following the external_lookup.py script as a guide. I'm able to get the results displayed in Splunk via this SPL:

| stats count
| eval definition="sourcetype=SipAdapterGrp1"
| lookup defectlookup definition

The lookup script queries the bug system for any bugs that contain the search string (definition). The problem is the results show up in one single row. I would have expected them to be in 3 separate rows, as there are 3 bugs. I'm thinking this is something very simple to correct in the output of the script, or somehow modify the SPL command? Any advice, please?
All,

Just trying to get a swag of cost by sourcetype. I wrote this search, but it seems to me there is a more cost-effective way to do this.

index=* sourcetype=* NOT sourcetype=stash NOT index=stash
| fields _raw
| eval bytes=len(_raw)/1024/1024/1024
| eval splunk_cost = 685
| eval gigcost = bytes * splunk_cost
| stats sum(gigcost) as CostPerYear by sourcetype
| sort - CostPerYear
| addcoltotals
I have a Splunk Enterprise Heavy Forwarder which is forwarding SQL audit logs by way of the Splunk DB Connect app. My version of Splunk DB Connect is 3.2.0 (I know this is not the latest version, but I am using 3.2.0 because as soon as we upgrade to 3.4.0, DB Connect breaks completely and won't display anything in the UI). I am having an issue where I am unable to delete an input which is a clone of another input and is no longer required. The input I want to delete is disabled, but the issue happens whether the input is in an enabled or a disabled state.

To delete the input:
1. I log in to my Heavy Forwarder's Splunk Web interface
2. I select the Splunk DB Connect app from the apps list on the left
3. I select the Data Lab tab
4. I select the Inputs option
5. I click the delete option next to the input I want to delete
6. I am prompted with the "Are you sure to delete this input?" message and select OK

At that point I get an error message at the top of the page: "Splunkd error: HTTP 404 -- Action forbidden." The input remains and I am unable to delete it.

I am an admin and also a DB Connect admin, so I have permissions to do whatever I want on this box. I have looked through the splunk_app_db_connect folder in the hope of finding the .conf file where the input configurations are stored so that I could manually remove them there. Unfortunately I am unable to figure out where these Data Lab -> Inputs are being stored. I am hoping that someone here has a deeper understanding of DB Connect and can help me figure out how to delete my old inputs, either by finding the cause of the error so it can be resolved via the GUI, or by finding a way to delete the inputs manually via the file system.
Hello All,

We are upgrading a Splunk Heavy Forwarder from v6.4.0 to v7.3.1.1 and were evaluating the need to upgrade the add-ons installed on the HF to ensure compatibility with v7.3.x. We have narrowed down the add-ons which require an upgrade on the HF, but are unsure whether they also need to be upgraded to the same version on the SH/IDX (in Splunk Cloud) for field extraction to happen properly. Below are the apps. Any guidance is appreciated, thanks.

https://splunkbase.splunk.com/app/1620/
https://splunkbase.splunk.com/app/1915/
https://splunkbase.splunk.com/app/1747/