Activity Feed
- Got Karma for Re: Why is KV store initialization failing after fassert() error in mongod.log?. 12-13-2024 05:48 AM
- Posted After upgrade, emails are not being sent on triggered alerts. on Alerting. 02-10-2022 09:54 AM
- Posted How is indexing daily amount configured? on Splunk Enterprise. 01-26-2022 05:59 AM
- Got Karma for Re: How do I set up inputs.conf to allow for a cloud application to send syslog over a SSL connection?. 02-04-2021 04:27 PM
- Posted Syslog Data is not being parsed correctly by Fortinet Fortigate App on All Apps and Add-ons. 08-19-2020 10:50 AM
- Got Karma for Re: How do I set up inputs.conf to allow for a cloud application to send syslog over a SSL connection?. 06-05-2020 12:50 AM
- Got Karma for How do I stop getting duplicate entries of JSON data from an API feed?. 06-05-2020 12:50 AM
- Got Karma for Re: Is Mongodb vulnerable to outside manipulation?. 06-05-2020 12:48 AM
- Got Karma for Re: Splunk Enterprise - How do I move indexed data from a cluster of indexers to a single instance?. 06-05-2020 12:48 AM
- Got Karma for Re: Trying to extract the value of a field which occurs twice in one event. Regex maybe?. 06-05-2020 12:48 AM
- Got Karma for Why am I getting "Invalid key in stanza [lookup:cam_category_lookup] in E:\Splunk\etc\apps\Splunk_SA_CIM\default\managed_configurations.conf, line 34: expose (value: 1)". 06-05-2020 12:48 AM
- Got Karma for How do I add a time range to a datamodel search that cannot use tstats?. 06-05-2020 12:48 AM
- Got Karma for How do I match an IP address to a range that spans multiple CIDRs?. 06-05-2020 12:48 AM
- Got Karma for Re: Why is KV store initialization failing after fassert() error in mongod.log?. 06-05-2020 12:48 AM
- Got Karma for Re: Why am I now getting "SSL configuration issue: invalid CA public key file" from Splunk Supporting Add-on for Active Directory after upgrading ?. 06-05-2020 12:48 AM
- Got Karma for Why am I now getting "SSL configuration issue: invalid CA public key file" from Splunk Supporting Add-on for Active Directory after upgrading ?. 06-05-2020 12:48 AM
- Got Karma for Why am I now getting "SSL configuration issue: invalid CA public key file" from Splunk Supporting Add-on for Active Directory after upgrading ?. 06-05-2020 12:48 AM
- Got Karma for What is the difference between splunk_managment_console and splunk_monitoring_console?. 06-05-2020 12:48 AM
- Got Karma for Why is splunkd.log not getting indexed? Receiving error "The file 'E:\Splunk\var\log\splunk\splunkd.log' is invalid. Reason: binary". 06-05-2020 12:48 AM
- Got Karma for Re: Why is KV store initialization failing after fassert() error in mongod.log?. 06-05-2020 12:48 AM
Topics I've Started
02-10-2022
09:54 AM
After upgrading Splunk Enterprise to 8.2.4, several triggered alerts that use tokens are no longer sending emails. Looking at splunkd.log, there is a warning message concerning the alert:

02-10-2022 10:02:28.244 -0600 WARN Pathname [15448 AlertNotifierWorker-0] - Pathname 'E:\Splunk\bin\Python3.exe E:\Splunk\etc\apps\search\bin\sendemail.py "results_link= "ssname=Password Reset Reminder" "graceful=True" "trigger_time=1644508948" results_file="E:\Splunk\var\run\splunk\dispatch\scheduler__srunyonadm__search__RMD5c5f30383081059ef_at_1644508800_24883\results.csv.gz" "is_stream_malert=False"' larger than MAX_PATH, callers: call_sites=[0xd4d290, 0xd4f001, 0x15d1632, 0x15ce217, 0x1439f53, 0x13c8176, 0x71f406, 0x71ea9e, 0x71e899, 0x6eaeeb, 0x70c3c5]

The "larger than MAX_PATH" message concerns me because the Splunk documentation states: "The Windows API has a path limitation of MAX_PATH, which Microsoft defines as 260 characters including the drive letter, colon, backslash, 256 characters for the path, and a null terminating character. Windows cannot address a file path that is longer than this, and if Splunk software creates a file with a path length that is longer than MAX_PATH, it cannot retrieve the file later. There is no way to change this configuration."

What can be done to get this working again?
Regards,
Scott Runyon
Labels:
- Windows
01-26-2022
05:59 AM
I upgraded Splunk Enterprise from 8.0.6 to 8.1.8. I am now getting messages about going over the indexing limit (45 days over the limit are allowed within a 60-day window). Looking at the indexing, the largest amounts come from internal Splunk sources. I have a single instance.

The first three indexes are internal Splunk indexes. The largest source is the Splunk metrics log, and the sourcetypes splunk_metrics_log and splunkd make up a major portion of the indexed data.

My question is: why are internal Splunk processes counting toward my indexing?
Regards,
Scott Runyon
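One way to confirm what is actually being charged against the license is to query the license usage log directly. This is a sketch, not part of the original post; the `b` (bytes) and `idx` (index) fields are the standard field names in `license_usage.log` type=Usage events, and the time range is assumed to be set in the UI:

```
index=_internal source=*license_usage.log type=Usage
| stats sum(b) AS bytes BY idx
| eval GB = round(bytes / 1024 / 1024 / 1024, 2)
| sort - GB
```

As far as I know, the internal `_*` indexes do not normally count against license volume, so comparing this report against the messages is a reasonable first check.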
Labels:
- administration
- metrics
08-19-2020
10:50 AM
Syslog data from my Fortinet firewall is not being parsed correctly. I have noticed that there are multiple message formats.

This first format parses correctly:
... policyid=474 sessionid=3929476361 user="FRED" group="RegularSupport.Grp" srcip=10.120.2.26 ...

These do not (specifically, the user field is not populated):
.... policyid=441 sessionid=3929476369 user="BARNEY" srcip=10.120.36.105 .... (missing group after the user field)
..... policyid=471 sessionid=3929476336 user="BETTY" group="TL-AVP-SVP.Grp" srcip=10.120.2.128 .... (has "-" in the group text)
.... policyid=103 sessionid=3929476142 user="WILMA" group="Wkstns_SSLVPN_PD.Grp" srcip=172.24.1.18 ...... (has "_" in the group text)

I tried an extract-fields on one of the differing events; that fixed the new event, but the original event no longer worked. I assume there is a regex somewhere that parses this out, but I cannot find it. My question is: where do I find it so I can hopefully write one that works?
Scott
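For reference, one way to handle a field that is sometimes absent is a single search-time extraction whose group portion is optional. This is only a sketch: the stanza name `fgt_log` is an assumption (use whatever sourcetype the events actually carry), and a real Fortinet add-on may already ship competing extractions that would need to be overridden:

```
# props.conf (sketch; sourcetype name is assumed)
[fgt_log]
# group="..." is wrapped in an optional non-capturing group, and the
# [^"]+ class tolerates ".", "-", and "_" inside the quoted value.
EXTRACT-user_group = user="(?<user>[^"]+)"(?:\s+group="(?<group>[^"]+)")?\s+srcip=
```

With this shape, events missing the group field still populate `user`, because the `(?:...)?` part simply matches nothing.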
Labels:
- troubleshooting
02-04-2020
05:49 AM
Giuseppe,
I ended up adding a separate transforms.conf entry for each dstip. After testing, it appears to be working.
Regards,
Scott
02-03-2020
06:36 AM
Giuseppe,
Thanks for the response. I want to dump the events where action="deny" and srcip=10.12.55.55 are always present and there can be multiple dstip entries. Based on your answer, will the regex in transforms.conf look like this?
(?:action=\"deny\") (?:srcip=10.12.55.55) (?:dstip=192.168.10.)|(?:dstip=172.16.10.)|(?:dstip=172.18.10.)
Regards,
Scott
01-31-2020
12:32 PM
I am receiving Syslog data from the firewall and I would like to send a subset of it to the nullQueue.
The issue I am having is that I have two set values (action and srcip) but 6 values for the destination.
The formats of the fields of the raw data are -
action="deny" srcip=10.12.55.55 dstip=192.168.10.0
The regex I have come up with is:
(?:srcip\=10\.12\.55\.55)|(?:dstip\=192\.168\.10\.*)|(?:dstip\=172\.16\.10\.*)|(?:dstip\=172\.18\.10\.*)|(?:action\=\"deny\")
What happens is that the "|" acts as an "or" so I will be dumping too much information.
My question is what is the format to place in transforms.conf to filter out the events to be dropped?
Regards,
Scott
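For reference, the usual pattern for dropping a subset of events is a props.conf/transforms.conf pair in which the transform's REGEX describes a whole droppable event in one expression, so alternation applies only to the part that varies (the dstip) rather than OR-ing the conditions together. A sketch only: the sourcetype name `fortinet` is an assumption, and the regex assumes the fields appear in the order shown in the raw sample (action, then srcip, then dstip):

```
# props.conf (sketch; sourcetype name is assumed)
[fortinet]
TRANSFORMS-null = drop_denied

# transforms.conf
[drop_denied]
# Matches only events where all three conditions hold; alternation is
# confined to the dstip prefix.
REGEX = action="deny".*srcip=10\.12\.55\.55.*dstip=(?:192\.168\.10\.|172\.16\.10\.|172\.18\.10\.)
DEST_KEY = queue
FORMAT = nullQueue
```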
01-17-2020
08:53 AM
peterm30, I would like to help you out but we have moved away from DUO and no longer use the DUO app in Splunk.
12-10-2019
08:26 AM
My data is from a command system that is being sent over UDP connection direct to the indexer. It sends data to Splunk every hour.
The data format is: Month Day Time (sent from the command system), system name, 1, Logon Time.
Examples
Dec 10 00:51:46 system.network.net 1 2019-12-10T00:51:19.188-06:00
Dec 9 10:58:25 system.network.net 1 2019-12-09T10:58:23.793-06:00
Dec 9 22:38:38 system.network.net 1 2019-12-06T08:05:23.745-06:00
I want to use Logon Time as the event time not the time it was received.
This used to work until I upgraded to 7.3.3 from 7.3.0 to fix the Y2K20 issue.
My props.conf contains:

```
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3Q%:z
TIME_PREFIX = \w+\s{1,}\d{1,}\s\d{2}:\d{2}:\d{2}\s\S+\s\d\s
category = Custom
disabled = false
MAX_TIMESTAMP_LOOKAHEAD = 256
TIMESTAMP_FIELDS =
```
Any suggestions on what happened?
Scott
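For anyone reading along, here is how the TIME_PREFIX and TIME_FORMAT above are meant to map onto the first sample event. This is an annotation I have added, not a config change:

```
# Sample:  Dec 10 00:51:46 system.network.net 1 2019-12-10T00:51:19.188-06:00
#
# TIME_PREFIX pieces:
#   \w+\s{1,}            -> "Dec "                 (month)
#   \d{1,}\s             -> "10 "                  (day)
#   \d{2}:\d{2}:\d{2}\s  -> "00:51:46 "            (received time)
#   \S+\s                -> "system.network.net "  (host)
#   \d\s                 -> "1 "                   (constant field)
#
# TIME_FORMAT %Y-%m-%dT%H:%M:%S.%3Q%:z then parses the Logon Time:
#   2019-12-10T00:51:19.188-06:00
```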
11-19-2019
11:29 AM
I did find the app for 7.3.0 and now have it installed. I am now getting the API config screen.
ivanreis, if you would post your response as an answer, I would like to vote for it as the answer.
11-18-2019
12:42 PM
Looking at the app configs, I noticed that /default/package.conf contains this stanza:

```
[splunk]
version = 8.0.0
```

I am running 7.3.0. If I override this parameter, will it help?
11-15-2019
06:55 AM
After migrating from OSSEC to Wazuh, I installed the Wazuh app ver. 3.10.2. When starting the app, the API screen comes up with the message "Kv Store is being initialized please wait some seconds and try again later." It has been a few days and the KV store is still not there.
What do I need to do to get the KV store initialized?
System details - single instance running Splunk Enterprise 7.3.0
Regards,
Scott
11-14-2019
08:13 AM
2 Karma
To solve the issue, in inputs.conf I removed the [tcp://6514] stanza and added:

```
[tcp-ssl://6514]
connection_host = dns
sourcetype = syslog
index = avprogram

[SSL]
rootCA = E:\Splunk\etc\auth\cacert.pem
serverCert = E:\Splunk\etc\auth\server.pem
password = *******************
```

Note: I had to use the full paths because this is a Windows system.
11-14-2019
07:03 AM
To get this to work, I had to remove the existing TCP input and add both the tcp-ssl and ssl sections.
10-11-2019
12:22 PM
Our anti-virus application is located in the "cloud" and is sending syslog data to the indexer over TCP port 6514. The application has the ability to use SSL to encrypt this data. Looking at previous answers, it looks like I should add [tcp-ssl://6514] to \etc\system\local\inputs.conf. After modifying the config and changing the remote end to use SSL, I get gibberish like this -
\x00\x00\x00\x00\x00\x00
index = avprogram source = tcp:6514 sourcetype = syslog
When I remove the SSL requirement from the remote end, the data shows up correctly. It looks to me like I am missing a setting to decrypt the incoming data.
Any suggestions on what I need to do?
08-14-2019
11:26 AM
As nothing is this easy in Splunk, I added the line with my fingers crossed. Restarted Splunk and did a test.
Each field is now showing the correct number of values.
Thank you for the help.
08-14-2019
07:59 AM
David, how would I unset INDEXED_EXTRACTIONS? According to the props.conf doc, there is no "none" value, only CSV|TSV|PSV|W3C|JSON|HEC.
The props.conf in apps\duo_splunkapp\default is configured:

```
[source::duo]
INDEXED_EXTRACTIONS = json
KV_MODE = none
TIME_PREFIX = timestamp
TIME_FORMAT = %s
```

The props.conf in apps\duo_splunkapp\local is configured:

```
[source::duo]
AUTO_KV_JSON = false
```
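One approach that is sometimes suggested for this situation, sketched here with no guarantee it applies to this version, is to override the default with an explicit empty value in the app's local props.conf, since settings in local take precedence over default:

```
# apps\duo_splunkapp\local\props.conf (sketch; behavior of a blank
# value may vary by Splunk version)
[source::duo]
INDEXED_EXTRACTIONS =
```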
08-14-2019
07:13 AM
I have a single instance.
08-14-2019
06:30 AM
1 Karma
I installed the Duo Security App, which uses the API to download events in JSON format. The data is collected, and when I perform a search the events look correct, but each field's values are doubled. Looking through Answers, several responses suggest modifying props.conf on the UF (adding KV_MODE = none) to prevent both index-time and search-time extractions. Since the data is not being forwarded, which props.conf do I need to modify? I put KV_MODE = none in props.conf under apps\duo_splunkapp\local, but there was no change.
Regards,
Scott
06-28-2019
12:31 PM
I have a lookup file that contains two columns, ip and mac. I want to update this file daily by running a query that catches when either a new device is added or an existing device is moved. My query is:

```
index=syslog logdesc="neighbor table change" vendor_action="add"
| regex srcip = "(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| stats latest(srcip) BY mac
| rename "latest(srcip)" AS srcip
| fields mac srcip
| lookup srcip_mac.csv mac OUTPUTNEW srcip
| outputlookup append=true srcip_mac.csv
```
What happens is that I end up with a file that contains the updated data on new lines, but the existing entries are duplicated, so the file is twice the size it needs to be.
Any help will be greatly appreciated.
Regards,
Scott
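A common way to avoid this kind of duplication, sketched here and untested against this data, is to merge the existing lookup into the search results, dedup by mac, and rewrite the file without append, so each mac appears exactly once with its latest srcip:

```
index=syslog logdesc="neighbor table change" vendor_action="add"
| regex srcip="(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| stats latest(srcip) AS srcip BY mac
| inputlookup append=true srcip_mac.csv
| dedup mac
| outputlookup srcip_mac.csv
```

Because the fresh search results precede the appended lookup rows, `dedup mac` keeps the newest srcip for each mac.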
11-21-2018
07:25 AM
I opened a support case, and the outcome is that when a new file is created, it is indexed starting at the beginning of the file. The Splunk UF is working as designed.
10-04-2018
08:07 AM
A case with Splunk has been opened.
10-04-2018
05:52 AM
I ran the commands before and after the csv file was updated. The crc and decimal values from the compute command are the same. The results of the -d command show some differences: the key, fcrc, and flen values are the same, while the scrc, sptr, mdtm, and wrtm values differ. The crc and key values match both times.
10-03-2018
08:24 AM
The file that I looked at only has entries added to the end. I don't see any lines being removed from the file.
I ran the command for the fishbucket only after the file was updated. I will have to check tomorrow morning when the file is updated.
10-03-2018
05:48 AM
Thank you for the troubleshooting tips.
To answer the question about where the files are being appended: new lines are added at the end of the file. I checked this morning and verified it; there are 14 new lines at the end of the file. I ran the btprobe compute command on the file before and after the update, and the crc and decimal values are the same.
The key value in the fishbucket is the same as the crc value of the file.
What else do I need to look at?
Regards,
Scott
10-02-2018
11:27 AM
I have multiple applications that write login information (logon date/time, logoff date/time, userid, etc.) into existing CSV files (one per application). I am monitoring these files, but when they are indexed, the old data is reindexed, so I have multiple events per logon. This causes errors in reporting (I shouldn't have to dedup) and balloons the size of each index (wasting disk space).
My understanding is that when a file being monitored, a beginning and end CRC is generated to fingerprint the file along with a Seek Address.
Documentation states:
"A matching record for the CRC from the file beginning in the database, the content at the Seek Address location matches the stored CRC for that location in the file, and the size of the file is larger than the Seek Address that Splunk Enterprise stored. While Splunk Enterprise has seen the file before, data has been added since it was last read. Splunk Enterprise opens the file, seeks to Seek Address--the end of the file when Splunk Enterprise last finished with it--and starts reading the new data from that point."
I take this to mean that existing events are not added and only new events are indexed. This isn't happening in my case.
I have read the questions concerning "duplicate data," and two settings keep appearing. One is "followTail"; reading the doc for it, I see "WARNING: Use of followTail should be considered an advanced administrative action." and "DO NOT leave followTail enabled in an ongoing fashion." This doesn't look like a good fit for my problem.
The second is "crcSalt". My question on that setting is: if I set it, does it ignore the Seek Address and cause the entire file to be indexed, which is where I am now?
Thank you in advance for any help that can be provided.
Scott
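For context on the settings mentioned above, here is a hedged inputs.conf sketch. The monitor path is purely illustrative, not from the original post. To my understanding of the docs, `crcSalt = <SOURCE>` folds the file path into the initial CRC (so renamed or rotated files look new); it does not by itself discard the Seek Address. When several monitored CSVs share an identical header, the setting usually discussed is `initCrcLength`, which widens the fingerprint window beyond the default 256 bytes:

```
# inputs.conf (sketch; the path below is hypothetical)
[monitor://E:\logs\app_logins\*.csv]
# Widen the initial CRC so files whose first 256 bytes are an identical
# CSV header are still told apart.
initCrcLength = 1024
```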