Activity Feed
- Posted Re: Windows Perfmon data not collecting on Getting Data In. 07-10-2024 09:20 AM
- Posted Re: Windows Perfmon data not collecting on Getting Data In. 07-10-2024 09:02 AM
- Posted Re: Windows Perfmon data not collecting on Getting Data In. 07-10-2024 09:00 AM
- Posted Windows Perfmon data not collecting on Getting Data In. 07-10-2024 01:41 AM
- Posted Add M2Crypto, Pycrypto to Splunk 9.0.1 to run incapsula logs downloader? on Splunk Dev. 01-25-2023 02:53 AM
- Posted Event Log Subscription Server: Splunk could not get the description for this event. Either the component that raises thi on Getting Data In. 10-25-2022 02:30 AM
- Posted Re: How to configure VMWare vCenter logs via syslog to get into splunk? on All Apps and Add-ons. 05-26-2022 02:05 AM
- Posted How to configure VMWare vCenter logs via syslog to get into splunk? on All Apps and Add-ons. 05-23-2022 02:31 PM
- Karma Re: Single panel with multiple values for cmerriman. 06-05-2020 12:49 AM
- Got Karma for Forcepoint Proxy syslogs - Parsing questions. 06-05-2020 12:48 AM
- Got Karma for Re: Forcepoint Proxy syslogs - Parsing questions. 06-05-2020 12:48 AM
- Karma Re: CSV delimited with double quotes and seperated by comma. Header in every field for ogdin. 06-05-2020 12:46 AM
- Posted Re: Splunk Add-on for RSA SecurID App: sourcetype settings on All Apps and Add-ons. 03-27-2018 05:19 AM
- Posted Take only selected parts of a Windows Event Log on Getting Data In. 11-09-2017 07:18 AM
- Tagged Take only selected parts of a Windows Event Log on Getting Data In. 11-09-2017 07:18 AM
- Posted Re: How do I filter Active Directory events based on complex multi-line searches transforms.conf on Getting Data In. 10-25-2017 03:01 AM
- Posted How do I filter Active Directory events based on complex multi-line searches transforms.conf on Getting Data In. 10-24-2017 08:16 AM
- Tagged How do I filter Active Directory events based on complex multi-line searches transforms.conf on Getting Data In. 10-24-2017 08:16 AM
07-10-2024
09:20 AM
Yes - it's only perfmon data we're not getting; Splunk internals and event log events are both OK. AFAIK (and as intended) these are not being collected as metrics. I'd been through the article you referenced, and have now been back and checked my workings. We've not installed the Windows add-on at every layer yet - I've just used bits of its inputs.conf initially to get the data in to look at, and will go back to all the clever bits once the basics are working.
07-10-2024
09:02 AM
We apply a range of GPO settings to get us close to CIS Level One hardening. This usually includes the Windows Firewall, but it's set to off where it needs to be, and it's off here.
07-10-2024
09:00 AM
Thanks for the thoughts - I've re-checked both: the inputs are all good and show up in the btool output, and all other logs and events are getting through fine.
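For reference, a minimal sketch of that kind of check (the install path below is the default Windows UF location, so adjust as needed):
# run on the forwarder to list the effective perfmon inputs and the files they come from
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool inputs list perfmon --debug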
07-10-2024
01:41 AM
Splunk is failing to collect perfmon data from our Windows 2022 servers. I've extracted and deployed the stanzas from the Splunk TA for Windows to collect selected perfmon stats from servers, and we use a deployment server to push this out. Here's a sample:
[perfmon://CPU]
counters = % Processor Time
disabled = 0
instances = *
interval = 10
mode = single
object = Processor
useEnglishOnly = true
index = 2_###_test
The Splunk Universal Forwarder now restarts as expected on deployment (missed that first time 😉). There are no apparent errors in splunkd.log, yet nothing turns up! Metrics confirm nothing is being sent to that index from the UF. I'm guessing that our security lockdown is preventing collection, but with no error messages anywhere it's hard to diagnose. Perfmon works on the target server, so we know the data is there. Splunk is 9.2.1, and the UF is running in "least privilege" mode (the new default). Any hints and pointers most welcome!
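A check that may be worth adding here (not from the original post; the group and account names are assumptions based on the documented defaults): in least-privilege mode the forwarder's service account generally needs to be a member of the local Performance Monitor Users group before it can read perfmon counters. A PowerShell sketch:
# assumes the UF runs as the default 'splunkfwd' account - substitute your actual service account
Get-LocalGroupMember -Group "Performance Monitor Users"
Add-LocalGroupMember -Group "Performance Monitor Users" -Member "splunkfwd"
Restart-Service SplunkForwarder   # restart so the new group membership takes effect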
Labels: Windows
01-25-2023
02:53 AM
Looking to use the GitHub-supplied Python script at https://github.com/imperva/incapsula-logs-downloader to fetch Incapsula logs into Splunk. This requires Python libraries that are not already in the Splunk-bundled Python 3.7. This is on Windows 2016.
Having broken one Splunk install by installing a standalone version of Python, I would prefer to use the built-in Python 3.7 that comes with Splunk.
How do I make these modules available to Splunk's Python? Or is there a documented best practice for installing Python outside Splunk?
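One common pattern (a sketch only, not an official recommendation; the script and directory names below are hypothetical) is to copy pure-Python dependencies into the app itself and put that directory on sys.path before importing them:
# bin/run_downloader.py inside a hypothetical app, with dependencies
# pre-copied into the app's lib/ directory (e.g. "pip install -t lib <package>")
import os
import sys

APP_LIB = os.path.join(os.path.dirname(os.path.abspath(__file__)), "..", "lib")
sys.path.insert(0, APP_LIB)
# imports from here on resolve against the vendored copies first
Note that this only works for pure-Python packages; compiled extensions such as M2Crypto or PyCrypto would still need a binary build matching Splunk's bundled interpreter.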
10-25-2022
02:30 AM
We recently moved our Windows event log service up to Windows 2016 and Splunk 9.0.1, and all Security Auditing events are now coming through with the message: Message=Splunk could not get the description for this event. Either the component that raises this event is not installed on your local computer or the installation is corrupt. The event data looks like this: the data is present, but without the usual field descriptions that allow Splunk to work out the structure. There are many posts on this, but they all date from over two years ago and refer back to a master post from 2014 (https://community.splunk.com/t5/Getting-Data-In/quot-FormatMessage-error-quot-appears-in-indexed-message-for/m-p/139982#M28765) that doesn't appear to cover current versions of Windows. I have, however, followed the broad advice in there:
- Checked the registry keys - they match the old server.
- Started Splunk after the event log service (I tried stopping and starting Splunk on a running host to mimic this).
- Confirmed that the event format is set to Events.
HF is Splunk 9.0.1 / Windows 2016 version 1607 Build 14393.5427; Splunk Cloud is version 9.0.2208.3.
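For what it's worth, a hedged sketch of the registry check described above (the subkey and value names are assumed from standard Windows event source registration, not taken from the post):
# PowerShell - inspect the message-file registration that event description rendering relies on
Get-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Services\EventLog\Security\Microsoft-Windows-Security-Auditing' |
    Select-Object EventMessageFile, ParameterMessageFile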
Labels: heavy forwarder, Windows
05-26-2022
02:05 AM
In part to respond to my own question: we have been able to get something working by using the default config in vCenter, but only with a format much closer to the default vCenter config than the one recommended by Splunk:
template(name="RSYSLOG_ForwardFormat1" type="string" string="<%PRI%>%syslogtag:1:32%%msg:::sp-if-no-1st-sp%%msg%")
#$WorkDirectory /var/spool/rsyslog
*.info @x.x.x.x:514;RSYSLOG_ForwardFormat1
This now means I have to rework the transforms.conf in the Splunk app, so it is far from ideal. I've got the sourcetype working, with some caveats, but I suspect that the whole piece about field extractions will now fail.
05-23-2022
02:31 PM
Not strictly a Splunk question, more a VMware vCenter one, but I'm hoping somebody has solved this before me!
We're working to get the logs from vCenter into Splunk using syslog, Kiwi, and the Splunk Add-on for vCenter Logs. We've figured out all the components:
- configured vCenter correctly, using rsyslog.config
- set Kiwi up to use Native messages and not add a date and time stamp
We were just about to start the app fetching the Kiwi logs when we found we could not control the severity level in rsyslog. We referred to the help cited - https://www.rsyslog.com/doc/v8-stable/configuration/modules/imfile.html - but this describes the $InputFileSeverity directive as legacy...
Regardless of what we set the $InputFileSeverity parameter to, it is ignored and everything right up to Debug (level 7) is sent. As that more than doubles the log size for no material benefit, I'd like to tell vCenter not to bother.
What is the correct syntax of the stanza in rsyslog.conf to set the severity level to Level 6 / Info or lower?
We tried
$InputFileSeverity 6
$InputFileSeverity Info
$InputFileSeverity Info,Warning
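A possible direction (a sketch only, not a confirmed answer; syntax assumes rsyslog v8 RainerScript): the imfile severity setting assigns a severity to messages read from a file rather than filtering them, so discarding Debug traffic is usually done with a filter placed before the forwarding action:
# rsyslog v8 - drop Debug-level messages before they reach any forward action
if $syslogseverity >= 7 then {
    stop
}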
Labels: configuration, troubleshooting
03-27-2018
05:19 AM
Yes - that was exactly what we ended up doing.
11-09-2017
07:18 AM
Windows event logs have a habit of repeating key/value pairs e.g.
11/08/2017 02:29:59 PM
LogName=Security
SourceName=Microsoft Windows security auditing.
EventCode=4624
EventType=0
Type=Information
ComputerName=server.emea.company.loc
TaskCategory=Logon
OpCode=Info
RecordNumber=178069065
Keywords=Audit Success
Message=An account was successfully logged on.
Subject:
Security ID: S-1-0-0
Account Name: -
Account Domain: -
Logon ID: 0x0
Logon Type: 3
Impersonation Level: Impersonation
New Logon:
Security ID: S-1-5-21-1234567-3099065758-1111111111-222222
Account Name: username
Account Domain: DOMAIN
Logon ID: 0x57203C56
Logon GUID: {2E25E5E0-50D0-A3D5-9757-339CB370EF0D}
Process Information:
Process ID: 0x0
Process Name: -
Network Information:
Workstation Name:
Source Network Address: 6.7.8.9
Source Port: 49329
Detailed Authentication Information:
Logon Process: Kerberos
Authentication Package: Kerberos
Transited Services: -
Package Name (NTLM only): -
Key Length: 0
where "Security ID", "Account name" and "Account Domain" are repeated under "Subject" and under "new Logon"
is there an easy way to construct a pre-index transform that finds the stanzas
Security ID: S-1-0-0
Account Name: -
Account Domain: -
Logon ID: 0x0
and just chops it out, leaving only the "before" and "after" to be indexed?
Thanks
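One possible approach (a sketch added for illustration, not from the original thread; the regex is an assumption and would need testing against real events): a props.conf SEDCMD on the heavy forwarder or indexer can rewrite _raw before indexing, so the repeated Subject block could be stripped out there.
# props.conf - hedged sketch only
[WinEventLog:Security]
SEDCMD-drop_subject_block = s/Subject:\s+Security ID:\s+\S+\s+Account Name:\s+\S+\s+Account Domain:\s+\S+\s+Logon ID:\s+\S+//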
10-25-2017
03:01 AM
Thanks, guys! It was a combination of both answers plus a change to props.conf (that I didn't share).
This is the Regex string that worked:
(?ms)(EventCode=4624).*(Account Name:.{5,30}\$)
It was then also necessary to reverse the order of the two transforms in props.conf: they appear to be order-specific. Define the blanket rule first, then the exceptions after; "keep everything except" appears to work, rather than "drop this narrow list and keep this broad list".
[WinEventLog:Security]
TRANSFORMS-Filter_4624 = Keep_4624,Drop_Some4624
There was then one other change I made: I discovered there were cases where events were logged with both "Account Name" fields populated, the first ending in $ and the second without, and these were being dropped when I wanted to keep them.
e.g.:
10/25/2017 09:18:41 AM
LogName=Security
SourceName=Microsoft Windows security auditing.
EventCode=4624
EventType=0
Type=Information
ComputerName=servername.emea.company.loc
TaskCategory=Logon
OpCode=Info
RecordNumber=7206502
Keywords=Audit Success
Message=An account was successfully logged on.
Subject:
Security ID: S-1-5-18
Account Name: servername$
Account Domain: EMEA
Logon ID: 0x3E7
Logon Type: 3
Impersonation Level: Impersonation
New Logon:
Security ID: S-1-5-x-8209345-y-z-a
Account Name: account
Account Domain: EMEA
Logon ID: 0x809217BA
Logon GUID: {00000000-0000-0000-0000-000000000000}
So I finished up with:
(?ms)(EventCode=4624).*(Account Name:).*(Account Name:.{5,30}\$)
Thanks for your help and hope this solution helps others.
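For readers following along, a hedged consolidation of the configuration described in this reply (the stanza names and the final regex come from the thread; the body of Drop_Some4624 is inferred from the earlier post, so treat it as a sketch rather than the verified config):
# props.conf
[WinEventLog:Security]
TRANSFORMS-Filter_4624 = Keep_4624,Drop_Some4624
# transforms.conf
[Keep_4624]
REGEX = EventCode=4624
DEST_KEY = queue
FORMAT = indexQueue
[Drop_Some4624]
REGEX = (?ms)(EventCode=4624).*(Account Name:).*(Account Name:.{5,30}\$)
DEST_KEY = queue
FORMAT = nullQueue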
10-24-2017
08:16 AM
I'm trying to filter a stream of events at a heavy forwarder before they head to our Splunk Cloud instance, to reduce the data volumes. It's AD Security event log data, where an event looks like this:
10/24/2017 01:52:11 PM
LogName=Security
SourceName=Microsoft Windows security auditing.
EventCode=4624
EventType=0
Type=Information
ComputerName=servername.emea.company.loc
TaskCategory=Logon
OpCode=Info
RecordNumber=7062543
Keywords=Audit Success
Message=An account was successfully logged on.
.
.
.
Logon Type: 3
Impersonation Level: Delegation
New Logon:
Security ID: S-1-x-21-8209345-x-y-370233
Account Name: WSabcd123$
Unfortunately, the 4624 event code covers both users and computers, with the computer accounts generating 80% of the events. I only want the user events.
The outcome I therefore want is: if EventCode=4624 and the Account Name ends in $, route the event to the nullQueue; otherwise route it to the indexQueue (i.e. the other form of the same event code, where the Account Name DOESN'T end in $).
The metadata key is in here so that, for now, I only do this to my test source.
Here's what I tried:
[Drop_Some4624]
REGEX = (EventCode=4624)(Account Name:.{5,30}\$)
SOURCE_KEY = MetaData:Host
FORMAT=servername
DEST_KEY = queue
FORMAT = nullQueue
[Keep_4624]
REGEX = EventCode=4624
DEST_KEY = queue
FORMAT = indexQueue
This feels like it must be possible, but...
- is it the metadata that's messing things up - have I misunderstood this filter key?
- or is it the REGEX?
- or is this just too complex?
- how might I approach this from another angle?
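A hedged observation, added here for illustration rather than taken from the thread: SOURCE_KEY = MetaData:Host makes the REGEX run against the host value instead of the raw event text, so a pattern looking for EventCode and Account Name can never match in that transform. Limiting the filter to a single host is normally done from props.conf instead, for example:
# props.conf - sketch only; "testserver" is a placeholder host name
[host::testserver]
TRANSFORMS-Filter_4624_test = Keep_4624,Drop_Some4624
# the transforms themselves then match against _raw (the default SOURCE_KEY)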
09-22-2017
09:54 AM
Sorry for taking so long - I've only just got around to installing it on the HF. It seems to have done the trick: the records are getting the right sourcetype, and hence the log entries are now being properly parsed.
Thanks!
09-08-2017
09:17 AM
That's the inputs.conf in the OP. Sorry - I didn't make that clear.
[monitor://D:\Syslog-logs\rsa]
disabled = false
host_segment = 3
index = hays_active_directory
sourcetype = rsa:securid:syslog
09-08-2017
08:53 AM
Thanks for your help:
The app is installed on the Splunk Cloud platform; it's not installed on the HF. If I installed it there as well, would the HF still forward the data to the cloud? Would they fight?
09-08-2017
08:23 AM
The syslog looks OK - type_1 and type_2 match in regex101.com.
Here's a sample (I've masked IPs, user IDs, and server names):
2017-09-08 14:57:47 User.Info 99.220.1.240 2017-09-08 14:57:47,701, , audit.admin.com.rsa.authmgr.internal.admin.principalmgt.impl.AMPrincipalAdministrationImpl, INFO, 19403756f001dc0a7cf063a0dc2891a9,94924becf001dc0a001b7418f802c658,99.104.16.235,99.221.1.240,UPDATE_AM_PRINCIPAL,20002,SUCCESS,,,,,,,,,,PRINCIPAL,511b7c049517be0a5d89ee28e32e5c69,f5fe44869517be0a078e4dc7f37ec085,000000000000000000001000e0011000,adm-xxxx,,,,,,
2017-09-08 14:57:48 User.Info 99.221.1.240 2017-09-08 14:57:48,064, , audit.runtime.com.rsa.authmgr.internal.oa.engine.OAProcessor, INFO, 2ee13f38f001dc0a05ccc2ed4a81ff1e,94924becf001dc0a001b7418f802c658,99.104.16.235,99.221.1.240,OA_DATA_DOWNLOAD,23016,SUCCESS,,,511b7c049517be0a5d89ee28e32e5c69,f5fe44869517be0a078e4dc7f37ec085,000000000000000000001000e0011000,adm-xxxx,masked,xxx,c25cc2579517be0a19f64e7a9a53db1c,000000000000000000001000e0011000,99.221.16.235,maskedxxxx1.emea.xxxx.loc,100,,,,,000249852704,,,,,,,,,,
and that matches what's in the raw data in Splunk.
So how do I get the app to spot and transform the data?
09-08-2017
06:56 AM
We've installed the Splunk Add-on for RSA SecurID on our Cloud instance, and we're feeding the events in from our RSA servers using a syslog aggregator running on a heavy forwarder.
The forwarder has the following in inputs.conf:
[monitor://D:\Syslog-logs\rsa]
disabled = false
host_segment = 3
index = hays_active_directory
sourcetype = rsa:securid:syslog
We see the events arriving in the cloud with this sourcetype, but none of the forms / fields / transforms in the cloud app seem to be working.
The app docs hint at setting the sourcetype, based on the event type, to one of three types:
"The add-on converts the rsa:securid:syslog source type to rsa:securid:runtime:syslog, rsa:securid:admin:syslog, or rsa:securid:system:syslog according to the log file source."
I don't see this happening. Do we have to change how RSA sends its syslog to help this happen, or capture the syslog messages differently?
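For context, a generic illustration of the mechanism such add-ons typically use (the stanza and transform names below are invented for illustration and are not the add-on's actual configuration): parse-time transforms that rewrite MetaData:Sourcetype. These only run where the data is parsed - here the heavy forwarder - which is why installing the add-on only in the cloud may leave the sourcetype unchanged.
# props.conf - illustrative only
[rsa:securid:syslog]
TRANSFORMS-example_set_sourcetype = example_runtime_sourcetype
# transforms.conf - illustrative only
[example_runtime_sourcetype]
REGEX = audit\.runtime\.
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::rsa:securid:runtime:syslog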
05-26-2017
11:37 AM
1 Karma
Found my own answer: https://answers.splunk.com/answers/453509/websense-stripping-ldap-ou-dc-strings-from-user-fi.html
I also downloaded the Websense CIM add-on for Splunk, which contains this already.
https://splunkbase.splunk.com/app/2966/
05-22-2017
01:52 AM
That's the crux of the issue: there's nothing obvious to use. In fact, characters within the value are, quite reasonably, being interpreted as the terminator.
I either need to teach the Forcepoint appliance to write its log better, or hope some Splunking hero has figured out how to get Splunk to parse this better.
Eric
05-19-2017
04:02 AM
1 Karma
We're getting the syslog exports from our Forcepoint appliances using their standardised SIEM integration. The output format generally works, except for "user".
Because the Forcepoint syslogger insists on listing the full LDAP path of the user, the value is littered with commas and equals signs but has no external delimiter:
user=LDAP://ldap.emea.company.loc OU=PST Disable Test,OU=Test,DC=emea,DC=Company,DC=loc/surname\, first
Splunk parses this as user= "LDAP://ldap.emea.company.loc"
I can force-replace the spaces with "_" using Forcepoint's escape sequences, which results in:
user=LDAP://ldap.emea.company.loc_OU=PST_Disable_Test,OU=Test,DC=emea,DC=company,DC=loc/surname_,_first
which Splunk interprets as:
user = "LDAP://ldap.emea.company.loc_OU=PST_Disable_Test"
Any suggestions as to how I can delimit or rewrite the field so that it contains something Splunk can recognise?
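One possible direction (a sketch for illustration only, not from this thread; the sourcetype stanza name is a placeholder and the pattern assumes the underscore-escaped variant shown above): strip the LDAP prefix at index time with a props.conf SEDCMD so that only the part after the final "/" is left in the user= field.
# props.conf - hedged sketch; replace the stanza with your actual sourcetype
[forcepoint:proxy:syslog]
SEDCMD-strip_ldap_path = s/user=LDAP:\S*\//user=/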