Getting Data In: Session 1 - Wed 3/15/23

Published on ‎02-23-2023 09:29 AM by Community Manager | Updated on ‎06-26-2023 02:44 PM

Register and ask questions here. This thread is for the Community Office Hours session on Getting Data In (GDI) to Splunk Platform on Wed, March 15, 2023 at 1pm PT / 4pm ET.


Join our bi-weekly Office Hour series where technical Splunk experts answer questions and provide how-to guidance on a different topic every month! This is your opportunity to ask questions related to your specific GDI challenge or use case, like how to onboard common data sources (AWS, Azure, Windows, *nix, etc.), use forwarders and apps to get data in, use Data Manager (Splunk Cloud Platform), set up ingest actions, archive your data, and anything else you’d like to learn!


There are two 30-minute sessions in this series. You can choose to attend one or both (each session will cover a different set of questions):


Wednesday, March 15th – 1:00 pm PT / 4:00 pm ET

Wednesday, March 29th – 1:00 pm PT / 4:00 pm ET


Please submit your questions below as comments in advance. You can also head to the #office-hours user Slack channel to ask questions (request access here).


Pre-submitted questions with upvotes will be prioritized. After that, we will go in order of the questions posted below, then will open the floor up to live Q&A with meeting participants. If there’s a quick answer available, we’ll post as a direct reply.


Look forward to connecting!


Splunk Employee

Hey everyone!

Drop your questions/comments here for any topics you'd like to see discussed in the Community Office Hours session (you can also head to the #office-hours user Slack channel to ask questions and join the discussion - request access here).

New Member

I am looking for ways to whitelist and blacklist specific Windows event IDs for auditing purposes. Target logs are the Security, Application, and System logs.

Splunk Employee

Docs (starting point): Follow step-by-step instructions for "Example 5: Exclude Windows Event Code 4662 events whose Message field contains a specific value"

To ignore Windows Event Code 4662 events whose Message field contains the value Account Name: "example account", add the following line to the inputs.conf file:


blacklist1 = EventCode="4662" Message="Account Name:\s+(example account)"
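For context, here is a minimal sketch of where that line lives in inputs.conf on the forwarder. The Security channel stanza shown is illustrative; adjust the channel and regex to your own environment:

```ini
# inputs.conf on the universal forwarder (illustrative stanza)
[WinEventLog://Security]
disabled = 0
# Drop 4662 events whose Message contains the given Account Name
blacklist1 = EventCode="4662" Message="Account Name:\s+(example account)"
```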


Expert Solution: The Splunk UF on Windows allows for quick blacklisting by EventCode, per stanza:

blacklist1 = 4662,4663,4664,4756

or combined with a field/regex to filter Event Codes whose messages match a specific pattern.

You can also whitelist monitored files by path pattern (thoroughly test and use with appropriate QA!):

[monitor://{some log path}]
whitelist = \.log$


Splunk Employee

Office Hours Q&A

Q: Are there any macOS equivalents to WinEventLogs we can pull into Splunk? I mean one-to-one codes, so I could get the same functionality that WinEventLogs provide, but for Mac.

  • A: There are event IDs for Apple products, but I don't believe we have any of that functionality built into the UF right now (submit a feature request in Splunk Ideas).

Q: I only have a 2GB license, so I’m trying to limit the amount of data that’s ingested by filtering at the inputs.conf level rather than busting my license.

  • A: Use an ingest action to manipulate the data before you index it; ingest actions run before license metering. For example, you can forward the data in and, before it is actually indexed, manipulate it: change sourcetypes, filter events, and so on.
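A related, long-standing technique for keeping unwanted events off the license meter is routing them to nullQueue on a heavy forwarder or indexer via props.conf and transforms.conf. The sourcetype and transform names below are illustrative, not from the session:

```ini
# props.conf -- apply the filter to an (assumed) sourcetype
[my_sourcetype]
TRANSFORMS-filter_debug = drop_debug_events

# transforms.conf -- discard events matching the regex before indexing
[drop_debug_events]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

Events sent to nullQueue are never indexed and therefore do not count against license volume; test carefully, since over-broad regexes silently discard data.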

Q: When is it best to have logs ingested directly to the Splunk ES search head versus the Splunk Heavy Forwarder?

  • A: This is one of those “what works better for my deployment” scenarios. Splunk Best Practice is to utilize dedicated Search Head(s) for Enterprise Security and dedicated heavy forwarder(s) for data ingestion. Reasons for separation: Resource allocation, Access and configuration control, and Administration flexibility.

Q: For Data Manager, when will AWS Organizations be supported?

  • A: Check out the blog: Announcing the General Availability of Data Manager. Note that in order to get started, you must be a Splunk Cloud Platform customer with AWS as your provider and be in one of the supported AWS regions: US East (Virginia), US West (Oregon), UK (London), Europe (Dublin, Frankfurt, Paris), Asia Pacific (Singapore, Sydney, Tokyo), and Canada (Central).

Q: What is the recommended way and time to normalize logs? Ingest time, search time, etc.? Any documented best practices?

  • A: Check out the Lantern article "How to Normalize Logs".
  • There are four main ways to normalize data:
    1. (of course) Key-value pairs in the data itself, e.g. time={some time format}
    2. Index time: use regex in your props.conf to extract unusually located timestamps from the logs
    3. Use the Common Information Model (CIM) (link to Docs)
    4. Search time, e.g.:


	| eval formatted_timestamp=strftime(strptime(timestamp_field,"%Y-%m-%d %H:%M:%S"),"%m/%d/%Y")
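As a sketch of option 2 above, index-time timestamp extraction in props.conf might look like the following. The sourcetype name and log layout are assumptions for illustration:

```ini
# props.conf (illustrative sourcetype)
[my_custom_logs]
# Timestamp appears after the literal "ts=" in each event,
# e.g. ts=2023-03-15 13:00:00
TIME_PREFIX = ts=
TIME_FORMAT = %Y-%m-%d %H:%M:%S
# Only scan 19 characters past the prefix for the timestamp
MAX_TIMESTAMP_LOOKAHEAD = 19
```

Getting the timestamp right at index time avoids having to repair _time with search-time eval later.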