Getting Data In

What is the best architecture setup to handle multiple log sources and apply all data to specific field headers?

bbyttn
New Member

Hello - as you may see by my account status, I'm a complete newbie to Splunk.

I apologize for any confusion or use of incorrect terminology. Here are my questions as I've written them down along the way.

What is the best architecture setup to handle multiple log sources and apply all data to specific field headers? I ask this knowing that I want to define the fields first. These fields will cover the vast majority of incoming log sources, but I know some fields will be null, and that's OK.

I need this data to be under its respective field at index time (I think?), because I need the data not only viewable in Splunk Web but also defined/assigned correctly on the back end for further, separate processing.

Do I need multiple indexers? If so, how do I go about setting this up (step by step, please)?
How do I connect a multi-system Splunk architecture? A tutorial reference should be fine.
How do I ensure specific data is assigned to specific field headers?


hunters_splunk
Splunk Employee

Hi bbyttn,

You can ingest many different types of logs on a single indexer, and the beauty of Splunk is that you don't need to define your fields ahead of time. Only a few basic fields, such as source, sourcetype, host, and time, are captured at index time; all other fields can be extracted on the fly at search time.
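
For example, a search-time field extraction can be defined in props.conf on the search head. This is just a sketch; the sourcetype name and the regular expression are made-up placeholders you would adapt to your own data:

    # props.conf (search head) - hypothetical sourcetype and regex
    [my_vpn_sourcetype]
    EXTRACT-user_and_action = user=(?<user>\S+)\s+action=(?<action>\S+)

Events of that sourcetype then get a "user" field and an "action" field at search time, without anything being re-indexed.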

To understand index time vs. search time, please refer to the documentation here:
http://docs.splunk.com/Documentation/Splunk/6.5.1/Indexer/Indextimeversussearchtime

This section gives you an overview of how data moves through Splunk deployments - the data pipeline:
http://docs.splunk.com/Documentation/Splunk/6.5.1/Deploy/Datapipeline

The Search Tutorial may be a good way to get started with using Splunk.
http://docs.splunk.com/Documentation/Splunk/6.5.1/SearchTutorial/WelcometotheSearchTutorial

Hope this helps. Thanks!
Hunter


bbyttn
New Member

Hi Hunter,

Thank you for the response. I should clarify that I'm more of a newbie with administration and data handling than with using Splunk and its resources within the web interface.

I now have a better grasp of how the architecture will be set up. I do have a question about data ingestion, though. Can you point me in the right direction if I want to accomplish the following tasks?

  1. I want to receive data from a Cisco ASA FW for VPN logging purposes. This data arrives in different formats depending on the VPN log.

  2. I want each uniquely formatted VPN log indexed as a separate sourcetype. How is this possible, and where do I start making config changes to accomplish this?

  3. Furthermore, I want to ingest more VPN data from another Cisco ASA FW, but I need to tag this data so that it stays separate from the first Cisco FW's data. I'd prefer to tag it with a custom ID rather than differentiating by the 'host' or 'source' field. If this is done through specific config files, can you point out which ones and which locations? I've sketched below roughly what I think this might look like.
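
To illustrate, here is roughly what I'm picturing after reading the docs. This is only a sketch; the UDP port, sourcetype names, regex, and the "asa_site" field are placeholders I made up, not settings I've tested:

    # inputs.conf (forwarder or indexer): receive ASA syslog and stamp a custom indexed field
    [udp://9514]
    sourcetype = cisco:asa
    _meta = asa_site::fw_site_a

    # props.conf (indexer): route events of one feed through an index-time transform
    [cisco:asa]
    TRANSFORMS-set_vpn_sourcetype = set_vpn_webvpn

    # transforms.conf (indexer): rewrite the sourcetype for one VPN log format
    [set_vpn_webvpn]
    REGEX = %ASA-\d-716\d{3}
    FORMAT = sourcetype::cisco:asa:webvpn
    DEST_KEY = MetaData:Sourcetype

Is this the right general direction, or is there a cleaner way, for example separate inputs per firewall?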

Thank you!

Bobby
