Splunk Dev

Is there any guide available for Custom Data Source Integration with Splunk?

sayash27
Explorer

Is there any guide available for Custom Data Source Integration with Splunk? Which methods are available for integrating a custom data source, and what are the challenges involved?


skalliger
SplunkTrust

Hi,

I think this is a good start:

http://docs.splunk.com/Documentation/Splunk/6.6.1/Data/WhatSplunkcanmonitor and
http://dev.splunk.com/view/dev-guide/SP-CAAAE3A

So, out of the box, Splunk can index:

  • Files and directories
  • Network events (also streams)
  • Windows sources
  • Anything that uses Splunk's REST API

What kind of answer do you expect? Your question is really too abstract to answer precisely. Even if there were no suitable built-in method to get your data into Splunk, someone could use the Splunk SDK to write a modular input for your use case: http://dev.splunk.com/view/python-sdk/SP-CAAAER3

Skalli


sayash27
Explorer

Thanks Skalli for your input.

I was looking for a guide for integrating an in-house application, or any data source that is not supported by Splunk out of the box. What are the ways to integrate such applications or devices (like a custom parser), and what are the challenges involved?

Is any guide available for this?

Regards
Sayash


skalliger
SplunkTrust

We ran into the same issues when we started integrating many different applications.
The problem in enterprise environments is that you have many different applications: a few may only be able to send syslog data, while others are only accessible via DB Connect or other vendor-specific apps from Splunkbase.

A couple of big vendors have already documented some of this information in their documentation (how to get data into third-party tools).

We started like this: make a list of the available options for getting data into your Splunk environment. If possible, concentrate on a few of them (syslog, directory monitoring, UF, DB Connect, scripted inputs, ...).
You don't want to support 20 different ways of getting data into your Splunk environment across your company.

So, whenever a new application wants its data analyzed by Splunk, the person responsible for it can fill out a checklist of which options the application supports (database connection, syslog stream, HTTP Event Collector, OPSEC LEA, local directory monitoring, ...).
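The HTTP Event Collector option is worth a quick sketch: the application POSTs JSON events to Splunk's collector endpoint with a token. A minimal example using only the standard library; the URL and token below are placeholders, and HEC must be enabled on the Splunk side:

```python
import json
import urllib.request

def build_hec_payload(event, sourcetype="_json", index="main"):
    """Serialize one event into the JSON body HEC expects."""
    return json.dumps({"event": event, "sourcetype": sourcetype, "index": index})

def send_to_hec(event,
                url="https://splunk.example.com:8088/services/collector/event",
                token="00000000-0000-0000-0000-000000000000"):
    # url and token are placeholders for your own HEC endpoint and token.
    req = urllib.request.Request(
        url,
        data=build_hec_payload(event).encode("utf-8"),
        headers={"Authorization": "Splunk " + token,
                 "Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)  # raises on HTTP errors
```

The nice part for the checklist approach: HEC is pure HTTPS, so the application team only needs an outbound connection and a token, no forwarder on their host.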

We have a couple of standard inputs we offer applications/application owners:
- Syslog (with a specified port; we don't use 514 because Splunk doesn't run as root)
- DB Connect
- Vendor-specific app (like OPSEC LEA)
- Universal Forwarder (deployed on the application's host; useful with Domain Controllers, for example)
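For the syslog option, the inputs.conf stanza on the receiving forwarder could look like this; the port, index, and sourcetype are illustrative (any unprivileged port works, since Splunk isn't running as root):

```
# inputs.conf on a heavy forwarder (port 1514 is an example, not a standard)
[tcp://1514]
sourcetype = syslog
index = network
connection_host = dns
```

Many shops instead put syslog-ng or rsyslog in front and have Splunk monitor the written files, which survives forwarder restarts without dropping messages.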

However, sometimes you need to allow data into your system some other way. For example, if you have special applications (like anything on z/OS.. a pain in the ass sometimes).

Tl;dr: Look at the common ways to get data into Splunk, choose a couple of them, and build your infrastructure around them. We, for example, use a lot of Heavy Forwarders (HFs) in different (V)LANs where applications send their data to us, so we stay fairly flexible. If a product doesn't support syslog, we can check for an existing Splunkbase app, install it on the HF, and use a different method instead.

I don't think there are specific guides out there, at least I don't know of any. If you have a big project coming up, you might want to get Splunk involved when planning a big infrastructure.

Skalli

Edit: typo


sayash27
Explorer

Thanks a lot Skalli. Appreciate your detailed input 🙂
