Getting Data In

What information do we need from respective server and application owners for installing and configuring Splunk forwarders to collect event logs?

kapuralasharad
Engager

I am new to Splunk. What information do we need from Application owners, for installing and configuring a Forwarder? We need all events to be logged and sent to Indexers, so while configuring the Forwarder, what questions do we pose to the respective server or Application owners? I'd appreciate any relevant documentation as well.

Thanks

1 Solution

jimodonald
Contributor

We try to get the following information from our application owners:

  • What are the pain points you are trying to resolve?
  • How do you currently analyze the data?
  • How many log files will you want ingested?
  • How many servers (UFs or otherwise) will be sending logs to Splunk?
  • What is the full path for each log file on each server? Provide sample logs for each file.
  • What is the length of time you want to keep them searchable?
  • What is your estimated daily volume of data?
  • Do you want/need assistance with setting up initial reports/alerts/dashboards or will your team be setting those up?

It helps to understand their use case. We follow up with them 3-6 months after it's in production to understand how the solution is working for them and the value it's added to their work. Typically they are raving fans by then and we get a nice value statement to show the positive impact to the business.
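Once the owners supply the full paths and desired retention, the forwarder-side configuration is straightforward. As a minimal sketch (the paths, index name, and sourcetypes below are placeholders you would fill in from the owners' answers, not values from this thread):

```
# inputs.conf on the universal forwarder.
# /opt/myapp/logs/*, myapp_prod, and the sourcetypes are hypothetical
# examples -- substitute what the application owner provides.
[monitor:///opt/myapp/logs/app.log]
index = myapp_prod
sourcetype = myapp:applog
disabled = false

[monitor:///opt/myapp/logs/error.log]
index = myapp_prod
sourcetype = myapp:errorlog
disabled = false
```

Collecting a sample of each log file up front (as the list suggests) lets you verify timestamp and line-breaking behavior before the sourcetype goes to production.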


miteshvohra
Contributor

Apart from that list of questions, here are some more that would be useful to a Splunk Solution Architect:

  • Is this a stand-alone Splunk instance or a distributed deployment?
  • How are the indexers placed (close to the application servers, or in another data center)?
  • By default, forwarders are limited to 256 KBps of throughput (the maxKBps setting in limits.conf). Is this sufficient, or can the customer/business unit allocate more?
  • Do you need the logs in an offline/archive format for compliance/regulatory reasons? If so, provision archival/backup storage accordingly.
  • Are there key terms within the application logs that should or can be discarded, or special event logs that should be routed and filtered to a dedicated index?
  • Is there a plan to migrate the application to a different platform (OS, language, on-premises to cloud, etc.)? If so, get it documented now.
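The discard/route point above is typically implemented with props.conf and transforms.conf on the parsing tier (indexers or heavy forwarders, since the universal forwarder does not parse events). A hedged sketch, where the sourcetype, index name, and regexes are hypothetical placeholders:

```
# props.conf -- myapp:applog is an assumed sourcetype name.
[myapp:applog]
TRANSFORMS-routing = route_security_events, drop_debug_noise

# transforms.conf
# Route events matching a pattern to a dedicated index.
[route_security_events]
REGEX = (?i)auth(entication)? failure
DEST_KEY = _MetaData:Index
FORMAT = security_index

# Discard low-value events before they count against license volume.
[drop_debug_noise]
REGEX = ^DEBUG
DEST_KEY = queue
FORMAT = nullQueue
```

Agreeing on these patterns with the application owner before go-live avoids indexing (and paying for) noise you will never search.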

Also suggest that OS and network logs be collected alongside the application logs; they make it much easier to troubleshoot application errors, latency problems, and other concerns in the future.
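If the business agrees to allocate more bandwidth to the forwarder, the throughput cap is raised in limits.conf. A minimal sketch (the value 512 is an illustrative choice, not a recommendation from this thread):

```
# limits.conf on the forwarder.
# The universal forwarder defaults to maxKBps = 256.
[thruput]
# Raise the cap to 512 KBps; 0 would remove the limit entirely.
maxKBps = 512
```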

jimodonald
Contributor

Those are great questions also. I'll certainly add them to my list of things to keep in mind.

