Knowledge Management

Best practice for FTP from numerous sources

msarro
Builder

Greetings everyone. I am working to aggregate .csv data from a number of sources. Initially it's just a few devices, but the number will grow to millions when the project is completed.

For now, I just need to get our test lab working with some essential infrastructure equipment. All of the equipment is configured to regularly export .csv files via FTP. I would like to set up a directory on my test server to receive these files, and have Splunk monitor that directory. I'm pretty sure this is possible, but it leads me to my next question.

If I have numerous different devices all dumping files to the same directory, how does Splunk tell what data came from what device?

2 Solutions

Brian_Osburn
Builder

I'd suggest you set up sub-directories underneath the main directory, one for each system dumping its .csv files there.

This way, when setting up Splunk to ingest those .csv files, it can extract the host from the sub-directory name using the "Segment on path" option in the input setup.

You can get more information here: http://www.splunk.com/base/Documentation/latest/Admin/Setadefaulthostforaninput
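A minimal inputs.conf sketch of this layout (the directory names are illustrative, not from the thread). With one sub-directory per device under /data/ftp_drop, host_segment tells Splunk which path segment to use as the host:

```ini
# inputs.conf -- monitor the FTP drop directory recursively
[monitor:///data/ftp_drop]
sourcetype = csv
# /data/ftp_drop/router1/export.csv -> host = router1
# (segment 1 = data, segment 2 = ftp_drop, segment 3 = router1)
host_segment = 3
```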

Brian


southeringtonp
Motivator

  • If it's manageable, have one directory for each host, and use host_segment in inputs.conf to assign hostnames.
  • If the hostname is anywhere in the file path, you can use host_regex in inputs.conf.
  • If the hostname appears in each event, you can use a transform to assign the host. This is per-event.
  • Take a look at these doc entries also:
    http://www.splunk.com/base/Documentation/4.1.5/Admin/Setadefaulthostforaninput

    http://www.splunk.com/base/Documentation/4.1.5/admin/Overridedefaulthostassignments
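A sketch of the last two options above, assuming a drop directory of /data/ftp_drop and CSV events whose first column is the device name (both the paths and the column position are illustrative assumptions):

```ini
# inputs.conf -- derive host from the file path via the first capture group
[monitor:///data/ftp_drop]
sourcetype = device_csv
host_regex = ftp_drop/([^/]+)/

# props.conf -- apply a per-event host override to this sourcetype
[device_csv]
TRANSFORMS-sethost = set_host_from_event

# transforms.conf -- take the host from the first CSV column of each event
[set_host_from_event]
REGEX = ^([^,]+),
DEST_KEY = MetaData:Host
FORMAT = host::$1
```

Note that host_segment and host_regex are mutually exclusive on a single monitor stanza, so pick whichever matches how the hostname appears.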



msarro
Builder

Thank you for your help, I really appreciate it.


msarro
Builder

Thank you for your help, I really appreciate it. Both of these solutions work, and I'm going to set up a hierarchical structure just to keep things organized. Thanks!