Getting Data In

Can these ingestion-related tasks be completed through the REST API?

thisissplunk
Builder

Our organization creates new indexes almost daily for one-off/one-shot logs from the different customers we work with. This creates a lot of overhead: adding inputs.conf and indexes.conf stanzas, the occasional props.conf stanza for custom sourcetypes/logs, and especially filesystem-level work like creating the new batch directories on the forwarder.
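For context, the index and props creation we have scripted over REST looks roughly like the sketch below (host name, credentials, index and sourcetype names are placeholders, and it assumes the default management port 8089). Note that the resulting stanzas land in .../system/local/ on whichever instance we hit, which is where item 1 in the list below comes from.

```python
# Rough sketch of the per-customer REST calls we run today.
# Host, credentials, index name, and sourcetype are placeholders.
import requests

BASE = "https://splunk-idx.example.com:8089"   # hypothetical indexer/master host
AUTH = ("admin", "changeme")                   # placeholder credentials

# Create the index -- equivalent to adding an indexes.conf stanza.
requests.post(
    f"{BASE}/services/data/indexes",
    auth=AUTH,
    data={"name": "customer_acme_oneoff"},
    verify=False,  # self-signed cert on the management port
)

# Create a props.conf stanza for the one-off sourcetype via the generic
# config endpoint; the settings here are just example placeholders.
requests.post(
    f"{BASE}/services/configs/conf-props",
    auth=AUTH,
    data={
        "name": "acme:custom",
        "TIME_FORMAT": "%Y-%m-%d %H:%M:%S",
        "SHOULD_LINEMERGE": "false",
    },
    verify=False,
)
```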

We are automating all of this, but a few things still seem to require filesystem access. We would like to avoid that kind of access entirely but can't find alternatives for the following:

  1. Moving the config files updated by the API from .../system/local/ to .../master-apps/_cluster/local on the Master.
  2. Creating the batch directories on the forwarder
  3. Pushing the log files to the batch directories on the forwarder (see the sketch after this list)
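To make items 2 and 3 concrete: the batch input stanza itself can be created over REST, but the directory it points at and the files dropped into it cannot. Roughly (again with placeholder host, credentials, paths, and index name, and assuming the forwarder's management port 8089 is reachable):

```python
# Rough sketch of creating the [batch://...] stanza on the forwarder over REST.
# Host, credentials, path, index, and sourcetype are placeholders.
import requests

FWD = "https://forwarder.example.com:8089"   # hypothetical forwarder host
AUTH = ("admin", "changeme")                 # placeholder credentials

# Create the batch input stanza in inputs.conf via the generic config endpoint.
requests.post(
    f"{FWD}/services/configs/conf-inputs",
    auth=AUTH,
    data={
        "name": "batch:///opt/customer_logs/acme_oneoff",  # placeholder path
        "move_policy": "sinkhole",
        "index": "customer_acme_oneoff",
        "sourcetype": "acme:custom",
    },
    verify=False,  # self-signed cert on the management port
)

# The stanza is only half the job: /opt/customer_logs/acme_oneoff still has to
# be created on disk, and the log files still have to be copied into it --
# exactly the filesystem access we are trying to avoid (items 2 and 3).
```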

Is there any way to avoid these steps? Or is there a completely different approach, given the constant need to create indexes, inputs, and props for the various types of one-off logs?
