Getting Data In

Can these ingestion-related tasks be completed through the REST API?

thisissplunk
Builder

Our organization creates new indexes almost daily for one-off/one-shot logs from the different customers we work with. This creates a lot of overhead: adding inputs.conf and indexes.conf stanzas, the occasional props.conf stanza for custom sourcetypes/logs, and especially filesystem-level work such as creating the new batch directories on the forwarder.
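The index-creation part of this churn can be scripted against the management API. A minimal sketch (Python, assuming the documented `services/data/indexes` endpoint on the default management port 8089; the host, token, and index name below are placeholders):

```python
import urllib.parse
import urllib.request

SPLUNK_MGMT = "https://splunk.example.com:8089"  # placeholder management URL

def build_create_index_request(name: str, token: str) -> urllib.request.Request:
    """Build the POST that creates a new index via the REST API.

    A POST to /services/data/indexes with a `name` parameter creates the
    index, equivalent to adding an indexes.conf stanza by hand.
    Token auth is assumed here; basic auth works as well.
    """
    body = urllib.parse.urlencode({"name": name}).encode()
    return urllib.request.Request(
        f"{SPLUNK_MGMT}/services/data/indexes",
        data=body,
        headers={"Authorization": f"Bearer {token}"},
        method="POST",
    )

# A real run would send this with urllib.request.urlopen(req);
# here we only construct the request.
req = build_create_index_request("customer_oneoff", "PLACEHOLDER_TOKEN")
```

This covers the indexes.conf half of the automation; the open question is the parts that still touch the filesystem.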

We are automating all of this, but a few things still seem to need filesystem access. We are trying to avoid that type of access but can't find other solutions for the following:

  1. Moving the config files updated by the API from .../system/local/ to .../master-apps/_cluster/local on the Master.
  2. Creating the batch directories on the forwarder
  3. Pushing the log files to the batch directories on the forwarder

Is there any way to avoid these steps? Or is there a completely different approach, given that we constantly need to create indexes, inputs, and props stanzas for various types of one-off logs?
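For item 3 specifically, one commonly suggested way to avoid filesystem pushes is to send events over HTTP instead of dropping files into batch directories, e.g. via the HTTP Event Collector (which must be enabled on a heavy forwarder or indexer; it is not available on a universal forwarder). A sketch of that kind of POST, with placeholder host, HEC token, and index name:

```python
import json
import urllib.request

# Placeholder HEC endpoint (default HEC port is 8088).
HEC_URL = "https://forwarder.example.com:8088/services/collector/event"

def build_hec_request(event: str, index: str, token: str) -> urllib.request.Request:
    """Build a POST that sends one event to the HTTP Event Collector,
    replacing a file drop into a batch directory."""
    payload = json.dumps({"event": event, "index": index}).encode()
    return urllib.request.Request(
        HEC_URL,
        data=payload,
        headers={
            "Authorization": f"Splunk {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Construct only; sending would use urllib.request.urlopen(req).
req = build_hec_request("sample log line", "customer_oneoff", "PLACEHOLDER_HEC_TOKEN")
```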
