Getting Data In

Can these ingestion-related tasks be completed through the REST API?

thisissplunk
Builder

Our organization creates new indexes almost daily for one-off/one-shot logs from different customers we work with. This leads to a lot of overhead around creating inputs.conf and indexes.conf stanzas, the occasional props.conf stanza for custom sourcetypes/logs, and especially filesystem-level access such as creating the new batch directories on the forwarder.
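
For context, the index and props.conf pieces are the parts we can already script against the management port. Below is a minimal sketch of the kind of calls involved, assuming a Python/requests client; the hostname, credentials, index name, and sourcetype settings are placeholders, not our real environment:

import requests

# Placeholder connection details for the cluster master's management port.
MASTER = "https://splunk-master.example.com:8089"
AUTH = ("admin", "changeme")

# Create a new one-off index (services/data/indexes).
requests.post(
    f"{MASTER}/services/data/indexes",
    auth=AUTH,
    verify=False,  # management port often uses a self-signed cert
    data={"name": "customer_acme_oneoff"},
).raise_for_status()

# Add a props.conf stanza for a custom sourcetype (services/configs/conf-props);
# the settings shown are illustrative only.
requests.post(
    f"{MASTER}/services/configs/conf-props",
    auth=AUTH,
    verify=False,
    data={
        "name": "acme:custom_log",
        "TIME_PREFIX": r"^\[",
        "MAX_TIMESTAMP_LOOKAHEAD": "30",
    },
).raise_for_status()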

We are automating all of this, but a few things still seem to require filesystem access. We are trying to avoid that kind of access but can't find other solutions for the following:

  1. Moving the config files updated by the API from .../system/local/ to .../master-apps/_cluster/local on the Master.
  2. Creating the batch directories on the forwarder
  3. Pushing the log files to the batch directories on the forwarder (a rough sketch of what items 2 and 3 look like today follows this list)
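
For reference, items 2 and 3 are the kind of thing we currently script over SSH/SFTP. A rough sketch of that access, assuming a paramiko-based session (hostname, paths, username, and key location are placeholders):

import paramiko

# Placeholder forwarder host and batch directory -- illustrative only.
FORWARDER = "hf01.example.com"
BATCH_DIR = "/opt/splunk/etc/apps/customer_inputs/batch/customer_acme_oneoff"

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(FORWARDER, username="splunk", key_filename="/home/automation/.ssh/id_rsa")
sftp = ssh.open_sftp()

# 2. Create the batch directory on the forwarder.
sftp.mkdir(BATCH_DIR)

# 3. Push the one-off log file into the batch directory.
sftp.put("/data/incoming/acme_export.log", f"{BATCH_DIR}/acme_export.log")

sftp.close()
ssh.close()

This is exactly the filesystem-level access we would like to eliminate.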

Is there any way to avoid these steps? Or is there a completely different way to approach this, given the constant need to create indexes, inputs, and props stanzas for various kinds of one-off logs?
