Getting Data In

Scripted input with bash script is not generating any results

ricotries
Communicator

I have a bash script that uses ausearch to query audit.log for events that I have configured in audit.rules with a specific key attached.
This is the general idea of the script:

# Assign path variables
# Capture saved timestamp from last execution
# Save new timestamp for future execution
# Execute query using ausearch
    # Redirect stdout and stderr to two different variables
# Check stderr variable does not equal "<no matches>" and exit execution if true
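Roughly, that logic looks like the sketch below. The paths, key name, and timestamp handling are hypothetical stand-ins for illustration, not the real script:

#!/bin/bash
# Sketch only: STATE_FILE and AUDIT_KEY are made-up names
STATE_FILE=/opt/myapp/state/last_run
AUDIT_KEY=my_watched_key

# Capture saved timestamp from last execution, then save a new one
last_ts=$(cat "$STATE_FILE" 2>/dev/null)
date '+%m/%d/%Y %H:%M:%S' > "$STATE_FILE"

# Execute ausearch, sending stdout and stderr to two different variables
err_file=$(mktemp)
stdout_var=$(ausearch -k "$AUDIT_KEY" -ts ${last_ts:-recent} 2>"$err_file")
stderr_var=$(<"$err_file")
rm -f "$err_file"

# ausearch writes "<no matches>" to stderr when nothing is found
[ "$stderr_var" = "<no matches>" ] && exit 0

echo "$stdout_var"

($last_ts is left unquoted on purpose so the saved "date time" pair splits into the two arguments -ts expects; "recent" is the ausearch keyword used here as a first-run fallback.)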

This is where I have tried multiple things; all of them work when executed from a terminal, but none of them generate any results when Splunk executes them.

echo "$stdout_var"    # quoted so multi-line ausearch output keeps its newlines

OR

echo "$stdout_var" > /path/to/tmp
cat /path/to/tmp

I have even tried monitoring "/path/to/tmp", and that is when I started to suspect a user-permissions issue: the file is created, but there is never any content in it.

Currently SPLUNK_OS_USER=root, but does that mean the script is executed as SPLUNK_OS_USER? Or do I have to configure the script through Splunk to run as a specific user?
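One quick way to check which user splunkd (and therefore any scripted input it launches) is running as:

ps -o user= -p "$(pgrep -o splunkd)"

(pgrep -o selects the oldest, i.e. the parent, splunkd process.)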

Again, when I execute this command manually from the CLI as root, it works exactly as expected, but it generates nothing when executed through the scripted input.
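For reference, the scripted input is defined along these lines in inputs.conf (the app path, script name, interval, sourcetype, and index here are placeholders, not the real configuration):

[script://$SPLUNK_HOME/etc/apps/myapp/bin/ausearch_query.sh]
interval = 300
sourcetype = linux:audit:ausearch
index = main
disabled = 0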

EDIT:
So I continue to debug to find the issue.
1. The script is being executed as root (I placed "echo $UID" at the top of the script, which showed up on Splunk Web as an event that simply returned 0)
2. I added "echo" commands at every step of the execution and found that it keeps exiting at the stderr variable check. This makes no sense: when I run exactly the same command with the same timestamp on the command line, it works as expected, but apparently when Splunk executes it as a scripted input, ausearch returns nothing. (See the environment-dump sketch below.)
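One way to see what differs between the two runs (the snippet below is a hypothetical debugging addition, not part of the original script) is to dump the environment at the top of the script and then diff a CLI run against a Splunk run:

# Hypothetical debug lines added at the top of the script
{
    echo "=== run at $(date) uid=$UID ==="
    echo "PATH=$PATH"
    env | sort
} >> /tmp/ausearch_debug.log 2>&1

Splunk typically launches scripted inputs with a much smaller environment and PATH than an interactive root shell, which is a common reason a command behaves differently under splunkd.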

I know this is starting to look like a bash scripting question, but from a Linux standpoint the script works as it should. I don't know what else to do at this point to make it work through Splunk.

1 Solution

ricotries
Communicator

After debugging as much as I could, I decided to change the way the data is processed:

  1. The script that queries audit.log using ausearch still exists, but it now writes its logs to a directory inside the same app the script lives in
  2. I created a script that installs a crontab entry with the schedule I originally intended for the scripted input; it runs as a scripted input with interval = -1, so each time Splunk starts or restarts it verifies that the cron jobs exist and creates them if necessary (see the sketch after this list)
  3. I set up monitor inputs on those log files
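A minimal sketch of that cron-installing wrapper, assuming a hypothetical script path and a five-minute schedule:

#!/bin/bash
# Runs as a scripted input with interval = -1, i.e. once per splunkd (re)start.
# The schedule and path below are stand-ins, not the real values.
CRON_LINE='*/5 * * * * /opt/splunk/etc/apps/myapp/bin/ausearch_query.sh'

# Install the cron entry only if it is not already present
if ! crontab -l 2>/dev/null | grep -qF "$CRON_LINE"; then
    ( crontab -l 2>/dev/null; echo "$CRON_LINE" ) | crontab -
fi

The monitor input for the logs the script writes would then look something like this (values again placeholders):

[monitor:///opt/splunk/etc/apps/myapp/log]
sourcetype = linux:audit:ausearch
index = main
disabled = 0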




jsmithn
Path Finder

Did you ever circle back to this issue and discover the actual cause? I have something similar: a script that works as expected when run from the CLI as root or as the splunk user behaves differently when executed through a Splunk app. I can work around it by writing the output to a logfile and then having Splunk ingest the content.


woodcock
Esteemed Legend

This is a very complicated topic, but your best bet is to leverage this app:
https://splunkbase.splunk.com/app/2642/
Be sure to understand why many (most) people opt to use something like rlog.sh, which is packaged here:
https://splunkbase.splunk.com/app/833/
See here for some explanation:
https://answers.splunk.com/answers/311061/why-does-splunk-ta-nix-rlogsh-cause-huge-amount-of.html


ricotries
Communicator

I am actually creating my own custom app for our *nix deployments. The previous admins used the apps you're mentioning, and they consumed too much data, far more than was really required. After going through the scripts in those apps and reviewing our teams' requirements for Splunk, I decided not to use them and instead drew on my experience as a Linux admin to write scripts that generate the data our teams actually want.
