Greetings,
Does anyone know if it's possible to create a script that writes a Splunk search query based on an alert's results/table? For example:
"Multiple Failure Attempts" uses the "Authentication" data model to display results and only shows specific fields, such as: username, total failure attempts, source IP, destination, etc.
But I want to investigate further and check the raw logs to see more fields, so I have to write a new search query specifying the fields and their values to get all the information (index=* sourcetype=xxx user=xxx dest=xxx srcip=xxx), then look for more fields in the displayed results. I would like to automate this process.
Any suggestions for apps, scripts, or a recommended programming language?
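For concreteness, the manual drill-down I write today looks something like the following (the sourcetype and field values here are hypothetical placeholders, not real data):

```
index=* sourcetype=WinEventLog user=jdoe dest=10.0.0.5 srcip=192.0.2.10
| table _time user srcip dest action _raw
```

I'd like that query to be generated automatically from the alert's result fields.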
@PickleRick is correct that you cannot "automate" this from outside of Splunk, because an alert, by its nature, does not carry its full context. (Otherwise you wouldn't be asking this question.)
Meanwhile, let me take a totally different approach from the one you stated. Start from this dummy "alert" as an example:
index=_internal log_level=ERROR earliest=-5m
| stats count
| where count > 50
| sendalert dummyaction
Say that every time this alert is triggered, you want a search to send all the raw events behind the alert to your email. Schedule the following the same way the alert is scheduled:
index=_internal log_level=ERROR earliest=-5m
| stats count
| where count > 50
| map search="search index=_internal log_level=ERROR earliest=-5m"
| sendemail to="elvis@splunk.com" sendresults=true
There are many ways to refine and develop this idea: many different commands to choose from, and many ways to customize it according to what additional information your investigation needs. The bottom line: you don't run a "script" to respond to a triggered alert. Just use the same filter to trigger an action that gives you the appropriate level of detail.
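As one refinement, if the alert aggregates by a field, `map` can substitute the aggregated field values into the drill-down search using `$field$` tokens, so each triggering group gets its own raw-event search. A sketch against the same _internal data (the `component` field and the threshold are illustrative; adapt them to your data):

```
index=_internal log_level=ERROR earliest=-5m
| stats count by component
| where count > 50
| map maxsearches=10 search="search index=_internal log_level=ERROR component=$component$ earliest=-5m"
| sendemail to="elvis@splunk.com" sendresults=true
```

The `maxsearches` option caps how many subsearches `map` will launch, which is worth setting when the `stats ... by` clause can produce many rows.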
As a general task, it's simply impossible. How are you supposed to know whether your results came from a search like
index=windows | stats count
or
| makeresults | eval count=10 | table count
OK, this is an extreme example, but it should make my point fairly well: without a lot of assumptions, you can't know what data the results came from.
The main issue with your problem is not the tool (although you'll probably want something with ready-made libraries for interfacing with Splunk, so you don't have to reinvent the wheel). The main issue is the method you'd use to build such a search. That is what you'd have to give the most consideration to.