Hi all,
I work in a Security Operations Center for an organization that is targeted by a lot of spam and phishing mails.
Our users can report those phishing mails to us, and we then analyse them.
The issue is that plenty of spam mails get reported as well.
We would now like to filter out only the phishing ones.
Therefore, I have obtained a dataset and trained a model in Splunk.
Regarding the workflow, I am not sure how to integrate Splunk into the chain.
In our current situation, the raw metadata of the extracted features cannot be streamed to Splunk in real time; it has to be uploaded manually once a week.
I am looking for a solution along these lines:
1. Extract the features from newly reported mails
2. Call an ML algorithm in Splunk via REST, with the previously extracted features as parameters
3. Get the classification result back instantly
Can this be done within Splunk without the data first having to be forwarded and indexed?
Cheers,
Dan
Hi
"In our current situation, the raw metadata of the extracted features cannot be streamed in realtime to Splunk but has to be uploaded manually one a week."
Why is this? Most of the customers I work with extract their features from the streaming data in Splunk, where they are used to train the model (| fit). Can you elaborate?
If you go through the SDK/API documentation for Splunk, you will see that you can submit a search to Splunk from an external system and get the results back. An example search could create an empty result set (| makeresults), fill the relevant fields with the features (| eval), and then apply the ML model to get the result back (| apply).
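To make that concrete, here is a rough Python sketch of how an external system could submit such a search as a oneshot job over the REST API and read the classification back. The host, credentials, feature field names (num_links, num_attachments, subject_length) and the model name phishing_model are placeholders; substitute whatever your model was actually trained with.

# Minimal sketch: classify one reported mail by submitting a oneshot search
# to Splunk's REST API (management port 8089). Host, credentials, field
# names and model name below are assumptions, not from your environment.
import requests

SPLUNK_HOST = "https://splunk.example.com:8089"   # management port (assumption)
AUTH = ("svc_soc_ml", "changeme")                 # basic auth; a token header also works

def classify_mail(num_links, num_attachments, subject_length):
    # Ad-hoc search: empty result set -> fill feature fields -> apply saved MLTK model
    spl = (
        "| makeresults "
        f"| eval num_links={num_links}, num_attachments={num_attachments}, "
        f"subject_length={subject_length} "
        "| apply phishing_model"
    )
    resp = requests.post(
        f"{SPLUNK_HOST}/services/search/jobs",
        auth=AUTH,
        data={"search": spl, "exec_mode": "oneshot", "output_mode": "json"},
        verify=False,  # only if you have not set up trusted certificates
    )
    resp.raise_for_status()
    results = resp.json().get("results", [])
    # The predicted field name depends on how the model was fit,
    # e.g. "predicted(is_phishing)" for a classifier trained on is_phishing.
    return results[0] if results else None

print(classify_mail(num_links=7, num_attachments=1, subject_length=64))

With exec_mode=oneshot the search runs synchronously and the results come back in the same response, so nothing has to be forwarded or indexed first; the only data that touches Splunk are the feature values you put into the search string.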
I only just saw your reply.
Thank you very much! This was exactly the solution I was looking for.
Regarding your question, we have a process bottleneck in place that will take some time to remove.