When using the Enterprise Security Protocol Intelligence dashboards, how do you build a complete email transaction log (e.g. sourcetype=sendmail qid=* | transaction qid) prior to sendmail logs being pulled into the datamodels? Since a single email 'transaction' is spread over numerous log events sharing the same qid, it would be advantageous to build the complete email 'event' prior to populating the data models. Any thoughts?
I want to give this one a bump. It is becoming a point of significant frustration. Sendmail and email filter logs have fields like srcuser, recipient, filename, size, etc. spread across multiple events. The Enterprise Security dashboards expect all of this data to be on one event that gets CIM normalized into one row of data. I built a transaction-based child node, but it can't be accelerated, and it also means I have to rewrite a lot of the ES email dashboards. Is there a practical and efficient way to build the transactions prior to normalization into the datamodel, so I can benefit from the ES dashboards and acceleration? This must be a common problem for many customers using ES.
I found a workaround for this, based on some of the Splunk documentation. I removed the email tag from the email logs directly. I then run a report that aggregates all the fields required by the Email datamodel, consolidating on the session id. Generally, this looks like:
```
index=mtasysloglog | stats values(emaildatamodelfield) as emaildatamodelfield by sid
```
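Fleshed out a bit, the aggregation might look like the sketch below. The index, sourcetype, and session id field names are placeholders for whatever your MTA logs actually use; the output field names (src_user, recipient, file_name, size) are standard CIM Email datamodel fields:

```
index=mtasyslog sourcetype=sendmail OR sourcetype=mailfilter
| stats values(src_user) as src_user
        values(recipient) as recipient
        values(file_name) as file_name
        max(size) as size
        min(_time) as _time
  by sid
```

Using min(_time) keeps the earliest event's timestamp as the timestamp of the consolidated transaction when it lands in the summary index.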
This report is then sent to a summary index (I've also accelerated it) and runs every 5 minutes. I then tag the summary index events with "email" and configure the ES CIM to include the summary index, using "index=summary source=savedreportname" as the constraint for an eventtype which assigns the tags.
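The eventtype and tag assignment can be expressed in configuration as a sketch like the following (the eventtype name and saved-search name are illustrative, not prescribed):

```ini
# eventtypes.conf
[summarized_email]
search = index=summary source=savedreportname

# tags.conf
[eventtype=summarized_email]
email = enabled
```

With the email tag enabled on the eventtype, the CIM Email datamodel's root constraint (tag=email) picks up the summarized events instead of the raw per-qid log lines.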
This duplicates some of the data in a summary index... but it vastly improved datamodel build performance. Not ideal, but it works.
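For anyone wiring this up, the scheduling and summary-index routing live in savedsearches.conf. A minimal sketch, assuming a 5-minute cron schedule and a small lag to let all events of a transaction arrive (stanza name and time windows are assumptions you should tune):

```ini
# savedsearches.conf
[savedreportname]
# aggregate per-qid events into one row per session id
search = index=mtasyslog sourcetype=sendmail | stats values(src_user) as src_user values(recipient) as recipient max(size) as size min(_time) as _time by sid
enableSched = 1
cron_schedule = */5 * * * *
# lag the window slightly so multi-event transactions are complete
dispatch.earliest_time = -10m@m
dispatch.latest_time = -5m@m
action.summary_index = 1
action.summary_index._name = summary
```

Note the trade-off in the lagged window: transactions that straddle the window boundary can still be split across two summary rows, which stats by sid in the consuming searches will tolerate but transaction-style reporting may not.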