I am currently testing log forwarding from a file I am monitoring. The software that writes those logs emits one line per event for each process, tagging every line with the ID of the process that generated it. Is there a way to combine/merge all messages with the same ID into a single event when they reach the indexer? (The IDs are unique; even when the same process runs multiple times, each run gets a new ID.) I'm looking for a solution that does not involve setting up a search to produce the same result: I want all the messages stored as one event that can then be searched for more detail.
There aren't any settings that will merge events at index time based on a field.
Consider writing a scripted input that reads the file and does the required processing before writing the results to stdout for indexing. You'll need a heavy forwarder to run the script.
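As a rough sketch of what such a scripted input could look like: the script below reads the log file, groups lines by the process ID embedded in each line, and prints each group as one multi-line block to stdout for indexing. The file path and the `pid=` field format are hypothetical placeholders; adjust the regex and the event delimiter to match your actual log format and your line-breaking settings in props.conf.

```python
import os
import re
from collections import OrderedDict

LOG_PATH = "/var/log/myapp/process.log"  # hypothetical path; point at your monitored file
ID_PATTERN = re.compile(r"pid=(\S+)")    # hypothetical ID format; adjust to your logs

def merge_events(lines):
    """Group log lines by their process ID, preserving first-seen order."""
    groups = OrderedDict()
    for line in lines:
        m = ID_PATTERN.search(line)
        key = m.group(1) if m else None   # lines without an ID fall into one bucket
        groups.setdefault(key, []).append(line.rstrip("\n"))
    return groups

def main():
    with open(LOG_PATH) as f:
        groups = merge_events(f)
    for _, lines in groups.items():
        # Emit one multi-line block per ID; configure line breaking in
        # props.conf so each block is indexed as a single event.
        print("\n".join(lines))
        print("---")  # event delimiter (assumption; match your LINE_BREAKER)

if __name__ == "__main__" and os.path.exists(LOG_PATH):
    main()
```

Note that this reads the whole file on each scheduled run, so it suits a log that is rotated or rewritten per run; a long-lived, continuously appended file would need checkpointing so events aren't re-emitted.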