Requirement:
I have a log file that is constantly appended with data. I wish to send the log file details, as they are appended, to a destination server that runs either a typical TCP server or a syslog server. The universal forwarder only sends raw data, which is not what I want. If the log file is appended with "date: ipaddress", for example, then my TCP server will just receive the details as "date:ipaddress". Hence I am looking into installing a full Splunk instance (i.e. Splunk Enterprise) so that I can control the data I send over to my TCP server. However, do I need to create an indexer at my destination? My purpose is only to forward the appended data to the destination, and my destination will not run any Splunk instance. Also, if it is possible, how should I configure my config files, i.e. inputs.conf and outputs.conf?
thanks.
Hi Michael,
You are right that universal forwarders cannot extract or transform the data they forward. If you want to process the raw data to meet your specific requirement and forward the tailored data to a destination without indexing it, you only need to install a heavy forwarder - no indexer needed.
To deploy a heavy forwarder, install a full Splunk Enterprise instance, then configure it to use the forwarder license so it forwards data without indexing it locally.
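As a sketch of the inputs.conf and outputs.conf you asked about (the file path, destination host, and port below are placeholders - substitute your own values):

inputs.conf
[monitor://C:\logs\my.log]
sourcetype = mylog

outputs.conf
[tcpout]
defaultGroup = rawtcp

[tcpout:rawtcp]
server = <destination-host>:<port>
sendCookedData = false

Setting sendCookedData = false makes the forwarder send plain raw data rather than Splunk's cooked format, which is what a non-Splunk TCP server expects. If your destination is a syslog server instead, use a syslog output group in outputs.conf:

[syslog:mysyslog]
server = <destination-host>:514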
To transform your raw data - e.g., to rewrite events as "date:ipaddress" - define and apply a transform. Here is an example:
props.conf
[source::...\my.log]
TRANSFORMS-dateip = dateip
transforms.conf
[dateip]
REGEX = .*?(\d{1,2}\/\d{1,2}\/\d{4}).*?((?:[0-9]{1,3}\.){3}[0-9]{1,3})
DEST_KEY = _raw
FORMAT = $1:$2
The transform captures the date ($1) and the IP address ($2) from the original raw event and rewrites _raw using the captured values in the format you want.
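For instance, given a hypothetical raw line such as

12/25/2023 10:01:02 client=192.168.1.10 status=OK

the intended result of FORMAT = $1:$2 is that _raw is rewritten to

12/25/2023:192.168.1.10

and that rewritten event is what gets forwarded to your TCP server.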
Hope this helps. Thanks!
Hunter