Hello, I have an issue where my searches are suddenly offset by one field. In other words, the action field now contains the clientip value, and so forth. I do not use the IFX (Interactive Field Extractor) because on occasion its generated regex would be thrown off for a similar reason, so I decided to handle field extractions with a static, field-by-field list in transforms.conf. This has always worked, but today I noticed the offset, apparently caused by the single digit in the first day field.
The Squid syslog format on my appliance sends out data in the following manner: the first set of date fields uses MMM D HH:MM:SS (the single-digit day is space-padded, e.g. "Sep  6" with two spaces), while the second set uses MMM DD HH:MM:SS (zero-padded). For this reason, I believe the extra space before the single-digit day is being treated as an additional delimiter, producing an empty field that offsets the rest of the field extractions.
Here is a sample Squid syslog entry (using fake/sanitized data):
Sep  6 12:26:39 192.168.1.68 Sep 06 12:26:39 AN_SQUID_VIP_HOST_LOG 1378484799.674 339 172.16.40.40 www.testdomain.com 188.8.131.52 TCP_MISS/200 45667 GET /jobs/saved?cmd=save&save_job=4431425 - DIRECT/172.16.40.43 -
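To show what I think is happening, here is a minimal reproduction outside Splunk. This assumes the DELIMS-based extraction splits on every single space character, the way Python's `str.split(" ")` does; if so, the double space before a single-digit day yields an empty token that shifts every later field by one.

```python
# Assumption: DELIMS = " " behaves like splitting on each individual space.
# The space-padded single-digit day produces an empty token ('') that
# offsets every subsequent field by one position.
single_digit = "Sep  6 12:26:39".split(" ")
double_digit = "Sep 06 12:26:39".split(" ")

print(single_digit)  # ['Sep', '', '6', '12:26:39'] -- extra empty token
print(double_digit)  # ['Sep', '06', '12:26:39']   -- no offset
```

The zero-padded second date block splits cleanly, which would explain why only logs with single-digit days trigger the offset.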
Below are my transforms.conf/props.conf settings on the search head. I capitalized the first group of field names to distinguish the first set of date fields from the second set; only the lowercase fields get used by my Splunk for Squid app. I have heard of using DELIMS = "\s" or DELIMS = "\t" as a way to handle whitespace in fields, but that isn't working. Please advise. Thanks in advance!
# props.conf
REPORT-squidfields = squid_custom_fields
TIME_FORMAT = %b %e %H:%M:%S
KV_MODE = none

# transforms.conf, [squid_custom_fields] stanza
DELIMS = " "
FIELDS = Month,Day,Systime,host,month,day,systime,format,time,duration,server_ip,uri_host,clientip,action,bytes,method,uri_path,username,hierarchy,content_type
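One workaround I am considering (a sketch only, not yet tested against my data): replace the DELIMS-based transform with a REGEX-based one, since a regex can match one or more whitespace characters with \s+ and therefore tolerates the space-padded single-digit day. Splunk transforms accept named capture groups, which map directly to field names, so the FIELDS list above could become:

```
[squid_custom_fields]
# \s+ absorbs the extra space before a single-digit day
REGEX = ^(?<Month>\S+)\s+(?<Day>\d{1,2})\s+(?<Systime>\S+)\s+(?<host>\S+)\s+(?<month>\S+)\s+(?<day>\d{1,2})\s+(?<systime>\S+)\s+(?<format>\S+)\s+(?<time>\S+)\s+(?<duration>\d+)\s+(?<server_ip>\S+)\s+(?<uri_host>\S+)\s+(?<clientip>\S+)\s+(?<action>\S+)\s+(?<bytes>\d+)\s+(?<method>\S+)\s+(?<uri_path>\S+)\s+(?<username>\S+)\s+(?<hierarchy>\S+)\s+(?<content_type>\S+)
```

If this route is taken, the DELIMS and FIELDS settings would be removed from the stanza, since a transform uses either delimiter-based or regex-based extraction, not both.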