Splunk Enterprise Security

Why are fields not being extracted when running TA-squid with the Splunk App for Enterprise Security?

btiggemann
Path Finder

Hi all,

I am struggling with the field extractions in TA-squid.
I have tried TA-squid with Splunk 6.0 (which is the version required according to Splunkbase) and with 6.2, each on a fresh and clean Splunk installation.

The setup:
Squid is running on Ubuntu with a Splunk forwarder on the same host. The forwarder reads access_splunk.log and sends it to the indexer.
Ingestion itself looks fine: I can see the events with sourcetype=squid on my indexer. However, the only field I get is "tag", which contains proxy and web. None of the fields described in props and transforms show up. Permissions are not the issue either; TA-squid is shared globally.

I am not deep into regex, but could it be that the regex is simply wrong?

The Squid version in use is 3.3.8 on Ubuntu.

I have customized the Squid log format in squid.conf:

# The format definitions squid, common, combined, referrer, useragent are built in.
logformat squid %ts.%03tu %6tr 127.1.%{%H}tl.%{%M}tl %Ss/%03>Hs %<st %rm %ru %un %Sh/%<A %mt
logformat splunkES %ts.%03tu %09tr %09>a %09>st %09Ss %09>Hs %09<st %09rm %09ru %09un %09Sh %09<A "%09mt" "%09{User-Agent}>h" "%09{Referer}>h" "%09{Cookie}>h"
access_log /var/log/squid3/access_splunk.log splunkES
access_log /var/log/squid3/access.log squid
access_log none magellan
log_ip_on_direct on
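(For completeness: a changed logformat only takes effect once Squid reloads its configuration. On Ubuntu the binary is typically called squid3, so roughly, as root:)

squid3 -k parse        # sanity-check squid.conf for syntax errors
squid3 -k reconfigure  # reload the configuration without a full restart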

The events look like this in Splunk:

1433251217.917 000000086 1.1.1.114 000000259  TCP_MISS 000000200 000001396   CONNECT r2---sn-oxujvavbox-jbol.googlevideo.com:443 - HIER_DIRECT r2---sn-oxujvavbox-jbol.googlevideo.com "        -" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:38.0) Gecko/20100101 Firefox/38.0" "-" "-"

The timestamp is parsed correctly. However, I can't see the fields described in props.conf and transforms.conf:

props.conf:

[root@magsplunk1 default]# cat props.conf
[squid]
TIME_FORMAT = %s.%3N
MAX_TIMESTAMP_LOOKAHEAD = 15
KV_MODE = none
SHOULD_LINEMERGE = false
REPORT-squid = squid_fields
LOOKUP-vendor_info_for_squid_proxy = squid_vendor_info_lookup sourcetype OUTPUT vendor,product

transforms.conf:

[root@magsplunk1 default]# cat transforms.conf
[squid]

[squid_fields]

REGEX = ^\d+\.\d{3}\s+(\d*)\s+([0-9{1,3}\.]*)\s+([0-9]*)\s+([^0-9]*)\s(\d{3})\s(\d*)\s+(\w*)\s+((http|ftp|https)://[^\s]+)\s+([^\s]*)\s+(\w*)\s+([0-9{1,3}\.]*)\s+([^/]+/[^ ]+)\s\"([^"]+)\"\s+\"([^\"]+)\"\s+\"([^\"]+)\"

FORMAT = duration::$1 src::$2 bytes_in::$3 action::$4 status::$5 bytes_out::$6 http_method::$7 url::$8 protocol::$9 user::$10 hierarchy_code:$11 destination::$12 http_content_type::$13 user_agent::$14 http_referrer::$15 http_cookie::$16

[squid_vendor_info_lookup]
filename = proxy_vendor_info.csv

These are the default config files, but these fields are not being extracted. I have set sourcetype = squid in my inputs.conf:

[monitor:///var/log/squid3/access_splunk.log]
disabled = 0
host = mag-squid
index = proxy
sourcetype = squid
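(In case it helps with narrowing this down: one way to test whether a transform regex matches real events is to run a simplified version of it ad hoc with the rex search command, which needs named capture groups. A minimal sketch covering just the first few fields:)

index=proxy sourcetype=squid
| head 5
| rex "^\d+\.\d{3}\s+(?<duration>\d*)\s+(?<src>[0-9.]+)\s+(?<bytes_in>\d*)"
| table _raw duration src bytes_in

(If duration, src and bytes_in come back empty here, the full regex from transforms.conf cannot be matching the splunkES log format either.)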

o_calmels
Communicator

Hi btiggemann

Have you solved your parsing problem?
I'm interested in the solution!

Cheers.

Olivier.


ekost
Splunk Employee

I would begin by confirming that TA-squid has been deployed on the indexer. Also check that the props/transforms from TA-squid are being merged properly on the indexer by using "btool". If those configs are in place and being merged, please review the readme file included with TA-squid, specifically the log format requirements stated therein. And good luck!
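For reference, a rough sketch of the btool check, assuming a default install path on the indexer:

/opt/splunk/bin/splunk btool props list squid --debug
/opt/splunk/bin/splunk btool transforms list squid_fields --debug

The --debug flag shows which app and file each setting comes from, so you can confirm the TA-squid stanzas are the ones that win the merge.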
