All Apps and Add-ons

Has anyone gotten Splunk for Bluecoat working?

Path Finder

Hi.

I'm having a lot of problems with the Splunk for Bluecoat app.
After following the tips from this post i have gotten a little bit further, but still have a lot of problems.

  1. The searches in the app only work on the indexers. If I try to run the app on a search head, all searches fail with the error message: "Reached end-of-stream while waiting for more data from peer index1. Search results might be incomplete!" (one such message from each indexer, actually)
  2. Running the app on an indexer returns some data, but it is slow to the point of being useless: a search over the last 15 minutes takes about 5 minutes to complete. Other searches (in the Search app, for instance) do not appear to be especially slow, although they are probably not as complex.

So my question is quite simply: has anyone gotten this app to work in a distributed environment, and if so, how?

Any tips would be appreciated.

New Member

Hi,

we are able to use the BlueCoat App for Splunk, but there are still some fields that are not parsed correctly.
For the first basic reports, though, it's enough.
We are using it in this combination:

  • The ProxySGs are using a custom access-log server with the log format "bcreportermain_v1"
  • Because the custom log server cannot use local time, the ProxySG sends the logs in UTC, while Splunk itself runs in local time (GMT+1)
  • We have configured a local props file on Splunk: /opt/splunk/etc/apps/SplunkforBlueCoat/local/props.conf

[bcoat_proxysg]

TZ = UTC

REPORT-main = bcreportermain_v1


  • The data input for the dedicated TCP port has the sourcetype "bcoat_proxysg", and the index is "bcoat_logs"
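For completeness, that data input boils down to a small inputs.conf stanza like the following (a sketch only - the port 10514 is a placeholder; use whatever dedicated TCP port your ProxySG actually sends to):

[tcp://10514]

sourcetype = bcoat_proxysg

index = bcoat_logs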

Now, after restarting Splunk, most fields are correct (date, time, c-ip), but some fields are still not recognized correctly. For example, "action" now contains the values from the HTTP status code. We haven't found a solution yet, since we are quite new to Splunk, but when I compare the log format with the transforms.conf, the order of the fields seems right.
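A quick way to sanity-check where the extraction goes wrong (a sketch, assuming the index and sourcetype above) is to put the suspect fields next to the raw event and compare the positions by eye:

index=bcoat_logs sourcetype=bcoat_proxysg | head 5 | table _time action sc_status _raw

If "action" consistently shows the status-code values, the extraction is most likely shifted by one field relative to the configured log format.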

0 Karma

Path Finder

This seems tied to eventtype=bcoat_request in the "BlueCoat - Datacube" and "BlueCoat - Datacube - Summary Index" saved searches.

By editing the saved search and replacing eventtype=bcoat_request in both searches with the expansion from macros.conf, i.e.

sourcetype=bcoat_cacheflow OR (sourcetype=bcoat_proxysg  filter_result!="DENIED")

the application works. Editing default/savedsearches.conf directly didn't seem to take effect, even with a restart. Adding a local/savedsearches.conf with the corrected stanzas (which I produced by editing the saved searches in Manager) does have the desired effect.
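For reference, the definition behind eventtype=bcoat_request should be a stanza shaped roughly like this (a sketch based on the expanded search - the app's actual eventtypes.conf may differ):

[bcoat_request]

search = sourcetype=bcoat_cacheflow OR (sourcetype=bcoat_proxysg filter_result!="DENIED")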

local/savedsearches.conf now contains

[BlueCoat - DataCube]
action.email.inline = 1
alert.digest_mode = True
alert.suppress = 0
alert.track = 0
search = sourcetype=bcoat_cacheflow OR (sourcetype=bcoat_proxysg  filter_result!="DENIED") |  bin _time span=5m | makemv delim=";" allowempty=t category |  fillnull src_ip cs_bytes category  dest_host rs_bytes sc_bytes sc_status sr_bytes  | eval client_bytes=sc_bytes+cs_bytes | eval server_bytes=rs_bytes+sr_bytes | eval savings_bytes=client_bytes-server_bytes | eval savings_bytes=if(server_bytes==0,0,savings_bytes) |  eval savings_perc = (1/client_bytes)  * savings_bytes * 100  | stats count by host src_ip sourcetype category dest_host server_bytes client_bytes savings_bytes savings_perc _time

[BlueCoat - DataCube - Summary Index]
action.email.inline = 1
alert.digest_mode = True
alert.suppress = 0
alert.track = 0
search = sourcetype=bcoat_cacheflow OR (sourcetype=bcoat_proxysg  filter_result!="DENIED") |  bin _time span=5m | makemv delim=";" allowempty=t category |  fillnull src_ip cs_bytes category  dest_host rs_bytes sc_bytes sc_status sr_bytes  | eval client_bytes=sc_bytes+cs_bytes | eval server_bytes=rs_bytes+sr_bytes | eval savings_bytes=client_bytes-server_bytes | eval savings_bytes=if(server_bytes==0,0,savings_bytes) |  eval savings_perc = (1/client_bytes)  * savings_bytes * 100  | sistats count by host src_ip sourcetype category dest_host server_bytes client_bytes savings_bytes savings_perc _time

I have no idea why the use of the eventtype breaks the distributed search - this could be a bug in Splunk.

New Member

Hi, I am also receiving this error in our distributed search environment... has anyone from Splunk been able to address this bug? I am using just the standard Search app across a distributed environment but encounter this error rather frequently.

Even rerunning the search(es) doesn't seem to help.

0 Karma

Explorer

First, I want to say I gave up on this and rolled the functionality for my Blue Coat tracking into Enterprise Security. But before I gave up, I found out that the Blue Coat app is not designed to work in a distributed environment at all. I had it working pretty well once I uninstalled it from everything else and set it up to run solely on one indexer. A side effect of that is you have to send all your Blue Coat traffic to that one indexer. Hope this helps.

0 Karma

Explorer

I'm sorry that was over a year ago, I have no idea. I think there were still some quirks and I've just moved on from that app altogether.

0 Karma

Engager

@trademarq:

What all did you have to do to get the app to work, even in a non-distributed environment? I added data to an input, attached it to sourcetype="bcoat_proxysg", and indexed it into index="bcoat_logs".

When I open up the Blue Coat app I see:
"0 BlueCoats Reporting
Top Category: N/A
Top Client: N/A
Blocked Sites:"

The map below shows a few data points.

A search for bcoat_request displays all my data if the right time period is chosen.

Any suggestions?

0 Karma

Engager

I had problems too and gave up.
Since the description stated "Splunk and Blue Coat are teaming up..." I had hoped for much more.
With the last update being 12 months ago, I'm not sure this is going anywhere.

New Member

I've never gotten it to work on our search heads either, but it does work mostly fine on the indexers themselves. I'm tempted to give up on it, actually.

0 Karma

Explorer

Not an answer, but I'm having the exact same problem. Not sure what the issue is.