All Apps and Add-ons

Cisco Umbrella Log Management with App: Splunk Add-on for AWS

kennymann
New Member

Has anyone been able to successfully set up Cisco Umbrella (OpenDNS, cloud) with the Splunk Add-on for AWS?

I have set up Cisco Umbrella logging to Amazon S3, and the bucket is syncing data according to Cisco Umbrella.
https://support.umbrella.com/hc/en-us/articles/115004685266

My problems start with the configuration of the add-on's inputs, which fail with this error:

S3ResponseError: 403 Forbidden <?xml version="1.0" encoding="UTF-8"?> <Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>

hrithiktej
Communicator

Umbrella support gave me the same answer: due to permission restrictions on the Cisco-managed bucket, it is not possible to ingest the logs with this add-on. So I followed the workaround described in this article:

https://support.umbrella.com/hc/en-us/articles/360001388406-Configuring-Splunk-with-a-Cisco-managed-...

and now I have it all working.

This article covers the basics of getting Splunk up and running so it is able to consume the logs from your Cisco-managed S3 bucket (a rough sketch of steps 2 and 3 follows the list below). You will:

1) Set up your Cisco-managed S3 bucket in your dashboard.

2) Create a cron job to retrieve files from the bucket and store them locally on your server.

3) Configure Splunk to read from a local directory.
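
For steps 2 and 3, here is a minimal sketch of what that can look like. The local path (/opt/umbrella-logs), the AWS CLI profile name (umbrella), and the bucket/prefix placeholders are illustrative only; the sourcetype follows the opendns:s3 value used elsewhere in this thread.

# cron entry: pull new files from the Cisco-managed bucket every 10 minutes
*/10 * * * * /usr/local/bin/aws s3 sync s3://YOUR-CISCO-S3-BUCKET/YOUR-INCREDIBLY-LONG-FOLDER-NAME/ /opt/umbrella-logs/ --profile umbrella

# inputs.conf on the Splunk instance (or forwarder) that reads the local directory
[monitor:///opt/umbrella-logs]
sourcetype = opendns:s3
disabled = false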


hmorales1
New Member

When you are running this on your Splunk instance, is it downloading the .csv.gz files?


hrithiktej
Communicator

Yes. Below are examples of the files being downloaded. I download them to my syslog server, and from there a Splunk universal forwarder reads them and forwards to my Splunk indexer.

-rw-r--r--. 1 root root 628 Mar 15 14:50 2019-03-15-14-40-2593.csv.gz
-rw-r--r--. 1 root root 683 Mar 15 14:50 2019-03-15-14-40-32fa.csv.gz
-rw-r--r--. 1 root root 844 Mar 15 14:50 2019-03-15-14-40-7f7d.csv.gz
-rw-r--r--. 1 root root 930 Mar 15 14:50 2019-03-15-14-40-b3ab.csv.gz
-rw-r--r--. 1 root root 798 Mar 15 14:50 2019-03-15-14-40-dcd8.csv.gz
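
If it helps, here is a minimal outputs.conf sketch for the universal forwarder side; the indexer hostname and port are placeholders:

# outputs.conf on the universal forwarder: send events to the indexer
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = your-indexer.example.com:9997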


hmorales1
New Member

Awesome, I'm getting the same thing!

Question: how are you cleaning these folders up? Are you running a cron job?


hrithiktej
Communicator

No, I manually clean up every 30 days, though you could set up a cron job for a 30-day cleanup. Our Umbrella bucket retention is set to 30 days, so if I delete anything newer than that, the sync downloads the complete 30 days of log folders again. Once anything gets older than 30 days, I delete it.
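
If you do want to automate the cleanup, here is a minimal sketch (the local log path is a placeholder):

# cron entry: once a day, delete downloaded log files older than 30 days
0 2 * * * find /opt/umbrella-logs -type f -name '*.csv.gz' -mtime +30 -delete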


BonMot
Explorer

Cisco doesn't allow listing on the root of the bucket, and the Splunk AWS app doesn't let you type a bucket name into the UI. Luckily you can enter the bucket name manually in local/inputs.conf. Here is an example that sets a sourcetype compatible with TA-cisco-umbrella. It assumes your S3 bucket is

s3://YOUR-CISCO-S3-BUCKET/YOUR-INCREDIBLY-LONG-FOLDER-NAME

Everything in ALL-CAPS should be replaced. Don't forget the trailing slash (/) after the key_name.

[aws_s3://ANY-NAME-YOU-WANT]
start_by_shell = false
# the AWS account you created in the add-on's UI
aws_account = YOUR-ACCOUNT-NAME-CREATED-IN-UI
# sourcetype compatible with TA-cisco-umbrella
sourcetype = opendns:s3
initial_scan_datetime = default
max_items = 100000
max_retries = 3
polling_interval = 30
interval = 30
recursion_depth = -1
character_set = auto
is_secure = True
host_name = s3.amazonaws.com
ct_blacklist = ^$
ct_excluded_events_index =
# the key prefix (folder) inside the bucket; keep the trailing slash
key_name = YOUR-INCREDIBLY-LONG-FOLDER-NAME/
# the Cisco-managed bucket, entered here because the UI cannot list it
bucket_name = YOUR-CISCO-S3-BUCKET

hrithiktej
Communicator

@david_rose

Try this workaround, it works. It's the one mentioned in this article:

https://support.umbrella.com/hc/en-us/articles/360001388406-Configuring-Splunk-with-a-Cisco-managed-...

and now I have it all working.

I am downloading the logs onto my syslog servers with a cron job that runs every 10 minutes, and from there I have Splunk reading them.

If you need help with the cron job, let me know.


david_rose
Communicator

Thanks. If I can't use the existing AWS add-on, I might as well create a modular input. Just trying not to reinvent the wheel if I don't have to.


david_rose
Communicator

I also tried this with no luck. The logs always show 403 Forbidden. The settings are verified as working with S3 Browser.


hrithiktej
Communicator

Hi BonMot,

Thanks for your post, but I tried exactly this and it is still not working. Anything else you can suggest?


kennymann
New Member

After opening a ticket with Cisco Umbrella, this is the answer I received:

"Hello Kenny,
Currently it's not possible to retrieve Cisco managed S3 log files with Splunk. This is due to the restrictive permissions on the bucket itself. The Splunk AWS module expects to be able to list all buckets, which it cannot - as the Cisco Managed buckets are restricted to that singular bucket. However, you can access the logs with the Amazon cli tool. And other third party tools like S3 Browser."
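
For reference, accessing the logs with the AWS CLI looks roughly like this, assuming the key and secret from the Umbrella dashboard are configured as a profile (the profile name, local path, and bucket/prefix placeholders are illustrative):

# listing and copying work under your assigned prefix, even though broader bucket listing is restricted
aws s3 ls s3://YOUR-CISCO-S3-BUCKET/YOUR-INCREDIBLY-LONG-FOLDER-NAME/ --profile umbrella
aws s3 cp s3://YOUR-CISCO-S3-BUCKET/YOUR-INCREDIBLY-LONG-FOLDER-NAME/ /opt/umbrella-logs/ --recursive --profile umbrella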


nickhills
Ultra Champion

That’s rubbish!
You may need to change the access policy for your Splunk user so that it has list* and get* permissions on the bucket containing the logs, but there is no reason a bucket in your own S3 account cannot be read by Splunk with the correct policy settings.
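
For a bucket in your own account, a minimal sketch of such a policy (the bucket name is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": "arn:aws:s3:::YOUR-LOG-BUCKET"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::YOUR-LOG-BUCKET/*"
    }
  ]
}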

If my comment helps, please give it a thumbs up!

rajeevlalla
New Member

I think he's referring to the Cisco-provided S3 bucket, i.e. one managed and owned by the Cisco Umbrella service.
