
How to pull logs from the Cisco Cloud Web Security storage architecture

82padarthi
Explorer

I need to pull the logs from the Cisco Cloud Web Security storage architecture into Splunk.

Cisco ScanCenter allows you to extract your data logs from the Cisco Cloud Web Security storage architecture. You can configure secure, fully automated extraction of data logs for import and analysis with your SIEM platform, over an S3-compatible API. The log extraction service uses the Amazon Simple Storage Service (S3) protocol only for the purpose of API compatibility; it does not use Amazon Web Services.

Please suggest how to set this up.

1 Solution

friea
Splunk Employee

Yes - you can get Cloud Web Security (CWS) data into Splunk! The Cisco CWS PM team worked with Splunk’s newly released AWS Add-on – which can gather log data from generic AWS S3 buckets – to enable users to pull CWS logs into Splunk in the newest CWS release. The AWS Add-on is available at https://apps.splunk.com/app/1876/

Cisco's CWS engineers advise the following configuration changes to the Splunk connector:


You can configure all input parameters through Splunk Web or manually in inputs.conf, with the exception of host_name and is_secure, which can only be configured in inputs.conf.

When you configure inputs manually in inputs.conf, create a stanza using the following template and add it to $SPLUNK_HOME/etc/apps/Splunk_TA_aws/local/inputs.conf. If the file or path does not exist, create it. If needed, you can copy the default inputs.conf from /opt/splunk/etc/apps/Splunk_TA_aws/default.

[aws_s3://<name>]
disabled = 0
sourcetype = aws:s3
interval = 180
is_secure = True
host_name = vault.scansafe.com
bucket_name = [INSERT CWS ACCOUNT ID HERE]

<name> is a unique stanza name and can be any string. The bucket name is the customer’s account ID, so where bucket_name is referenced, insert the CWS account ID.
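
For example, a filled-in stanza might look like this (the stanza name cws_logs and the account ID 1234567890 below are placeholders, not real values):

[aws_s3://cws_logs]
disabled = 0
sourcetype = aws:s3
interval = 180
is_secure = True
host_name = vault.scansafe.com
bucket_name = 1234567890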

Also, in the file

/opt/splunk/etc/apps/Splunk_TA_aws/bin/taaws/s3util.py

change the connect_s3 line to:

         def connect_s3(key_id,secret_key,session_key,host="vault.scansafe.com",is_secure=True):
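
For context, here is a minimal sketch of what a connect_s3 function along these lines could look like, assuming the add-on builds its connection with boto's S3Connection class (the actual body of s3util.py in Splunk_TA_aws differs; only the signature change above comes from Cisco's guidance):

import boto.s3.connection

def connect_s3(key_id, secret_key, session_key, host="vault.scansafe.com", is_secure=True):
    # key_id and secret_key are the credentials entered on the add-on's Set Up page.
    # session_key is ignored in this sketch; the CWS portal only hands out an
    # access key and a secret key, so do not put a CWS value here.
    return boto.s3.connection.S3Connection(
        aws_access_key_id=key_id,
        aws_secret_access_key=secret_key,
        is_secure=is_secure,
        host=host,
        calling_format=boto.s3.connection.OrdinaryCallingFormat())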

More information on the CWS Log Extraction capability can be found in the CWS Admin Guide or online help.

Please contact your Cisco account team for more details on CWS Log Extraction, or contact your Splunk account team for more information on the AWS S3 connector.


friea
Splunk Employee

Great news: a brand new Cisco Cloud Web Security (CWS) Add-on for Splunk Enterprise was just published at https://splunkbase.splunk.com/app/2791/

ryanoconnor
Builder

This add-on was very straightforward to set up. I ran into a lot of issues with the AWS Add-on version 3.0.0, and this CWS Add-on saved me a lot of headaches. Highly recommend it.


jeremyarcher
Path Finder

Thanks to others here, I am able to get log files. However, almost all of them look like the following:

#Fields: datatime   c-ip    cs(X-Forwarded-For) cs-username cs-method   cs-uri-scheme   cs-host cs-uri-port cs-uri-path cs-uri-query    cs(User-Agent)  cs(Content-Type)    cs-bytes    sc-bytes    sc-status   sc(Content-Type)    s-ip    x-ss-category   x-ss-last-rule-name x-ss-last-rule-action   x-ss-block-type x-ss-block-value    x-ss-external-ip    x-ss-referer-host
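
Splunk will not extract those W3C-style fields automatically for the aws:s3 sourcetype. One possible approach (my own sketch, not part of either add-on) is a search-time delimiter extraction, assuming the data rows are tab-separated in the same order as the #Fields header; the field names below are underscore-style renames of the W3C names, which are easier to use in searches.

In transforms.conf:

[cws_w3c_fields]
DELIMS = "\t"
FIELDS = datatime, c_ip, cs_x_forwarded_for, cs_username, cs_method, cs_uri_scheme, cs_host, cs_uri_port, cs_uri_path, cs_uri_query, cs_user_agent, cs_content_type, cs_bytes, sc_bytes, sc_status, sc_content_type, s_ip, x_ss_category, x_ss_last_rule_name, x_ss_last_rule_action, x_ss_block_type, x_ss_block_value, x_ss_external_ip, x_ss_referer_host

In props.conf:

[aws:s3]
REPORT-cws_fields = cws_w3c_fields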

shaileshmali
Path Finder

CWS log collection is working now. I followed the steps below:

1) Update s3util.py
File path: /opt/splunk/etc/apps/Splunk_TA_aws/bin/taaws/s3util.py
Change the connect_s3 line to:
def connect_s3(key_id,secret_key,session_key,host="vault.scansafe.com",is_secure=True):

2) Use the app's UI setup option to configure:
Friendly Name = test
AWS Account Key ID = Enter key ID
AWS Account Secret Key = Enter Secret key

3) Create an inputs.conf file in the app's local directory, /opt/splunk/etc/apps/Splunk_TA_aws/local:

[aws_s3://CWSlogExtraction_input]
aws_account = test (same as the Friendly Name used in step 2)
sourcetype = aws:s3
initial_scan_datetime = default
max_items = 100000
max_retries = 10
queueSize = 128KB
persistentQueueSize = 24MB
interval = 18000
recursion_depth = -1
character_set = auto
is_secure = True
host_name = vault.scansafe.com
bucket_name = Enter bucket id
key_name = cws-logs/

4) Restart the Splunk service.

5) To verify that files are being accessed and indexed from the cloud bucket, open aws_s3.log and check that its entries show the files contained in the bucket.

6) On the search head, check index=main for sources starting with s3://.
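
For instance, a quick search along these lines (assuming the input writes to the default main index) confirms events are arriving:

index=main source="s3://*" | head 20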

jeremyarcher
Path Finder

Excellent work. This worked for me as well. I was hoping they would be formatted in something like a Squid format, but it's great to at least have this working.

Nicely done.


shaileshmali
Path Finder

I keep getting an error when I enable/disable the input from the UI:
Error occurred attempting to disable : In handler 'aws_s3': The following required arguments are missing: aws_account, bucket_name..


jeremyarcher
Path Finder

Hey shaileshmali, maybe we can figure this out together. Did you use the App UI to set up your initial configuration, including the "Friendly Name", "Access Key", and "Secret Key"? I did, and I do not have the error you're reporting.

As best I can tell, this UI 'Set up' modifies the $SPLUNK_HOME/etc/apps/Splunk_TA_aws/local/app.conf file. However, it isn't clear to me how this file is used by s3util.py.

For me, I did use the App's "Set Up" UI to add my "Friendly Name", "AWS Account Key ID" and "AWS Account Secret Key".

Also, it wasn't clear to me from the instructions provided in the OP if this line needed to be changed with the variables from my specific configuration:

  def connect_s3(key_id,secret_key,session_key,host="vault.scansafe.com",is_secure=True):

Do I add my Key ID and Secret Key in this line? If so, what is the session key?


laleet_pandey
New Member

I used the UI to configure "Friendly Name", "AWS Account Key ID", and "AWS Account Secret Key". I am not using a proxy.

My inputs.conf looks like this:
[aws_s3://]
disabled = 0
sourcetype = aws:s3
interval = 180
is_secure = True
host_name = vault.scansafe.com
bucket_name =

When i check data input AWS S3 from UI , i see these details.
Input Name - no data
AWS account - no data
S3 bucket - no data
S3 key name - this field shows the bucket ID here
Source type - aws:s3


jeremyarcher
Path Finder

OK, same as me then. I'm assuming you did populate "bucket_name =" with your bucket ID from the ScanSafe Log Extraction portal?

My stanza appears as:

[aws_s3://]
disabled = 0
sourcetype = aws:s3
interval = 20
is_secure = True
host_name = vault.scansafe.com
bucket_name = 216XXXXX547

I'm unable to get to the AWS S3 data inputs section at all as I get an error:

An error occurred while rendering the page template. See web_service.log for more details
View more information about your request (request ID = 55410b45a818a5f1a390) in Search

This page was linked to from http://vmprdlog01:8000/en-US/manager/launcher/adddata/selectsource?modinput=1&input_mode=1&input_type=aws_s3.


laleet_pandey
New Member

Can I create my own app and copy s3util.py into its bin directory? I could then use inputs.conf with the stanza given above. The Amazon AWS app uses a lot of stanzas in inputs.conf.


laleet_pandey
New Member

We can get an accessKey and secretKey from Cisco CWS. Can you give the mapping between the parameters below and the keys extracted from Cisco CWS?

key_id = ?
secret_key = ?
session_key = ?
host="vault.scansafe.com",
is_secure=True


jeremyarcher
Path Finder

Did you ever get this data? I'm working on the same thing.


82padarthi
Explorer

Thanks, friea, for your support.


jeremyarcher
Path Finder

Hey 82padarthi, just wondering if you ever got this working?

Thanks!


82padarthi
Explorer

Yes, it's working for me.


jeremyarcher
Path Finder

Would you mind posting your stanza from inputs.conf and the changes you made to s3util.py?

I have my bucket ID, CWS accessKey, and secretKey (downloaded via a .csv file from the Log Extraction page on our ScanSafe portal), but I see nothing about a session key.

Thanks!
