<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Frozen archives into Amazon S3 in All Apps and Add-ons</title>
    <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41034#M1989</link>
    <description>&lt;P&gt;I have developed this script, coldToFrozenPlusS3Uplaod.py, which encrypts and uploads frozen buckets to S3.&lt;/P&gt;

&lt;P&gt;It can be found here: &lt;A href="https://github.com/marboxvel/Encrypt-upload-archived-Splunk-buckets"&gt;https://github.com/marboxvel/Encrypt-upload-archived-Splunk-buckets&lt;/A&gt; &lt;/P&gt;</description>
    <pubDate>Fri, 31 Aug 2018 18:53:37 GMT</pubDate>
    <dc:creator>sbutto</dc:creator>
    <dc:date>2018-08-31T18:53:37Z</dc:date>
    <item>
      <title>Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41024#M1979</link>
      <description>&lt;P&gt;Has anyone got a sample coldToFrozenScript that will copy frozen index archives to S3 before erasing them?&lt;/P&gt;</description>
      <pubDate>Tue, 21 Aug 2012 16:42:03 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41024#M1979</guid>
      <dc:creator>marksnelling</dc:creator>
      <dc:date>2012-08-21T16:42:03Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41025#M1980</link>
      <description>&lt;P&gt;Might have a look at Shuttl -- &lt;A href="http://blogs.splunk.com/2012/07/02/shuttl-for-big-data-archiving/"&gt;http://blogs.splunk.com/2012/07/02/shuttl-for-big-data-archiving/&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 Aug 2012 17:09:30 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41025#M1980</guid>
      <dc:creator>dwaddle</dc:creator>
      <dc:date>2012-08-21T17:09:30Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41026#M1981</link>
      <description>&lt;P&gt;This looks promising, but I'm not sure how it's deployed. Do I install Hadoop on my Splunk indexer and map it to S3, or does it need to be installed in EC2 and access S3 that way?&lt;BR /&gt;
I'm assuming Hadoop is required for S3, BTW.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Aug 2012 10:08:43 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41026#M1981</guid>
      <dc:creator>marksnelling</dc:creator>
      <dc:date>2012-08-22T10:08:43Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41027#M1982</link>
      <description>&lt;P&gt;Hadoop does not need to be installed on the Splunk Indexer. If the data is in S3, then you can use the standard ways of deploying Hadoop to operate on the data there. See a discussion here: &lt;A href="http://stackoverflow.com/questions/4092852/i-cant-get-hadoop-to-start-using-amazon-ec2-s3"&gt;http://stackoverflow.com/questions/4092852/i-cant-get-hadoop-to-start-using-amazon-ec2-s3&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;Also keep in mind that if you want to use the data in Hadoop, you will want to archive in CSV format. If you want the data to come back to Splunk, you can bring the CSV data back (however, it may incur compute load on import), or for more efficient index restoration, store in Splunk Bucket format.&lt;/P&gt;</description>
      <pubDate>Wed, 17 Oct 2012 05:03:24 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41027#M1982</guid>
      <dc:creator>bchen</dc:creator>
      <dc:date>2012-10-17T05:03:24Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41028#M1983</link>
      <description>&lt;P&gt;For more info on Shuttl setup, see: &lt;A href="https://github.com/splunk/splunk-shuttl/wiki/Quickstart-Guide"&gt;https://github.com/splunk/splunk-shuttl/wiki/Quickstart-Guide&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 17 Oct 2012 05:03:52 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41028#M1983</guid>
      <dc:creator>bchen</dc:creator>
      <dc:date>2012-10-17T05:03:52Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41029#M1984</link>
      <description>&lt;P&gt;hey @marksnelling - a bit late but here is our sample script.&lt;/P&gt;

&lt;P&gt;&lt;A href="https://bitbucket.org/asecurityteam/atlassian-add-on-cold-to-frozen-s3/overview"&gt;https://bitbucket.org/asecurityteam/atlassian-add-on-cold-to-frozen-s3/overview&lt;/A&gt;&lt;/P&gt;

&lt;P&gt;There are a few assumptions, like IAM user keys or roles deployed to your nodes... but I've tested it successfully across a large indexer cluster.&lt;/P&gt;</description>
      <pubDate>Tue, 02 Feb 2016 08:19:54 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41029#M1984</guid>
      <dc:creator>awurster</dc:creator>
      <dc:date>2016-02-02T08:19:54Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41030#M1985</link>
      <description>&lt;P&gt;Shuttl is deprecated and no longer in development, and I believe it won't work with Splunk &amp;gt; 6.2 due to python library incompatibilities.&lt;/P&gt;

&lt;P&gt;At this point you'd be better off rolling to S3 with s3cmd or s3cli as a script. In the future, perhaps there will be more functionality to include this as a roll-to-cold/frozen feature.&lt;/P&gt;</description>
      <pubDate>Tue, 02 Feb 2016 09:00:36 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41030#M1985</guid>
      <dc:creator>esix_splunk</dc:creator>
      <dc:date>2016-02-02T09:00:36Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41031#M1986</link>
      <description>&lt;P&gt;Nice script. On a side note, you might look at awscli instead of s3cmd. It's an officially supported binary, and it's multithreaded (better performance!)&lt;/P&gt;</description>
      <pubDate>Tue, 02 Feb 2016 09:03:20 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41031#M1986</guid>
      <dc:creator>esix_splunk</dc:creator>
      <dc:date>2016-02-02T09:03:20Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41032#M1987</link>
      <description>&lt;P&gt;Sure, anything is possible, but I would be more interested to see Splunk and AWS come together to make something more legit than what we've hacked together in 30 minutes.&lt;/P&gt;</description>
      <pubDate>Tue, 02 Feb 2016 09:13:30 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41032#M1987</guid>
      <dc:creator>awurster</dc:creator>
      <dc:date>2016-02-02T09:13:30Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41033#M1988</link>
      <description>&lt;P&gt;I downvoted this post because this approach is no longer officially supported and has too many dependencies attached (Java, etc.).&lt;/P&gt;</description>
      <pubDate>Tue, 16 Feb 2016 04:19:42 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41033#M1988</guid>
      <dc:creator>awurster</dc:creator>
      <dc:date>2016-02-16T04:19:42Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41034#M1989</link>
      <description>&lt;P&gt;I have developed this script, coldToFrozenPlusS3Uplaod.py, which encrypts and uploads frozen buckets to S3.&lt;/P&gt;

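For context, Splunk invokes a coldToFrozenScript with the frozen bucket's directory as its final argument, and the script must copy the data somewhere safe before Splunk deletes it. Below is a minimal sketch of that flow, with hypothetical names, and without the GPG encryption and boto upload steps the real script performs:

```python
# Minimal sketch of a coldToFrozen handler (hypothetical; NOT the actual
# coldToFrozenPlusS3Uplaod.py). Splunk passes the frozen bucket path as argv[1].
import os
import sys
import tarfile

def archive_name(bucket_path, hostname):
    # Prefix the archive with the hostname so buckets frozen on different
    # indexers don't collide under the same S3 prefix.
    return hostname + "_" + os.path.basename(bucket_path.rstrip("/")) + ".tar.gz"

def freeze(bucket_path, archive_dir, hostname):
    # Tar up the whole bucket directory before Splunk deletes it.
    os.makedirs(archive_dir, exist_ok=True)
    target = os.path.join(archive_dir, archive_name(bucket_path, hostname))
    with tarfile.open(target, "w:gz") as tar:
        tar.add(bucket_path, arcname=os.path.basename(bucket_path.rstrip("/")))
    # The real script would gpg-encrypt `target` and upload it to S3 at this point.
    return target

if __name__ == "__main__" and len(sys.argv) == 2:
    freeze(sys.argv[1], "/tmp/frozenarchive", os.uname()[1])
```

Restoring later generally means decrypting and untarring the bucket back into the index's thaweddb directory and rebuilding it.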
&lt;P&gt;It can be found here: &lt;A href="https://github.com/marboxvel/Encrypt-upload-archived-Splunk-buckets"&gt;https://github.com/marboxvel/Encrypt-upload-archived-Splunk-buckets&lt;/A&gt; &lt;/P&gt;</description>
      <pubDate>Fri, 31 Aug 2018 18:53:37 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41034#M1989</guid>
      <dc:creator>sbutto</dc:creator>
      <dc:date>2018-08-31T18:53:37Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41035#M1990</link>
      <description>&lt;P&gt;Hey &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/54339"&gt;@sbutto&lt;/a&gt;, I'm using your coldToFrozenPlusS3Uplaod.py&lt;BR /&gt;
to upload to S3 but I'm running into issues. Can anyone help me?&lt;BR /&gt;
Here are the attributes I have added:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;import sys, os, gzip, shutil, subprocess, random, gnupg
import boto
import datetime
import time
import tarfile

# applyLogging is a python script named applyLogging.py that exists at the same level as this script.
# If applyLogging.py doesn't exist where this file is located, the import statement will fail.
script_path = '/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py'
sys.path.append(os.path.dirname(script_path))  # the directory that contains applyLogging.py
import applyLogging

# CHANGE THIS TO YOUR ACTUAL ARCHIVE DIRECTORY!!!
ARCHIVE_DIR = "/splunk/index/splunk/archiveindex"
# ARCHIVE_DIR = os.path.join(os.getenv('SPLUNK_HOME'), 'frozenarchive')

log_file_path = '/opt/splunk/var/log/splunk/'

# gnu_home_dir = ''  # where the gpg directory is, for example /home/s3/.gnupg/
gnu_home_dir = '/home/splunkq/.gnupg'

# reciepient_email = ''  # the email the gpg uses to encrypt the files
reciepient_email = 'xxyxy@gmail.com'

# Enabling the logging system
logger = applyLogging.get_module_logger(app_name='SplunkArchive', file_path=log_file_path)

# Find the epoch value four months ago so we can compare the bucket timestamp against it.
# First we need today's epoch
today = round(time.mktime(datetime.datetime.today().timetuple()))

# Subtract 120 days
one_month_earlier = today - 120 * 86400

logger.info('Started on ' + str(datetime.datetime.today()))

# Get the hostname so we can prefix the uploaded file name with it
# to distinguish buckets coming from different indexers.
hostname = os.uname()[1]

# S3 creds
AWS_ACCESS_KEY_ID = "xxxx"
AWS_ACCESS_KEY_SECRET = "xxxx"
AWS_BUCKET_NAME = "s3://zfu-splunk-pa/"

# Creating the gpg object
gpg = gnupg.GPG(gnupghome=gnu_home_dir)&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Tue, 29 Sep 2020 23:12:07 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41035#M1990</guid>
      <dc:creator>Splunk_rocks</dc:creator>
      <dc:date>2020-09-29T23:12:07Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41036#M1991</link>
      <description>&lt;P&gt;Here is my error log:&lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;02-09-2019 13:51:36.711 -0500 ERROR BucketMover - coldToFrozenScript   File "/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py", line 150
02-09-2019 13:51:36.711 -0500 ERROR BucketMover - coldToFrozenScript     sys.stderr.write("mkdir warning: Directory '" + ARCHIVE_DIR + "' already exists\n")
02-09-2019 13:51:36.711 -0500 ERROR BucketMover - coldToFrozenScript       ^
02-09-2019 13:51:36.711 -0500 ERROR BucketMover - coldToFrozenScript SyntaxError: invalid syntax
02-09-2019 13:51:36.715 -0500 ERROR BucketMover - coldToFrozenScript cmd='"/bin/python" "/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py" /splunk/index/splunk/noindexdb/db/db_1543106304_1543105443_10' exited with non-zero status='exited with code 1&lt;/CODE&gt;&lt;/PRE&gt;</description>
      <pubDate>Tue, 29 Sep 2020 23:12:09 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41036#M1991</guid>
      <dc:creator>Splunk_rocks</dc:creator>
      <dc:date>2020-09-29T23:12:09Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41037#M1992</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/140799"&gt;@Splunk_rocks&lt;/a&gt;, &lt;/P&gt;

&lt;P&gt;Can you please tell me what your Splunk setup looks like? What OS is Splunk installed on? How did you configure the indexer to run the script? What Python packages did you add, and how?&lt;/P&gt;

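For context, a coldToFrozen script is wired to an index in indexes.conf, roughly along these lines (a sketch; the stanza name and paths here are illustrative, not taken from this thread):

```ini
# indexes.conf on the indexer (illustrative stanza)
[myindex]
coldToFrozenScript = "/bin/python" "/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py"
```

Splunk appends the frozen bucket's path as the final argument when it invokes the script.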
&lt;P&gt;But first, try using a file path for log_file_path.&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 23:07:45 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41037#M1992</guid>
      <dc:creator>sbutto</dc:creator>
      <dc:date>2020-09-29T23:07:45Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41038#M1993</link>
      <description>&lt;P&gt;Thanks for checking, @sbutto.&lt;BR /&gt;
Mine is running on native Linux.&lt;BR /&gt;
I have put your script under /opt/splunk/etc/apps/.&lt;BR /&gt;
I have configured indexes.conf to run python with the path&lt;BR /&gt;
/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py&lt;BR /&gt;
Can you please help me with this? I'm stuck.&lt;/P&gt;</description>
      <pubDate>Sun, 17 Feb 2019 23:58:48 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41038#M1993</guid>
      <dc:creator>Splunk_rocks</dc:creator>
      <dc:date>2019-02-17T23:58:48Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41039#M1994</link>
      <description>&lt;P&gt;As of now it's just standalone Splunk running on a single-instance indexer.&lt;/P&gt;</description>
      <pubDate>Mon, 18 Feb 2019 00:03:04 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41039#M1994</guid>
      <dc:creator>Splunk_rocks</dc:creator>
      <dc:date>2019-02-18T00:03:04Z</dc:date>
    </item>
    <item>
      <title>Re: Frozen archives into Amazon S3</title>
      <link>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41040#M1995</link>
      <description>&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/54339"&gt;@sbutto&lt;/a&gt; here are my inputs in the script: &lt;/P&gt;

&lt;PRE&gt;&lt;CODE&gt;# CHANGE THIS TO YOUR ACTUAL ARCHIVE DIRECTORY!!!
ARCHIVE_DIR = "/splunk/index/splunk/archiveindex"
# ARCHIVE_DIR = os.path.join(os.getenv('SPLUNK_HOME'), 'frozenarchive')

script_path = '/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/coldToFrozenPlusS3Uplaod.py'
log_file_path = '/opt/splunk/etc/apps/Encrypt-upload-archived-Splunk-buckets-master/'

# gnu_home_dir = ''  # where the gpg directory is, for example /home/s3/.gnupg/
gnu_home_dir = '/home/splunkqa/'

# reciepient_email = ''  # the email the gpg uses to encrypt the files
reciepient_email = 'xyz@@@domain.com'

# Enabling the logging system
logger = applyLogging.get_module_logger(app_name='SplunkArchive', file_path=log_file_path)

# Find the epoch value four months ago so we can compare the bucket timestamp against it.&lt;/CODE&gt;&lt;/PRE&gt;

&lt;P&gt;FYI - my Splunk is set up as a standalone indexer host; Splunk is installed under /opt/splunk&lt;BR /&gt;
and my indexes are configured under /splunk/index.&lt;/P&gt;</description>
      <pubDate>Tue, 29 Sep 2020 23:22:25 GMT</pubDate>
      <guid>https://community.splunk.com/t5/All-Apps-and-Add-ons/Frozen-archives-into-Amazon-S3/m-p/41040#M1995</guid>
      <dc:creator>Splunk_rocks</dc:creator>
      <dc:date>2020-09-29T23:22:25Z</dc:date>
    </item>
  </channel>
</rss>

