s3_file (2.5.3)
Installs/Configures s3_file LWRP

Install with Berkshelf:
cookbook 's3_file', '= 2.5.3'

Install with Policyfile:
cookbook 's3_file', '= 2.5.3', :supermarket

Install with Knife:
knife supermarket install s3_file
knife supermarket download s3_file
= DESCRIPTION:
An LWRP that can be used to fetch files from S3.
I created this LWRP to solve the chicken-and-egg problem of fetching files from S3 on the first Chef run on a newly provisioned machine. Ruby libraries that are installed on that first run are not available to Chef during the run, so I couldn't use a library like Fog to get what I needed from S3.
This LWRP has no dependencies beyond the Ruby standard library, so it can be used on the first run of Chef.
= REQUIREMENTS:
An Amazon Web Services account and something in S3 to fetch.
Multi-part S3 uploads do not put the MD5 of the content in the ETag header. If x-amz-meta-digest is provided in the User-Defined Metadata on the S3 object, it is processed as if it were a Digest header (RFC 3230).
The MD5 of the local file is checked against the MD5 from x-amz-meta-digest if that metadata is present; if not, it is checked against the ETag. If there is no match, or the local file is absent, the file will be downloaded.
If credentials are not provided, s3_file will attempt to use the first instance profile associated with the instance. See documentation at http://docs.aws.amazon.com/IAM/latest/UserGuide/instance-profiles.html for more on instance profiles.
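For example, a minimal sketch of a resource that relies on an instance profile might look like the following; the path, bucket, and key below are placeholders:

# Hypothetical example: no aws_access_key_id/aws_secret_access_key are set,
# so s3_file falls back to the node's instance profile credentials.
s3_file "/opt/app/config.json" do
  remote_path "/configs/app.json"
  bucket "my-example-bucket"
  action :create
end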
= USAGE:
s3_file acts like other file resources. The only supported action is :create, which is the default.
Attribute Parameters:
* `aws_access_key_id` - your AWS access key id. (optional)
* `aws_secret_access_key` - your AWS secret access key. (optional)
* `token` - token used for temporary IAM credentials. (optional)
* `bucket` - the bucket to pull from.
* `s3_url` - Custom S3 URL. If specified this URL *must* include the bucket name at the end. (optional)
* `remote_path` - the S3 key to pull.
* `owner` - the owner of the file. (optional)
* `group` - the group owner of the file. (optional)
* `mode` - the octal mode of the file. (optional)
* `decryption_key` - the 32 character SHA256 key used to encrypt your S3 file. (optional)
Example:
s3_file "/tmp/somefile" do
remote_path "/my/s3/key"
bucket "my-s3-bucket"
aws_access_key_id "mykeyid"
aws_secret_access_key "mykey"
s3_url "https://s3.amazonaws.com/bucket"
owner "me"
group "mygroup"
mode "0644"
action :create
decryption_key "my SHA256 digest key"
decrypted_file_checksum "SHA256 hex digest of decrypted file"
end
= MD5 and Multi-Part Upload:
s3_file compares the MD5 hash of the local file, if present, with the ETag header of the S3 object. If they do not match, the remote object is downloaded and notifications are fired.
In most cases, the ETag of an S3 object is identical to its MD5 hash. However, if the file was uploaded to S3 via multi-part upload, the ETag is no longer the MD5 of the complete object, so the MD5 of the local file and the ETag of the remote object will never match.
To work around this issue, set x-amz-meta-digest metadata on your S3 object with its value set to `md5=MD5 of the entire object`. s3_file will then use that value in place of the ETag and will skip the download when the MD5 of the local file matches the value of the x-amz-meta-digest header.
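As an illustration only (this is not part of the cookbook), an upload that sets this metadata using the aws-sdk Ruby gem might look like the sketch below; the bucket, key, and file names are placeholders:

require 'aws-sdk'
require 'digest'

# MD5 of the complete local file, used as the x-amz-meta-digest value.
md5 = Digest::MD5.file('bigfile.tar.gz').hexdigest

# upload_file switches to multi-part upload for large files; the metadata
# hash is stored on the object as x-amz-meta-digest.
s3 = Aws::S3::Resource.new(region: 'us-east-1')
s3.bucket('my-example-bucket')
  .object('bigfile.tar.gz')
  .upload_file('bigfile.tar.gz', metadata: { 'digest' => "md5=#{md5}" })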
= USING ENCRYPTED S3 FILES:
s3_file can decrypt files that have been encrypted using an AES-256-CBC cipher. To use the decryption part of the resource, you must provide a decryption_key, which can be generated by following the instructions below. You can also include an optional decrypted_file_checksum, which allows Chef to determine whether it needs to re-download the encrypted file. Note that this checksum is different from the one in S3: because the comparison is made against the already decrypted file, a SHA256 checksum is used instead of the MD5. Instructions for generating the decrypted_file_checksum are below as well.
To use s3_file with encrypted files:
1. Create a new key using `bin/s3_crypto -g > my_new_key`.
1. Create a SHA256 hex digest checksum of your source file by calling `bin/s3_crypto -c -i my_source_file [ -o my_checksum_file ]`.
1. Encrypt your file using the new key by calling `bin/s3_crypto -e -k my_new_key -i my_source_file [ -o my_destination_file ]`.
1. You can test decryption of your file using `bin/s3_crypto -d -k my_new_key -i my_encoded_file [ -o my_decoded_destination ]`.
1. Upload your encrypted file to S3 as normal.
1. In the s3_file resource call, provide the string within `my_new_key` as the decryption_key of the resource.
1. In the s3_file resource call, provide the string within `my_checksum_file` as the decrypted_file_checksum of the resource.
Note that when you make the s3_file call, it is best to make decryption_key a node attribute and provide it via an encrypted data bag, or to pull the key from the environment. It is not wise to check your decryption key into your recipes.
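For example, a hypothetical sketch that pulls the key material from an encrypted data bag; the data bag name, item, and fields below are placeholders, not part of this cookbook:

# Hypothetical: the 'credentials' bag, 's3_encryption' item and its fields
# are placeholders for wherever you keep the key material.
secrets = Chef::EncryptedDataBagItem.load('credentials', 's3_encryption')

s3_file "/opt/app/secret.dat" do
  remote_path "/encrypted/secret.dat"
  bucket "my-s3-bucket"
  decryption_key secrets['decryption_key']
  decrypted_file_checksum secrets['decrypted_file_checksum']
  action :create
end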
To create your cipher key, run `bin/s3_crypto -g > my_new_key` and a new 256-bit key (32 hexadecimal characters) will be generated for you. Paste that key into a file for later use. Do NOT include a trailing newline in the file, otherwise encryption and decryption will fail.
You can use the utility `bin/s3_crypto` to encrypt files before uploading them to S3 and to decrypt files afterwards to verify that the encryption is working.
= ChefSpec matcher
s3_file comes with a matcher to use in {ChefSpec}[https://github.com/sethvargo/chefspec].
This spec checks the code from the USAGE example above:
it 'downloads some file from s3' do
  expect(chef_run).to create_s3_file('/tmp/somefile')
    .with(bucket: "my-s3-bucket", remote_path: "/my/s3/key")
end
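For context, a minimal spec file around this example might look like the following sketch; the cookbook and recipe names are placeholders:

require 'chefspec'

describe 'my_cookbook::default' do
  # SoloRunner converges the recipe in memory so the matcher can inspect
  # the resource collection without touching a real node.
  let(:chef_run) { ChefSpec::SoloRunner.new.converge(described_recipe) }

  # ... the 'downloads some file from s3' example from above goes here ...
end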
Dependent cookbooks
This cookbook has no specified dependencies.
= CHANGELOG:
2015-03-20 version 2.5.3
- Fix deprecated digest call.
- Merged https://github.com/adamsb6/s3_file/pull/41. README enhancements. @eherot
- Merged https://github.com/adamsb6/s3_file/pull/43. Performance fix for rest client install. @scottymarshall
version 2.5.2
- Add retries for downloads
2014-12-09 version 2.5.1
- Merged https://github.com/adamsb6/s3_file/pull/36. Fix compatibility with Chef 12.
2014-10-01 version 2.5.0
- Merged https://github.com/adamsb6/s3_file/pull/31. This provides an optional s3_url value for a recipe to use S3 buckets other than US based ones.
- Merged https://github.com/adamsb6/s3_file/pull/29. Add ChefSpec matcher for testing.
2014-04-17 version 2.4.0
- Merged pull request https://github.com/adamsb6/s3_file/pull/25. This provides new functionality to automatically decrypt an encrypted file uploaded to S3.
2014-03-18 version 2.3.3
- Merged pull request https://github.com/adamsb6/s3_file/pull/24. This corrects documentation for use of X-Amz-Meta-Digest to identify md5 in multi-part uploads.
2014-02-20 version 2.3.2
- Added documentation for multi-part ETag/MD5 issue.
- Added changelog, backdated to 2014-02-14.
2014-02-14 version 2.3.1
- Merged pull request https://github.com/adamsb6/s3_file/pull/22. This fixes an issue in which an :immediately arg to notify would trigger the notified resource before file permissions had been set.
Foodcritic Metric
2.5.3 failed this metric
FC011: Missing README in markdown format: /tmp/cook/37777796ee7d96548e962262/s3_file/README.md:1
FC012: Use Markdown for README rather than RDoc: /tmp/cook/37777796ee7d96548e962262/s3_file/README.rdoc:1