
Client-side S3 File Upload With jQuery File Upload


In my previous career I dealt with a ton of different digital photo upload solutions, so for the second project mode at Flatiron School I was interested in building a project that would teach me the ins and outs of file IO for a web application.

For inspiration we looked to a site our instructor uses to quickly share links with the class, ShoutKey. The basic premise: if you want to quickly share a handful of files with a group within hearing distance, you can generate a URL with a simple slug (a short, human-readable, one-word URL path). This would give us the opportunity to deal with getting user file input, sending those files to Amazon Web Services S3 storage, and then retrieving the files on user request, both as single files and as multi-file zip archives.
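ShoutKey's actual implementation isn't shown here, but as a toy illustration of the slug idea in plain Ruby (the word list and names are invented for this sketch, not taken from either project):

```ruby
# Toy slug generator (illustrative only; not ShoutKey's or our project's code).
# Picks a short, human-readable word and appends digits on collision.
WORDS = %w[apple birch cloud delta ember frost grove haven].freeze

def generate_slug(taken)
  candidate = WORDS.sample
  candidate = "#{WORDS.sample}#{rand(10..99)}" while taken.include?(candidate)
  candidate
end

puts generate_slug(['apple'])  # e.g. "frost"
```

A real implementation would also need to expire slugs and persist the slug-to-resource mapping, but the lookup idea is the same.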

As our app would be hosted on Heroku, I wanted a solution that fit that service, so I planned to execute uploads from the client side to avoid any problems with Heroku dynos resetting mid-upload.

jQuery File Upload presented a way for us to perform these client-side uploads for our Ruby on Rails project.

Heroku's own guide on this subject was helpful to a point; unfortunately, it doesn't mention the small matter of file validation, which is relatively important unless you're interested in having your file storage turned into a Bitcoin miner. Still, their guide is a great place to get started on the details I'll fill in here.

Key gems to include:

I like Figaro for defining my secret key values, since it takes just one command line task to push the production keys to Heroku.
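The gem list itself didn't survive into this post; based on the mentions here and the Heroku guide, the Gemfile additions probably looked something like this sketch (the choices beyond Figaro are my assumptions):

```ruby
# Gemfile (sketch; aws-sdk and jquery-fileupload-rails are assumed choices)
gem 'aws-sdk'                  # S3 client; provides presigned_post
gem 'figaro'                   # keeps AWS keys in config/application.yml, out of git
gem 'jquery-fileupload-rails'  # bundles the jQuery File Upload plugin
```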

First, in the controller we pre-sign our AWS post call.

As part of setting up S3, you also need to set the target bucket's CORS configuration so that the bucket will permit files being sent from your host as an origin. In our application we created a separate S3 bucket that allowed uploads from localhost:3000, and defined these separate buckets in the application.yml used by Figaro.
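For reference, a CORS configuration along these lines is what the development bucket needs (a sketch; your allowed origins, methods, and headers may differ, and the production bucket would list your Heroku domain instead):

```xml
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>http://localhost:3000</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```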

```ruby
class EnvelopesController < ApplicationController
  before_action :set_s3_direct_post, only: [:show, :edit]

  private

  def set_s3_direct_post
    @s3_direct_post = S3_BUCKET.presigned_post(key: "uploads/#{SecureRandom.uuid}/${filename}",
                                               success_action_status: '201',
                                               acl: 'public-read')
  end
end
```
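S3_BUCKET isn't defined in that snippet; following the Heroku guide's pattern, it would come from an initializer along these lines (the file name, region, and ENV key names are assumptions):

```ruby
# config/initializers/aws.rb (sketch, per the Heroku direct-upload guide's pattern)
Aws.config.update(
  region: 'us-east-1',
  credentials: Aws::Credentials.new(ENV['AWS_ACCESS_KEY_ID'], ENV['AWS_SECRET_ACCESS_KEY'])
)

S3_BUCKET = Aws::S3::Resource.new.bucket(ENV['S3_BUCKET'])
```

With Figaro, those ENV values come from application.yml in development and from Heroku config vars in production.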

The instance variable @s3_direct_post is passed to the view for use in our form, and the key: "uploads/#{SecureRandom.uuid}/${filename}" defines the path inside our S3 bucket where these files will live (S3 substitutes ${filename} with the uploaded file's name). This pre-signed POST lets all of our users authenticate their uploads to my AWS account using my AWS keys, without those keys ever being sent to the client.

We send our pre-signed POST into our form as a hash, and then our jQuery File Upload function will use these values to send the file to Amazon.

```erb
<%= form_for(@envelope, html: { class: 'directUpload',
                                data: { 'form-data' => @s3_direct_post.fields,
                                        'url' => @s3_direct_post.url,
                                        'host' => URI.parse(@s3_direct_post.url).host } }) do |f| %>
  <%= f.label :parchment_url %>
  <%= f.file_field :parchment_url %>
  <%= f.submit %>
<% end %>
```

The below jQuery File Upload function sends an AJAX call to AWS based on the pre-signed post and the object a user submits for upload.

The user drags one or more files onto a landing pad, which initiates an immediate AJAX upload to S3. jQuery File Upload has a callback function, add, which is invoked as soon as files are added to the form's file field. With autoUpload: true set, it behaves as a way to iterate over the files, so you can carry out validations or additional processing even though uploads begin immediately.

```javascript
fileInput.fileupload({
  fileInput: fileInput,
  url: form.data('url'),
  type: 'POST',
  autoUpload: true,
  add: function(e, data) { // this is the money maker
    var types = /(\.|\/)(gif|jpe?g|png|bmp)$/i;
    var file = data.files[0];
    if (types.test(file.type) || types.test(file.name)) {
      data.submit();
    } else {
      alert(file.name + " must be a GIF, JPEG, BMP or PNG file");
    }
  },
  formData: form.data('form-data'),
  paramName: 'file',
  dataType: 'XML',
  replaceFileInput: false
});
```

When the user clicks the "Download all" button, it initiates a GET request to a Downloads controller, which runs a Zipper service to zip all the requested files into an archive, saved as a blob in PostgreSQL and associated with an envelope. The zip file's path is then handed to send_file as the return of the request, which starts a download dialog in the browser.
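The Zipper service itself isn't shown; a minimal sketch of the idea using the rubyzip gem (the gem choice and all names here are my assumptions, not the project's code) might look like:

```ruby
# Sketch of a Zipper service (assumes the rubyzip gem; the post doesn't name its zip library)
require 'zip'

class Zipper
  # Writes the given files into a zip archive and returns the archive's path.
  def self.zip(file_paths, archive_path)
    Zip::File.open(archive_path, Zip::File::CREATE) do |zipfile|
      file_paths.each do |path|
        zipfile.add(File.basename(path), path)
      end
    end
    archive_path
  end
end
```

The controller action would then pass the returned path to send_file, leaving streaming and content-disposition headers to Rails.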

A further check against file type can be carried out by implementing S3 bucket policies on the AWS side. I'll expand on that in a future blog post.