I recently added file upload support to an Ember application that talks to a Phoenix API server using the JSONAPI protocol. I wanted a solution that integrated well with Ember and ember-data, supported direct uploads to S3 from the client, included drag-and-drop support for choosing files, and had the ability to show upload progress. After experimenting with a number of options I decided to use the ember-file-upload plugin.

In this post I’ll show how to build the Phoenix API server call used by ember-file-upload to fetch pre-signed S3 upload credentials. We’ll use a Hex package I wrote called s3_direct_upload to help with this task.
Adding file uploads to your Ember app
Following along with the recipe in the ember-file-upload README for S3 direct uploads, we add the library to package.json and npm install it, then we create a bucket and set its CORS configuration as described.

In our template we use the file-dropzone and file-upload components provided by ember-file-upload:
{{#file-dropzone name="file-upload" as |dropzone queue|}}
  {{#if dropzone.active}}
    {{#if dropzone.valid}}
      Upload files.
    {{else}}
      Invalid type.
    {{/if}}
  {{else if queue.files.length}}
    Uploading {{queue.files.length}} files. ({{queue.progress}}%)
  {{else}}
    {{#if dropzone.supported}}
      Drop files.
    {{/if}}
    {{#file-upload name="file-upload"
                   multiple=true
                   onfileadd=(action "uploadFile")}}
    {{/file-upload}}
  {{/if}}
{{/file-dropzone}}
Then we add an uploadFile action that performs three tasks:
- It fetches pre-signed direct upload credentials from the API server.
- It performs the file upload to S3 using those credentials.
- It returns the S3 response, including a Location header containing the S3 URL for the uploaded file.
When fetching the pre-signed credentials we need to set the JSONAPI Accept and Content-Type headers and provide an authorization token with the help of our session service. The file name and MIME type are sent along as parameters, but you can add any additional parameters your API needs. For example, if you wanted the uploaded file to be placed in the S3 bucket under a parent-scoped path like parent/:parent_id/uploads/:filename, you could add parent_id as a parameter and use it in your Phoenix controller action to build the path (there’s a sketch of that controller variant at the end of this post).
// assumes `import Ember from 'ember';` and `import RSVP from 'rsvp';` at the top of the file
actions: {
  uploadFile: function (file) {
    let auth = this.get('session.data.authenticated');
    let ajaxRequest = {
      url: '/api/v1/file_upload_presigned',
      type: 'GET',
      headers: {
        'Authorization': `Token token="${auth.token}", email="${auth.email}"`,
        'Accept': 'application/vnd.api+json',
        'Content-Type': 'application/vnd.api+json'
      },
      data: {filename: file.get("name"), mimetype: file.get("type")}
    };

    RSVP.cast(Ember.$.ajax(ajaxRequest)).then(function (response) {
      // upload the file directly to S3 using the pre-signed credentials
      return file.upload(response.url, {
        data: response.credentials
      });
    }).then(function (response) {
      // handle the S3 upload response
      // this could be as simple as updating or creating an ember model
      // response.headers.Location contains the uploaded file's S3 URL
    });
  }
}
Generating pre-signed upload credentials in Phoenix
Now we just need to add the /api/v1/file_upload_presigned route to our Phoenix API server. We’ll use the s3_direct_upload library to help with this. To pull in the library, add {:s3_direct_upload, "~> 0.1.1"} to your application dependencies in mix.exs and run mix deps.get.
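In mix.exs that looks something like this:

# mix.exs
defp deps do
  [
    # ... existing dependencies ...
    {:s3_direct_upload, "~> 0.1.1"}
  ]
end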
In order to use the s3_direct_upload
module we need to configure the
AWS access and secret keys as well as the S3 bucket name. We’ll use
environment variables to supply this configuration, so in
config/config.exs
add something like this:
config :s3_direct_upload,
  aws_access_key: System.get_env("AWS_ACCESS_KEY_ID"),
  aws_secret_key: System.get_env("AWS_SECRET_ACCESS_KEY"),
  aws_s3_bucket: System.get_env("AWS_S3_BUCKET")
Next we need to add a route for the pre-signed credentials in web/router.ex. The api pipeline sets up JSONAPI content type handling and JSONAPI deserialization; our app uses the ja_serializer library to help with this. The pipeline also authenticates the request using the token in the Authorization header.
scope "/api/v1", Api do
pipe_through :api
get "/file_upload_presigned", FileUploadController, :presigned
end
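For reference, our :api pipeline looks roughly like this. The content negotiation and deserialization plugs come from ja_serializer, while the Api.Auth plug is just a placeholder for whatever token authentication your app uses:

pipeline :api do
  plug :accepts, ["json-api"]               # assumes the json-api MIME type is registered in config
  plug JaSerializer.ContentTypeNegotiation  # handles the JSONAPI content type headers
  plug JaSerializer.Deserializer            # converts incoming JSONAPI params to underscored keys
  plug Api.Auth                             # placeholder: verifies the token in the Authorization header
end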
Finally we add the Phoenix controller and action. In this example we generate a path consisting of a simple UUID, but you can build whatever makes sense for your application. With this implementation, the uploaded file URLs will look like:
https://bucket.s3.amazonaws.com/7e087127-4891-4735-b1e5-e56faa2f0cf2/file.ext
defmodule Api.FileUploadController do
  use Api.Web, :controller

  def presigned(conn, %{"filename" => file_name, "mimetype" => mimetype}) do
    uuid = Ecto.UUID.generate()
    path = "#{uuid}"
    upload = %S3DirectUpload{file_name: file_name, mimetype: mimetype, path: path}
    send_resp(conn, 200, S3DirectUpload.presigned_json(upload))
  end
end
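And if you wanted the parent-scoped path mentioned earlier, a variant of the action could build the path from a parent_id parameter. This is just a sketch, assuming your Ember uploadFile action sends parent_id along with the file name and MIME type:

# sketch: assumes the Ember action also sends a parent_id parameter
def presigned(conn, %{"filename" => file_name, "mimetype" => mimetype, "parent_id" => parent_id}) do
  path = "parent/#{parent_id}/uploads"
  upload = %S3DirectUpload{file_name: file_name, mimetype: mimetype, path: path}
  send_resp(conn, 200, S3DirectUpload.presigned_json(upload))
end

The path and the file name together make up the S3 object key, which is why the example URL above has the UUID segment in front of the file name.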
Simple file uploads in Ember and Phoenix
Using the ember-file-upload
and s3_direct_upload
libraries makes
it simple to add S3 direct uploads to your Ember application and
Phoenix API server. Not only is it easy, but the resulting file upload
interface supports drag and drop, handling of single or multiple file
uploads, filtering by MIME type and file extension, and upload
progress feedback. I’m really happy with the result.