Using Amazon S3 to Store Wagtail Media Files
Amazon S3 is a great resource for handling your site’s media files. While Whitenoise is a convenient solution for static files, it isn’t meant for user-uploaded content. Amazon S3 provides a solution that is just as convenient for your media files, and it can offer some performance advantages over other options.
In this post we will walk through creating a bucket in Amazon S3, setting up an IAM user, and getting Wagtail to upload and serve all user-uploaded content to/from Amazon S3.
If you use this article in conjunction with Deploying Your Wagtail Site on Heroku, you can have a fully functioning, web-facing deployment of Wagtail without a penny spent. (Of course you’ll need to move to paid versions once you’re ready to handle some traffic.)
Before we start, I'm making the following assumptions:
- You have an Amazon S3 account set up.
- You have a working deployment of your Wagtail site on a production platform.
- You are using Whitenoise to handle your site's static files.
Setting up your IAM User
You can link up your Wagtail site with root access to your S3 account, but Amazon advises against it. The recommended route is to use an IAM user. This keeps access limited to specific features, reducing security risks. It also makes it easy to hand off bucket access if someone else takes over your project. Setting up an IAM user will very likely save you from headaches down the road.
First, we’ll navigate to the IAM Management Console and create a user. When you reach step 4, “Complete”, you’ll see your access key and secret access key starred out. Go ahead and download the .csv at this point and put it somewhere you'll remember. You won’t be able to retrieve the secret key later, so if you don't hang on to these credentials you'll be forced to create new ones.
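If you'd rather script this step, boto3 (which we install later in this article) can create the user and its credentials. This is just a sketch, and the user name is illustrative:

import boto3

# Requires existing AWS credentials with IAM permissions.
# The user name here is illustrative; use whatever you chose.
iam = boto3.client('iam')
iam.create_user(UserName='wagtail-media-user')
key = iam.create_access_key(UserName='wagtail-media-user')['AccessKey']

# Store these somewhere safe; the secret can't be retrieved again.
print(key['AccessKeyId'], key['SecretAccessKey'])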
Setting up your S3 Bucket
Now we’ll navigate over to the S3 Management Console and create a bucket.
Unless you know what you’re doing, keep the region that Amazon provides by default when you create your bucket. Changing the region of your bucket will require additional setup procedures that are not covered in this article.
Once your bucket is created, select the “Properties” button, then click the “Permissions” dropdown. Open “Edit bucket policy” and enter the policy below:
{ "Version": "2008-10-17", "Statement": [ { "Sid": "PublicReadForGetBucketObjects", "Effect": "Allow", "Principal": { "AWS": "*" }, "Action": "s3:GetObject", "Resource": "arn:aws:s3:::BUCKET-NAME/*" }, { "Effect": "Allow", "Principal": { "AWS": "USER-ARN" }, "Action": "s3:*", "Resource": [ "arn:aws:s3:::BUCKET-NAME", "arn:aws:s3:::BUCKET-NAME/*" ] } ] }
Now check your CORS configuration; we want it to read:
<CORSConfiguration>
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
        <AllowedHeader>Authorization</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
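If you'd rather apply this rule programmatically, boto3's put_bucket_cors sets the same configuration. A sketch, with a placeholder bucket name:

import boto3

# The same CORS rule as above, applied via the API.
s3 = boto3.client('s3')
s3.put_bucket_cors(
    Bucket='your-bucket-name',
    CORSConfiguration={
        'CORSRules': [{
            'AllowedOrigins': ['*'],
            'AllowedMethods': ['GET'],
            'AllowedHeaders': ['Authorization'],
            'MaxAgeSeconds': 3000,
        }]
    },
)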
Your S3 bucket is now ready to roll!
Preparing Wagtail for S3 Media Storage
Next we'll need to install a couple of packages in your Wagtail project: django-storages, a collection of custom storage backends for Django, and boto3, the SDK for Amazon Web Services. Jump into your Wagtail project's virtual environment and run the following commands:
pip install django-storages
pip install boto3
pip freeze > requirements.txt
Next, head over to the base.py file in your project's settings directory and add 'storages' to your installed apps:
INSTALLED_APPS = [
    ...
    'storages',
    ...
]
While we're in base.py, let's specify the bucket to use and add your IAM user's access credentials. Add the following:
AWS_STORAGE_BUCKET_NAME = '####'
AWS_ACCESS_KEY_ID = '####'
AWS_SECRET_ACCESS_KEY = '####'
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME
Don't forget to replace the hashes with your bucket name and IAM access credentials. If you're setting these values with environment variables, don't include the surrounding quotes.
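For example, here's a minimal sketch that reads them from the environment (the variable names are illustrative; match them to whatever your platform exposes):

import os

# Assumes you've exported variables with these names in production.
AWS_STORAGE_BUCKET_NAME = os.environ['AWS_STORAGE_BUCKET_NAME']
AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_S3_CUSTOM_DOMAIN = '%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME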
This article is meant to get you up and running quickly with Whitenoise handling your static files and Amazon S3 handling your media files. It is possible to have Amazon S3 handle both, but that requires writing some custom storage classes to keep static and media files from being stored on top of one another; a sketch of the usual pattern follows below. If you are looking to handle both static and media files on S3, there's a great article by Dan Poirier at Caktus Group that can shed some light on the process. His article utilizes boto3's predecessor, boto, but it should still get you moving in the right direction.
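For reference, the separation is usually done with subclasses of S3Boto3Storage that write to different prefixes in the bucket. A sketch only; the class names and location prefixes are illustrative:

from storages.backends.s3boto3 import S3Boto3Storage

class StaticStorage(S3Boto3Storage):
    # Keep static files under their own prefix in the bucket.
    location = 'static'

class MediaStorage(S3Boto3Storage):
    # Keep user uploads separate from static files.
    location = 'media'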
For us, it's easy mode. All that's left is dropping the following into production.py:
MEDIA_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'
Now give your project a push up to your production platform and voilà! Your media files should now be uploading straight to your S3 bucket from the Wagtail admin area. Go give it a try!
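If you want to confirm that uploads are landing in the bucket, a quick boto3 check works (the bucket name is a placeholder, and this assumes your IAM credentials are configured locally):

import boto3

# List everything currently in the bucket.
s3 = boto3.client('s3')
response = s3.list_objects_v2(Bucket='your-bucket-name')
for obj in response.get('Contents', []):
    print(obj['Key'])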