Hosting a static website on an AWS S3 bucket using GitHub Actions

Photo by Jess Bailey on Unsplash

By Busayo Akanni

This is a tutorial on how to host a static website on an AWS S3 bucket using GitHub Actions as our CI/CD tool of choice. By the time you're done, you should be able to host web pages for free on S3 and run simple continuous integration tasks right from your GitHub repository.

Things you need for this tutorial

  • A free tier Amazon account
  • A Github account
  • A simple web app. Here is a link to the repository of the simple portfolio website I made by following a YouTube tutorial - [Busayo's Portfolio Repository](github.com/Busayo/Portfolio-HTML-CSS) - and here is the YouTube tutorial itself in case you want to build yours from scratch - Youtube Video. For the purposes of this demo, though, I'll be using a demo website a friend built on GitHub. Here is a link to the repository so you can clone it to your local system, and also fork it so you can modify it from your own account - demo repository

Okay, I'll start with brief, high-level introductions to the different tools we'll make use of.

AWS S3 Bucket

This is Amazon's cloud object storage service, known for its availability, scalability, security and performance. It's easy to use and cost-effective. You can read more about it here - S3 Buckets

GitHub Actions

GitHub Actions lets you automate the entire software development lifecycle right from your GitHub repository. Here, we'll be using it for automated continuous integration and continuous deployment. You can read more about it here - GitHub Actions

Okay, so after making my portfolio website from scratch, I wanted to deploy it to the cloud the DevOps way, so I started looking into tutorials to make that happen. I followed several to finally get what I have here - Link to Portfolio

But because I used several resources to accomplish what I have right now, it was difficult to explain to other junior DevOps engineers how to do the same. Imagine being bombarded with several links; it could be discouraging. That's when the idea for this article came up.

First Steps

Fork the demo repository found here - this is what I saw when I did that.

Screenshot 2022-04-18 at 16.59.44.png

Next, I wanted to host it on S3 for a start, so I went into my AWS account to create an S3 bucket. (I used an IAM user account with permissions to perform actions on S3. I'd advise you not to use a root account for this, as that's not best practice. You can use this link to learn how to get it done - Creating IAM Accounts.)

This is what the S3 dashboard looks like on AWS.

Screenshot 2022-04-18 at 16.21.46.png

Then we'll click on Create bucket, enter the bucket name, choose the region you want it hosted in, leave everything else at the defaults and click Create bucket. As seen below, I've named my bucket 'demo-web-app-hashnode' (note: bucket names have to be globally unique) and chosen us-east-1 as my region.

Screenshot 2022-04-18 at 16.41.04.png

Here is the bucket we just created. You can click on it and go into it.

Screenshot 2022-04-18 at 16.43.42.png

To upload objects into an S3 bucket, you either drag and drop them or click the Upload button. I'm assuming you've already cloned the repository to a known location on your system, so click Upload, then Add files, and add the files inside the repository. Then click Add folders and add the folders inside the repository. You should have this when you're done - Screenshot 2022-04-18 at 17.10.08.png

Click on Upload

image.png

Now, under the Properties tab of our demo-web-app-hashnode bucket, scroll to the bottom to 'Static website hosting' and click Edit. When you click Enable, you get a set of options with 'Host a static website' already selected. For Index document, input 'index.html'; you can leave the Error document field blank. You should have this: Screenshot 2022-04-18 at 17.16.45.png
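As an aside, the console steps so far - creating the bucket, uploading the site, and enabling static hosting - can also be sketched with the AWS CLI. This assumes you have AWS CLI v2 installed and configured with credentials for the same IAM user; it's an alternative sketch, not part of the console flow above:

```shell
# Create the bucket in us-east-1 (bucket names must be globally unique)
aws s3api create-bucket --bucket demo-web-app-hashnode --region us-east-1

# Upload the cloned site; run this from inside the repository folder
aws s3 sync . s3://demo-web-app-hashnode --exclude ".git/*"

# Enable static website hosting with index.html as the index document
aws s3 website s3://demo-web-app-hashnode/ --index-document index.html
```

Either route gets you to the same place; the console just makes each step more visible.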

Now, inside the Permissions tab, scroll to 'Block public access (bucket settings)', click Edit, uncheck the Block all public access checkbox, then click Save changes. You'll be asked to confirm; do so and proceed. Screenshot 2022-04-18 at 17.20.28.png

Still under the Permissions tab, we need to edit the Bucket policy, so click the Edit button in that section. You should have this: Screenshot 2022-04-18 at 17.27.41.png Click the Policy generator button; you'll be redirected to a different page where you can make the following changes.

For Type of Policy, select S3 Bucket Policy.

For Effect, check Allow

For Principal, input *

AWS Service is probably greyed out already

For Actions, scroll down and select GetObject and GetObjectVersion

For ARN, copy it from the Bucket policy section under the Permissions tab of the bucket (check the image above); after pasting it, append '/*' to the end.

Now click on Add Statement and you should have this:

Screenshot 2022-04-18 at 18.07.52.png

Click on Generate Policy and copy the result. Replace everything in the policy editor with the statement you generated, like so:

  "Id": "Policy1650301677329",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1650301647058",
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::demo-web-app-hashnode/*",
      "Principal": "*"
    }
  ]
}
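If you'd like to sanity-check the policy before pasting it (or apply it from the terminal instead of the console), here is a quick sketch. The bucket name matches the tutorial's, the validation only needs python3, and the final aws step assumes a configured AWS CLI - it prints a note instead of failing when the CLI isn't available:

```shell
# Save the generated policy to a local file
cat > bucket-policy.json <<'EOF'
{
  "Id": "Policy1650301677329",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1650301647058",
      "Action": ["s3:GetObject", "s3:GetObjectVersion"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::demo-web-app-hashnode/*",
      "Principal": "*"
    }
  ]
}
EOF

# Confirm the JSON is well-formed before pasting it into the console editor
python3 -m json.tool bucket-policy.json > /dev/null && echo "policy is valid JSON"

# Optionally apply it with the CLI instead of the console (needs credentials)
if command -v aws > /dev/null 2>&1; then
  aws s3api put-bucket-policy --bucket demo-web-app-hashnode \
    --policy file://bucket-policy.json \
    || echo "put-bucket-policy failed; check your credentials"
else
  echo "aws CLI not found; paste the policy into the console editor instead"
fi
```

A malformed policy is the most common reason the console editor rejects the save, so the json.tool check catches typos early.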

Go back to Properties, scroll down and copy the Bucket website endpoint link; it should open the web application now.

This is mine: Weather App

Using GitHub Actions to create a continuous integration pipeline workflow

Via the terminal, navigate to the folder that contains the code repository we're working with. To check that we're working with the most up-to-date version of the code, run:

git remote update
git status -uno

My working branch is up to date. So I'll follow the steps here to start the sync:

S3 sync

Still in the terminal, I'll run the following lines one at a time:

mkdir .github
mkdir .github/workflows
touch .github/workflows/main.yml
cd .github/workflows

So we've created an empty main.yml file. I'll open it from the terminal by running

nano main.yml

The file is now open in the terminal, so we'll copy this part of the link I posted above and paste it into the file as seen below (note that I include --exclude '.git*/*' to prevent my source code history from being exposed):

name: Upload Website

on:
  push:
    branches:
    - master

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@master
    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --acl public-read --follow-symlinks --delete --exclude '.git*/*'
      env:
        AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

Do CTRL+X, then press Y and Enter to save your changes and exit the nano editor.

On the terminal, run git status to see the untracked changes, then

git add -A

then

git commit -a

Your default editor (vim here) will open with the commit message template; press i (for insert) to make changes and uncomment these lines:

Changes to be committed:
        new file:   main.yml

Press the ESC key, then type :wq and press Enter to save and exit. You should get something like this:

Screenshot 2022-04-18 at 21.57.08.png

Now, go back to AWS, head to IAM and create a new user; you can call the user S3-Github-Sync or anything you want.

Give the user programmatic access. Under Set permissions, select Attach existing policies directly, search for AmazonS3FullAccess and grant that permission, then create the user. Download the key file, which contains the accessKeyID and secretAccessKey.
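If you prefer the terminal, the same user can be sketched with the AWS CLI (this assumes a configured CLI with IAM permissions; the user name matches the one above):

```shell
# Create the deploy user
aws iam create-user --user-name S3-Github-Sync

# Attach the managed S3 full-access policy
aws iam attach-user-policy \
  --user-name S3-Github-Sync \
  --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# Generate the access key pair; note the AccessKeyId and
# SecretAccessKey in the output - you'll need them as GitHub secrets
aws iam create-access-key --user-name S3-Github-Sync
```

AmazonS3FullAccess is broader than the workflow strictly needs; scoping a custom policy to the one bucket would be tighter, but full access keeps this demo simple.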

In GitHub, in the repository we're working with, go to Settings, scroll to Secrets, click Actions and then New repository secret, and add the secret for the bucket name as seen below. Screenshot 2022-04-18 at 22.11.28.png

Add secrets for the AWS Access Key ID and AWS Secret Access Key as well.
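If you have the GitHub CLI installed and authenticated, the same three secrets can also be added from the terminal (a sketch; run it inside the repository, and replace the placeholder values with your own):

```shell
# Secret names must match the ones referenced in main.yml
gh secret set AWS_S3_BUCKET --body "demo-web-app-hashnode"
gh secret set AWS_ACCESS_KEY_ID --body "<accessKeyID from the key file>"
gh secret set AWS_SECRET_ACCESS_KEY --body "<secretAccessKey from the key file>"
```

Either way, the names have to match the ${{ secrets.* }} references in the workflow exactly.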

You should have something like this when you're through:

Screenshot 2022-04-18 at 22.14.34.png

Back in the terminal, navigate back to the main directory. For me, that was:

cd ../..

Now run git remote to confirm your remote, then git push origin main.

As an aside, I encountered a weird 403 error while trying to push my changes; apparently it was a configuration error that got fixed when I set my username in the remote URL:

git remote set-url origin https://yourusername@github.com/user/repo.git

I then encountered another error (my build failed): Screenshot 2022-04-18 at 22.51.37.png It seems to be an Object access control list error; apparently AWS changed some settings earlier this year. I did some digging on Stack Overflow and found this link:

StackOverflow

I went to my bucket, then the Permissions tab, and under Object Ownership, I enabled ACLs.

Then I went back to GitHub, and under Actions, I reran my job and voila! Screenshot 2022-04-18 at 22.54.17.png

My working pipeline. Now I'm playing around with it, testing things out, merging and watching the site change in real time.

I noticed one more thing when my changes weren't syncing to the website:

In the main.yml file, change the branch from master to main; that fixes it.

name: Upload Website

on:
  push:
    branches:
    - main

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@master
    - uses: jakejarvis/s3-sync-action@master
      with:
        args: --acl public-read --follow-symlinks --delete --exclude '.git*/*'
      env:
        AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
        AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
        AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

And now, all my workflows run perfectly when I make changes. Screenshot 2022-04-18 at 23.10.34.png

Resources that helped

Thank you so much for reading. Let me know in the comments below if anything isn't clear enough, and let me know if you'd like me to cover something else.