Building Hugo Sites Automatically Using AWS CodeBuild

Before AWS CodeBuild and CodePipeline were released, I ran a small T2.micro instance with Jenkins on it. This became a pain to patch and update, and the recent 0-day was the proverbial straw that broke the camel's back. In this post I will detail how I moved to the fully managed tools so I no longer have to care about any of this.

Getting ready for the build

The first thing we need is a set of instructions for building the Hugo site. Since the build server starts clean every time, this includes downloading Hugo and all the dependencies we require. One of the options CodeBuild has for specifying the build instructions is a buildspec.yaml file. I chose this option since it seemed simple and straightforward.

version: 0.1

phases:
    install:
        commands:
            - pip install Pygments
            - wget https://github.com/gohugoio/hugo/releases/download/v0.37.1/hugo_0.37.1_Linux-64bit.deb
            - dpkg -i hugo_0.37.1_Linux-64bit.deb
    build:
        commands:
            - hugo
            - echo "******** Uploading to S3 ********"
            - aws s3 sync public/ s3://blog.tryfinally.co.za/ --region us-east-1
            - echo "******** Done! ********"

    post_build:
        commands:
            - echo Build completed on `date`
            - aws sns publish --topic-arn $SNS_TOPIC_ARN --subject 'AWS CodeBuild - Build Completed' --message 'The build has completed. For build details, go to https://console.aws.amazon.com/cloudwatch/home?region=us-east-1#logStream:group=/aws/codebuild/Blog in Amazon CloudWatch Logs.'
My buildspec file

Once CodeBuild supports an Ubuntu 16.04 base image we can do a normal apt-get install of Hugo, but for now I fetch the Hugo package manually.
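Since the Hugo version ends up hard-coded in two of the install commands, one option is to pull it out into a build-time variable. The sketch below assumes buildspec version 0.1's environment_variables syntax; the variable name HUGO_VERSION is my own choice, not something CodeBuild requires:

```yaml
version: 0.1

environment_variables:
    plaintext:
        HUGO_VERSION: "0.37.1"

phases:
    install:
        commands:
            # The commands run in a shell, so ${HUGO_VERSION} expands normally
            - wget https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_${HUGO_VERSION}_Linux-64bit.deb
            - dpkg -i hugo_${HUGO_VERSION}_Linux-64bit.deb
```

With this in place, bumping to a newer Hugo release only means changing the version string in one spot.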

Creating a build project

With that sorted, we can create the CodeBuild project like this:

Build project setup step 1

Build project setup step 2

Build project setup review

Now we save and start the build.

Build started

We need to do one more thing to make the upload to S3 work during the build: grant the build role permission to upload to S3. Go to the IAM console and attach the AmazonS3FullAccess managed policy to the role. If you are using S3 for other things, you can instead scope the policy to allow access to your blog bucket only.
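For reference, a scoped policy could look something like the sketch below. This is an assumption on my part about the minimum `aws s3 sync` needs (list the bucket, plus read/write/delete on its objects); the bucket name is the one from the buildspec above:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::blog.tryfinally.co.za"
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::blog.tryfinally.co.za/*"
        }
    ]
}
```

Note that object-level actions go on the `/*` resource while ListBucket goes on the bucket ARN itself; mixing those up is a common cause of mysterious AccessDenied errors with sync.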

Giving IAM role correct permissions

Going back to the build we can see that it completed successfully!

Build complete

Next we need to set up a CodePipeline to trigger this build whenever we check in a change.

Creating the CI pipeline

Creating pipeline step 1

Creating pipeline step 2

I am using GitHub as my repository, so I need to set up the connection to GitHub and grant permissions, but after that it is pretty simple.

Creating pipeline step 3

Creating pipeline step 4

Creating pipeline step 5

Creating pipeline step 6

Now any change should trigger a build and an upload of the result to the S3 bucket.

Pipeline created

And here we have the proof of the pudding:

Pipeline built through

I hope you will have as much fun creating this simple setup as I did. Let me know if you find any additional tips or tricks to make this process simpler.
