Painless CI/CD With Hugo and Lambda

I’ve built my blog with Hugo - and I’ve been loving it so far!

Hugo is a static website generator written in Go; it’s blazing fast and super easy to use.

I’ve put the source of my blog on GitHub. Normally you would need to download Hugo, run it locally, and then sync the output to a web server.

I’ve written a Lambda function that does this for you. It will:

  • monitor your GitHub repo for commits (using GitHub’s awesome Webhooks)
  • download the latest version of your blog
  • run Hugo against it
  • sync the output to an S3 bucket

This lets me write blog posts anywhere I can run Git, and the Git commit history gives me a full revision history for free.
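The core of the workflow above can be sketched in a few lines of Python. This is a simplified illustration of the idea, not the actual code from the Lambda function; the event shape matches the SNS test event used later in this post, and the helper names in the comments are hypothetical.

```python
import json


def repo_from_sns_event(event):
    """Extract the GitHub repository details from an SNS-triggered Lambda event.

    GitHub publishes a JSON payload to SNS; depending on how it arrives,
    the Message field may be a dict or a JSON-encoded string, so handle both.
    """
    message = event["Records"][0]["Sns"]["Message"]
    if isinstance(message, str):
        message = json.loads(message)
    repo = message["repository"]
    return repo["name"], repo["html_url"]


# The rest of the pipeline (illustrative only):
#   1. download {html_url}/archive/master.zip and unpack it
#   2. run `hugo` against the unpacked source
#   3. `aws s3 sync public/ s3://{repo_name}` - works because the bucket
#      shares its name with the repo (see Prerequisites below)
```

With the SNS test event from the Testing section, this returns the repository name and its GitHub URL, which is everything the sync step needs.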

Prerequisites

Optional

  • If you’d like this to work out of the box with minimal configuration, I recommend naming your GitHub repo the same as your S3 bucket (which should match your blog’s URL)

SNS Configuration

In order for GitHub to send you a notification when there is a commit, you will need to set up SNS.

  • Create an SNS Topic
  • Set the Display name to ‘GitHub’
  • Optional: Add your e-mail address if you want to receive a notification when there is a commit
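If you prefer the command line, the same steps can be sketched with the AWS CLI. The topic name and e-mail address below are placeholders - substitute your own:

```shell
# Create the topic and capture its ARN for later steps
TOPIC_ARN=$(aws sns create-topic --name github-blog-commits \
  --query TopicArn --output text)

# Set the display name shown in notifications
aws sns set-topic-attributes --topic-arn "$TOPIC_ARN" \
  --attribute-name DisplayName --attribute-value GitHub

# Optional: subscribe your e-mail address to the topic
aws sns subscribe --topic-arn "$TOPIC_ARN" \
  --protocol email --notification-endpoint you@example.com
```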

Create an API Key

GitHub will need credentials to publish to your SNS topic.

  • Create a new IAM user
  • Add the AmazonSNSRole policy to your user
  • Save the secret access key somewhere safe (you’ll need this next)
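If you’d rather grant the minimum instead of a managed policy, the only permission GitHub actually needs is sns:Publish on your topic. A scoped-down policy would look something like this (the region, account ID, and topic name are placeholders):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sns:Publish",
            "Resource": "arn:aws:sns:REGION:ACCOUNT_ID:TOPIC_NAME"
        }
    ]
}
```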

GitHub Webhook

From GitHub’s documentation:

To set up a repository webhook on GitHub, head over to the Settings page of your repository, and click on Webhooks & services. After that, click on Add webhook.

You’ll need to provide the IAM credentials that you previously created.

Create an IAM role

Your Lambda function will need permission to sync to S3. Head over to the IAM Console,

create a role called s3_blog_upload, and attach a policy similar to the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:DeleteObject",
                "s3:DeleteObjectVersion"
            ],
            "Resource": "arn:aws:s3:::BUCKETNAME/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogStreams"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        }
    ]
}

Note: the logs actions are optional. Include them if you want the output of the Lambda function written to CloudWatch Logs.

Lambda

Download the code from michaelmcallister/hugo-s3-lambda-sync

  • Create a new Lambda function named BlogSync
  • Select your GitHub SNS topic as the trigger
  • Under ‘Lambda function code’, select the code entry type ‘Upload a .ZIP file’
  • Select the master.zip file you just downloaded
  • Select the role you just created (s3_blog_upload)
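The steps above can also be sketched with the AWS CLI. The runtime and handler values below are assumptions, not taken from the repository - check its README for the actual values, and substitute your own account ID and topic ARN:

```shell
# Runtime and handler are assumptions; check the repository for the real values
aws lambda create-function \
  --function-name BlogSync \
  --runtime python2.7 \
  --handler lambda_function.lambda_handler \
  --role "arn:aws:iam::ACCOUNT_ID:role/s3_blog_upload" \
  --zip-file fileb://master.zip

# Subscribe the function to the GitHub SNS topic created earlier
aws sns subscribe --topic-arn "$TOPIC_ARN" \
  --protocol lambda \
  --notification-endpoint "arn:aws:lambda:REGION:ACCOUNT_ID:function:BlogSync"
```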

Testing

  • Head over to your Lambda function
  • Click on Test and use an event template such as this:
{
  "Records": [
    {
      "EventVersion": "1.0",
      "EventSubscriptionArn": "arn:aws:sns:EXAMPLE",
      "EventSource": "aws:sns",
      "Sns": {
        "SignatureVersion": "1",
        "Timestamp": "1970-01-01T00:00:00.000Z",
        "Signature": "EXAMPLE",
        "SigningCertUrl": "EXAMPLE",
        "MessageId": "95df01b4-ee98-5cb9-9903-4c221d41eb5e",
        "Message": {
          "repository": {
            "name": "REPO_NAME",
            "full_name": "USER/REPO_NAME",
            "html_url": "https://github.com/USER/REPO_NAME"
          }
        }
      }
    }
  ]
}
  • Watch the execution result for messages - there’s enough logging to pinpoint where any issue lies. Here’s an example of mine:

(Screenshot: successful Lambda execution result)
