
How I became an AWS DevOps Professional ✨ — and a fun one-day-build to show off what I learned 🌱

By Ram Parameswaran · Amazon Web Services · Training and Certification · DevOps


Background

This week I gained the Amazon Web Services ‘DevOps Engineer Professional’ certification. This certification focuses on best practices for testing and deploying AWS resources in an automated, secure and scalable way.

I found the training exceptional, and I highly recommend it to anyone who uses AWS services professionally. From it, I learned a number of new skills and I hardened my knowledge of best-practice deployment patterns.

I am keen to put these newly learned skills into immediate practice. So I have decided to do a one-day-build of a weather notification application, built and deployed using the AWS DevOps stack.

This stack comprises CloudFormation for provisioning, CodeBuild for CI, Lambda for compute, Secrets Manager for credentials, Parameter Store for parameters, SNS for text notifications to my mobile, and CloudWatch for logging. I want to share the source code publicly, so I will use Github for version control rather than CodeCommit.


The one day build — a rainfall notification app

First, what is this application even meant to do?

It is a simple notification app which sends an SMS to a list of recipients (my partner and me) at 8am on days when rain is forecast. Forecasts come from the (free) openweathermap.org API.

The application code itself is a simple Lambda function. The source code can be found on Github.

import boto3
import datetime
import json
import os

import requests

secretsmanagerclient = boto3.client("secretsmanager")
snsclient = boto3.client("sns")

topic_arn = os.environ.get("SNS_TOPIC_ARN")

# Canberra coordinates.
LAT = -35.2809
LON = 149.1300

omw_api_key = secretsmanagerclient.get_secret_value(
    SecretId=os.environ.get("OMW_API_KEY_SECRETSMANAGER_ARN")).get("SecretString")
omw_api_key = json.loads(omw_api_key).get("ApiKey")


def lambda_handler(event, lambda_context):
    # Arguments
    latitude_degrees = LAT
    longitude_degrees = LON
    OPEN_WM = "https://api.openweathermap.org/data/2.5/onecall"
    parameters = {
        "lat": latitude_degrees,
        "lon": longitude_degrees,
        "exclude": "current,minutely,daily",
        "appid": omw_api_key,
    }

    response = requests.get(OPEN_WM, params=parameters)
    response.raise_for_status()
    weather_data = response.json()

    # Collect the weather-condition codes for the next 12 hours.
    weather_12_hour = []
    for i in range(0, 12):
        code = weather_data["hourly"][i]["weather"][0]["id"]
        weather_12_hour.append(code)

    # OpenWeatherMap condition codes below 700 indicate precipitation
    # (thunderstorm, drizzle, rain or snow).
    count = 0
    for code in weather_12_hour:
        if code < 700:
            count += 1

    if count > 0:
        # Shift from UTC to local (UTC+10) time.
        today = datetime.datetime.today() + datetime.timedelta(hours=10)
        current_date = today.strftime("%d-%m-%Y")
        sns_response = snsclient.publish(
            TopicArn=topic_arn,
            Message=f"😮 It's going to rain today ({current_date})! "
                    "Make sure you bring an ☂",
            MessageAttributes={
                "AWS.SNS.SMS.SenderID": {
                    "DataType": "String",
                    "StringValue": "rainbyram",
                }
            },
        )
        return {"statusCode": 200,
                "body": "Success. Text message sent to all subscribers via SMS."}

    # Else, do nothing and return success code.
    return {"statusCode": 200, "body": "No action required."}


Creating a CloudFormation template for the project infrastructure

The first step is to create a CloudFormation template. A CloudFormation template is a declarative way to specify the infrastructure needed for a project. CloudFormation provisions these resources in a “stack”: a logically grouped collection of AWS resources. This allows for easy management of those resources.

Our pipeline will create:

  1. One CodePipeline pipeline;
  2. One S3 Bucket for pipeline artifacts, and an appropriate S3 Bucket Policy;
  3. Two IAM Roles to: i) execute the CodePipeline, and ii) execute CodeBuild;
  4. Two CodeBuild Projects to: i) run unit tests, and ii) deploy to Production using an AWS Serverless Application Model (SAM) template (see here);
  5. One Amazon SNS topic for notifications.

You can see the full CloudFormation template here.
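To give a flavour of what the template looks like, here is a trimmed, illustrative excerpt. The resource names and properties below are placeholders of my own, not copied from the actual template:

```yaml
# Illustrative excerpt only -- see the full template for the real
# pipeline, IAM roles and bucket policy.
Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket

  NotificationTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: rainbyram-notifications
```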


What is not included in the CloudFormation template? There are a few sensitive parameters which I want to specify manually through the AWS console. These are: i) the OpenWeatherMap API key as a SecretsManager secret; and ii) the SNS Subscription which contains my partner's phone number and mine. I do not want these in version control!

A side-note on using CodeBuild instead of Github Actions

Typically for a project hosted in Github, I would run my CI workflow in Github Actions. This makes for a much simpler build pipeline.

I generally believe that unit tests should be executed as-close-as-possible to the source code (eg. Github Actions for a Github repo, or Azure Pipelines for an Azure Devops repo). This encourages developers to write and test application code that is environment-independent.
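To make that environment-independence concrete, the Lambda's rain check can be factored into a pure function that unit tests can exercise with no AWS or network access. This is a sketch of the idea; the helper name `is_rain_forecast` is mine, not from the actual repo:

```python
# A pure-Python version of the Lambda's rain check, factored out so it
# can be unit-tested anywhere. The name is_rain_forecast is
# illustrative, not from the actual repo.

def is_rain_forecast(condition_codes):
    """Return True if any OpenWeatherMap condition code signals
    precipitation (codes below 700 cover thunderstorm, drizzle,
    rain and snow)."""
    return any(code < 700 for code in condition_codes)


def test_rain_detected():
    # 500 = light rain, 800 = clear sky
    assert is_rain_forecast([800, 500, 800]) is True


def test_no_rain():
    # 800 = clear sky, 801/802 = clouds
    assert is_rain_forecast([800, 801, 802]) is False
```

Tests like these run identically under pytest in CodeBuild, Github Actions, or on a laptop, which is exactly the point.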

However, in this case I want to use as many AWS resources as possible (still not CodeCommit though 😜) so I will still use CodePipeline and CodeBuild.


Creating the CloudFormation stack

To create the CloudFormation stack:

$ aws cloudformation create-stack --stack-name uhohitsgonnarain-stack --template-body file://codepipeline.yaml --capabilities CAPABILITY_NAMED_IAM

This provisions the resources specified in the CloudFormation template. We must also edit the pipeline ‘Source’ stage manually and “Complete the connection” to Github. This will launch an authentication window which allows us to authorise AWS to access our Github account.

We also have to manually create the SecretsManager secret, and configure the SNS Subscription using the ‘sms’ protocol.
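Both of those manual steps can also be done from the CLI. The commands below are a sketch only; the secret name, topic ARN, region, account ID and phone number are all placeholders:

```shell
# Store the OpenWeatherMap API key as a SecretsManager secret
# (the Lambda reads the "ApiKey" field from the secret string):
aws secretsmanager create-secret \
    --name uhohitsgonnarain/omw-api-key \
    --secret-string '{"ApiKey": "YOUR_OPENWEATHERMAP_KEY"}'

# Subscribe a mobile number to the SNS topic over the 'sms' protocol:
aws sns subscribe \
    --topic-arn arn:aws:sns:ap-southeast-2:123456789012:rainbyram-notifications \
    --protocol sms \
    --notification-endpoint +61400000000
```

Doing it this way keeps the secret and the phone numbers out of the template, while still being scriptable.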

Testing

Now we have all the resources deployed successfully!

The CodePipeline pipeline will run on each push to the main branch on Github. The pipeline will run unit tests, and then deploy the Lambda function using the AWS Serverless Application Model.

We can now go to the AWS Lambda Console and test the function directly.
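Equivalently, the function can be invoked from the CLI. The function name here is a placeholder, and the `--cli-binary-format` flag is needed for AWS CLI v2 when passing a raw JSON payload:

```shell
# Invoke the deployed function with an empty test event and print
# the handler's return value:
aws lambda invoke \
    --function-name uhohitsgonnarain-function \
    --cli-binary-format raw-in-base64-out \
    --payload '{}' \
    response.json

cat response.json
```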

And a few seconds later my phone buzzes…



Wrap-up

Becoming a certified AWS DevOps Engineer Professional taught me a lot about AWS DevOps workflows. It also taught me a lot about non-AWS DevOps workflows.

So which set of tools is better? Like all good things, it depends. AWS provides fine-grained access control and configuration management for deployment artefacts. However, there is a lot to like about the simplicity of a batteries-included platform like Github/Github-Actions for simpler CI/CD workflows. 🤷

These are all great lessons to learn — ones which separate mid-level software engineers from senior software engineers.

I cannot recommend pursuing AWS accreditation highly enough. The DevOps Engineer Professional certification in particular shows its value in spades.

Ram Parameswaran

About Ram Parameswaran

Django & React Developer, Data Geek and Maker-Enthusiast.
Ram builds web applications that help businesses operate more efficiently.