
CLAMS

CLAMS => "BAMS in the Cloud"

A personal learning project using a connection to a legacy event management system written in COBOL (BAMS) as a way of illustrating serverless architectures using Go, Python, Fabric, Svelte and Terraform. CLAMS so far employs the following AWS services:

  • API Gateway
  • DynamoDB
  • Lambda
  • S3
  • SQS
  • CloudFront
  • Route53
  • CloudWatch
  • RDS

More

This is primarily a project for me to learn Go and to establish and understand patterns for writing service and unit tests. It was used as the basis for a workshop that I first ran in Todmorden in June 2022 for HacktionLab.

In the project I also attempt to use best practices around:

  • Test-driven development using the stubs, spies and mocks library for unit tests, with extensive reworking to make the tests more refactor-proof and to test behaviour rather than implementation, thanks to input from Hoegrammer.
  • Behaviour-driven development using Godog, with acceptance-test feature files written in Gherkin
  • Clean code - naming of methods, variables, tests, packages, etc.
  • SOLID (where possible, given that Go is a little unusual as an OO language, preferring, as it does, composition over inheritance)
  • Design patterns
  • Modularisation of Go code using packages
  • Infrastructure as code and DevOps approaches using Fabric and Terraform
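
As a flavour of the behaviour-driven style mentioned above, Godog feature files are plain Gherkin. The scenario below is a hypothetical sketch of what one might look like, not one of the project's actual feature files:

```gherkin
Feature: Attendee API
  Scenario: Fetching a known attendee
    Given an attendee with auth code "5F7BCD" exists in the datastore
    When I request that attendee's details from the API
    Then I receive a response with status code 200
    And the response body contains the attendee's name
```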

Architecture

The architecture of CLAMS

Entity Relationships

An Entity Relationship Diagram for CLAMS

Using CLAMS

To use CLAMS, get the API Gateway endpoint via the AWS Console; it's also displayed as the output of the deployment script (see below). There is an example Postman collection that you can use. The endpoints provided are:

  • /clams/attendees - fetch a list of all attendees
  • /clams/attendee/5F7BCD - get a specific attendee's details (the code is for the example attendee defined in )
  • /clams/report - fetch some statistics about the event

To upload data to CLAMS from BAMS, please see the Uploader utility's README and the BAMS documentation for the Home Screen's Upload to CLAMS functionality.

Getting AWS Credentials

In the following test and deployment sections you'll need to create a pair of credentials. Log in to the AWS Console for the account you wish to use to deploy the application, go to IAM, and choose your user. Click the Security credentials tab and then the Create access key button. This will create an AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY pair for you. You'll need these shortly. Note that you can only create two access key pairs per IAM user, and once you accept you'll no longer be able to view the AWS_SECRET_ACCESS_KEY. For utmost security, delete these at the end of a session and recreate them at the next.
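
The test and deployment commands below read the credentials from environment variables, so one option is to export them once per session rather than prefixing every command. The values here are placeholders; substitute the key pair generated in the IAM console:

```shell
# Placeholder values - substitute your own access key pair from IAM.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEYID"
export AWS_SECRET_ACCESS_KEY="example/secret/access/key"
```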

The Components

Lambdas

There are five AWS Lambda functions:

  • Attendee Writer - Writes new incoming attendees into the DynamoDB datastore
  • Attendee API - Presents attendees' details and some event reporting to the world as JSON
  • Authorizer - Provides HTTP Basic Auth access to certain endpoints (i.e. for PUT, POST, DELETE)
  • DB-Trigger - Turns entries in the trigger_notifications table into streamed events
  • Processor - Processes database notifications created by the DB-Trigger Lambda
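
The Attendee Writer's job, at its core, is decode-then-persist: take a message body off the queue, unmarshal it, and write the result to the datastore. The sketch below shows that shape only; the `Message` struct, field names, and `handleMessage` function are illustrative assumptions (the real Lambda uses the AWS SDK and the shared attendee package), and the print statement stands in for a DynamoDB write.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Message mimics the shape of a queued message body carrying an
// attendee record; the field names are illustrative assumptions.
type Message struct {
	AuthCode string `json:"authCode"`
	Name     string `json:"name"`
}

// handleMessage sketches the per-message work of a writer Lambda:
// decode the payload, then persist it (here just printed in place
// of a DynamoDB PutItem call).
func handleMessage(body string) error {
	var m Message
	if err := json.Unmarshal([]byte(body), &m); err != nil {
		return fmt.Errorf("decoding message: %w", err)
	}
	fmt.Printf("writing attendee %s (%s)\n", m.Name, m.AuthCode)
	return nil
}

func main() {
	if err := handleMessage(`{"authCode":"5F7BCD","name":"Example Attendee"}`); err != nil {
		panic(err)
	}
}
```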

Shared packages

As an example of packages shared between multiple Lambdas, the Lambda functions all use the shared attendee and awscfg packages located in the same parent directory as the Lambdas themselves. These can be used in your own programs along the lines of:

package main

import (
	"fmt"
	"time"

	"github.com/mikebharris/CLAMS/functions/attendee"
)

func main() {
	a := attendee.Attendee{
		AuthCode:       "ABCDEF",
		Name:           "Frank Ostrowski",
		Email:          "frank.o@gfa.de",
		Telephone:      "0101 0101 01010",
		NumberOfKids:   0,
		Diet:           "I eat BASIC code for lunch",
		Financials:     attendee.Financials{AmountToPay: 10, AmountPaid: 10, AmountDue: 0},
		ArrivalDay:     "Wednesday",
		NumberOfNights: 4,
		StayingLate:    "No",
		CreatedTime:    time.Now(),
	}

	fmt.Println(a)
}

Other files

The Terraform configuration files are in the directory, the frontend (hastily built in Svelte) is built in , and contains a utility to upload the latest group of attendees to SQS. It can be run on the command line or called from within BAMS.

The database is deployed using Flyway (both to AWS and into a Docker instance for the service tests). The command that is run can be found in and the SQL migration (schema version) files in

Running Tests

There are service/integration-level tests that use Gherkin syntax to test the integration between each Lambda and its dependent AWS services. The tests make use of Docker containers to emulate the various services locally, and therefore you need Docker Desktop running.

To run the service tests, change to the service in the functions directory and type:

AWS_SECRET_ACCESS_KEY=x AWS_ACCESS_KEY_ID=y make int-test

There are unit tests that can be run, again by changing to the service in the functions directory and typing:

make unit-test

You can run both unit and integration/service tests for a given service with:

make test

Deploying

There is a Python Fabric v2 script to help you deploy. First authenticate with AWS, either using an SSO integration tool such as XXXX, or by fetching your credentials from IAM.

Prerequisites

The RDS database for CLAMS requires two SSM parameters to be set up in the AWS Parameter Store. Create these as /clams/{environment}/db/username and /clams/{environment}/db/password, replacing {environment} with your target environment, for example:

  • /clams/nonprod/db/username
  • /clams/nonprod/db/password

Both should ideally be of type SecureString, though it doesn't matter to the deployment scripts.
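
The parameters can be created from the AWS Console or with the AWS CLI; for example (the values here are placeholders to be replaced with your own database credentials):

```shell
aws ssm put-parameter --name "/clams/nonprod/db/username" --type SecureString --value "clams_admin"
aws ssm put-parameter --name "/clams/nonprod/db/password" --type SecureString --value "changeme"
```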

The Route53 record requires an SSL certificate to be created using Amazon Certificate Manager (ACM).

The first time, you'll need to run the init process (for example):

AWS_ACCESS_KEY_ID=XXXX AWS_SECRET_ACCESS_KEY=YYYY fab terraform --account-number=111111111111 --contact=your@email.com --mode=init

A plan validates your Terraform configuration and shows the changes that would be made:

AWS_ACCESS_KEY_ID=XXXX AWS_SECRET_ACCESS_KEY=YYYY fab terraform --account-number=111111111111 --contact=your@email.com --mode=plan

An apply makes your changes in your target AWS account:

AWS_ACCESS_KEY_ID=XXXX AWS_SECRET_ACCESS_KEY=YYYY fab terraform --account-number=111111111111 --contact=your@email.com --mode=apply

And finally destroy takes it all down again:

AWS_ACCESS_KEY_ID=XXXX AWS_SECRET_ACCESS_KEY=YYYY fab terraform --account-number=111111111111 --contact=your@email.com --mode=destroy

The command line supports the following:

Usage: fab [--core-opts] terraform [--options] [other tasks here ...]

Docstring:
  none

Options:
  -a STRING, --account-number=STRING
  -c STRING, --contact=STRING
  -d STRING, --distribution-bucket=STRING
  -e STRING, --environment=STRING
  -i STRING, --input-queue=STRING
  -m STRING, --mode=STRING
  -p STRING, --project-name=STRING
  -r STRING, --region=STRING
  -t STRING, --attendees-table=STRING

TODO list

  • Write a better front-end
  • Add authentication to the API
  • Add Kitchen reporter utility
  • Add ability to write new attendees to database
  • Add ability to sync bi-directionally between BAMS and CLAMS
  • Add ability to handle GDPR Requests-for-Erasure (RfE)