I first started my serverless journey in early 2019. I was enamored with all the things you could quickly piece together and couldn’t believe what I had been missing out on my entire software career.
Something particularly special to me was CloudFormation, specifically the Serverless Application Model (SAM). SAM lets you define serverless functions, APIs, and event source mappings quickly and easily.
When you deploy a SAM application, behind the scenes it transforms your serverless shorthand into CloudFormation resources and pushes them into AWS.
I loved the idea that I could define the architecture via code and have it deploy successfully into any AWS account. In a sense, it’s kind of like a container. If I know it works I can run it anywhere.
Over time, my use cases began getting more and more advanced. I started putting patterns in place for specific types of resources and had “handcuff” resources - meaning whenever I created one type of resource, I also needed to create another alongside it.
None of this was hard, but it was prone to error. Someone was going to forget to add a resource or get the naming convention wrong.
Then I discovered CloudFormation macros.
A CloudFormation macro is a function that runs on deployment that transforms the resources defined in your template. It can do anything - add new resources, remove specific resources, or manipulate existing ones.
If you use SAM, then you’re already using a macro and might not even know it. Every SAM template defines a Transform property that points to a serverless... thing in AWS. Well, that “thing” is actually a macro that turns the shortcuts defined in SAM into CloudFormation resources.
Macros come in handy when you start finding patterns with the way you build resources. Is there a resource that always gets created when you build a new lambda function? Is there a manual process that takes place whenever a new API endpoint is implemented?
Taking the human element out of any manual process should always be a goal because it removes the chance for error. A machine will perform the task the same way every time. A human will not. We’ll get distracted or lazy or think something is unnecessary and won’t do it.
If you can automate your automation, you set yourself up for success over the long run. Development will be done faster and more consistently, time after time after time.
Imagine you have a scenario where every time you create a lambda function with a specific environment variable you also want to create an SSM parameter to track the value. This will allow you to see all the values for this environment variable across your application from the SSM console.
This can be done with a macro. So let’s build one to automate this process.
We know that our macro has to look for two things: lambda functions in the template, and a specific environment variable defined on those functions.
When CloudFormation macros run, they are passed in an event with the following properties:
{
  "accountId": "",                // AWS Account Id
  "fragment": {},                 // JSON representation of your SAM/CloudFormation template
  "transformId": "",              // Id of the specific transform
  "requestId": "",                // Id of this run
  "region": "",                   // Region being deployed into
  "params": {},                   // Parameters used in the transform
  "templateParameterValues": {}   // Parameter values used in the template
}
The values contained in fragment.Resources are all of the resources being generated by your template. So for our example, we need to filter the resources down to lambda functions with a specific environment variable. That variable will be called LAMBDA_VALUE.
Our macro will be written as a Node.js function. To find the lambda resources in a SAM template, we can write the following JavaScript:
const resources = event.fragment.Resources;
const lambdas = [];
const resourceKeys = Object.keys(resources);
for (let i = 0; i < resourceKeys.length; i++) {
  const resource = resources[resourceKeys[i]];
  // Match SAM functions that define the LAMBDA_VALUE environment variable.
  // Optional chaining guards functions that define no environment variables.
  if (resource.Type === 'AWS::Serverless::Function'
    && resource.Properties?.Environment?.Variables?.LAMBDA_VALUE) {
    lambdas.push({
      name: resourceKeys[i],
      details: resource
    });
  }
}
This snippet produces an array with all lambda functions that have our environment variable. Now we want to build the SSM parameter with the data from the environment variable.
lambdas.forEach((lambda) => {
  event.fragment.Resources[`${lambda.name}Parameter`] = {
    Type: 'AWS::SSM::Parameter',
    Properties: {
      Name: `${lambda.name}Parameter`,
      Type: 'String',
      Value: lambda.details.Properties.Environment.Variables.LAMBDA_VALUE
    }
  };
});
This adds a new SSM parameter resource directly to the list of AWS resources to be created. Finally, we need to return the updated template in the format CloudFormation expects for a macro response.
So in our function we return:
return {
  requestId: event.requestId,
  status: 'success',
  fragment: event.fragment
};
Now that we have the source code written (view the full source here), we need to create the macro resource. In our SAM/CloudFormation template, we create a resource of type AWS::CloudFormation::Macro that points to the function we just created.
AddSSMParametersMacro:
  Type: AWS::CloudFormation::Macro
  Properties:
    Name: AddSSMParameters
    Description: Adds SSM Parameters for lambdas with the LAMBDA_VALUE env var
    FunctionName:
      Ref: AddSSMParametersFunction
Macros are simple resources: they take a name and a reference to a function. Once we build this and deploy it into our AWS account, we will be able to consume it in our other SAM or CloudFormation templates.
At this point the macro is built and ready for consumption. To use it in a SAM or CloudFormation template, add it to the Transform property.
Transform: [AWS::Serverless-2016-10-31, AddSSMParameters]
This property is an array and can run as many macros as you’d like. Note that the Transform list references the macro’s Name property (AddSSMParameters), not its logical resource id. As long as that macro is deployed in the same AWS account and region you are deploying into, it will work without issue.
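A consuming template might then look like the sketch below. The function name, runtime, handler, and code location are illustrative assumptions; the AddSSMParameters macro name and LAMBDA_VALUE variable follow the walkthrough.

```yaml
Transform: [AWS::Serverless-2016-10-31, AddSSMParameters]

Resources:
  OrderHandlerFunction:       # illustrative function name
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs18.x
      CodeUri: ./src          # illustrative code location
      Environment:
        Variables:
          LAMBDA_VALUE: order-config-v1   # triggers the macro to add an SSM parameter
```

When this template deploys, the macro would add an OrderHandlerFunctionParameter SSM parameter holding the value order-config-v1.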
Behind the scenes when the template is being deployed, it will iterate through all of the Transforms in the order specified in the template. In our example, it will first run the SAM transform which converts the quick and easy SAM shortcuts to full-blown CloudFormation. Then it will run our custom macro, which will analyze all the functions and add any new SSM parameters.
After the transforms are run, it will deploy all the resources into AWS.
Pretty easy, right?
Macros are great for adding another layer of automation into your pipeline. For scenarios that include a manual process, macros are perfect.
Keep in mind that macros are lambda functions. You can make them do anything: write to DynamoDB, call an external API, fire a webhook event; the possibilities are endless.
You can find the full example of our code walkthrough here. AWS has also provided a whole library of macros on GitHub to help your automation journey.
I hope you find some great use cases for macros and make your pipelines even more efficient.
Keep on automating!