How To Separate Your Serverless Infrastructure

2022-07-11 | #AWS #Serverless

Some time ago, you had a brilliant business idea. You started building it on AWS and read that Serverless was the way to go. Good on you! After quickly prototyping an MVP and pitching it to investors, you found yourself surrounded by a team of developers. You worked hard to get the beta version ready. By iterating quickly and listening to customers, the beta got tremendously popular, and you had a successful launch. Money was pouring in, and you felt unstoppable, so you set out to add new features and services to your product to continue to please customers and investors alike.

To move fast, you kept all code and infrastructure in the same repository. But then something changed: you were no longer releasing features as quickly as before. As time passed, making changes got more challenging and time-consuming. After continuously adding services and resources, your infrastructure was now a mess. You found yourself with a tangled web of Lambda Functions, API Gateways, DynamoDB Tables, SQS Queues, and other resources.

To avoid this tangled web of dependent services, you can separate your infrastructure into smaller, more manageable chunks. These chunks can then be deployed individually and have different lifecycles. Even within a single service, say a Lambda-backed API with a DynamoDB table, you might want to separate the stateless API Gateway and Lambda Function(s) from the stateful DynamoDB table.

All of your services are most likely not living in total isolation and will have dependencies on other services and resources. In this article, you will learn different ways to separate and share resources between services in your Serverless infrastructure, with examples for AWS CDK, Serverless Framework, AWS SAM, and Terraform. You can even mix and match frameworks. Perhaps you want to use Terraform for your DynamoDB Table and Serverless Framework for your API Gateway and Lambda functions.

1. CloudFormation Outputs

CloudFormation allows you to declare output values that you can import into other stacks. There are some restrictions to keep in mind when using outputs:

- Export names must be unique within your account and region.
- You cannot delete a stack while another stack imports one of its outputs.
- You cannot modify or remove an output value that another stack imports.
- Cross-stack references only work within the same account and region.

1.1. Exporting outputs

AWS CDK

CDK tip

If you have multiple stacks in the same CDK app, you can directly pass resources between stacks.
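For example, here is a minimal sketch of handing a table directly from one stack to another within the same app (the stack names, props interface, and partition key below are made up for illustration):

import { App, Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';

// Hypothetical props type used to hand the table to the consuming stack.
interface ConsumerStackProps extends StackProps {
  table: dynamodb.ITable;
}

class DatabaseStack extends Stack {
  public readonly table: dynamodb.Table;

  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);
    this.table = new dynamodb.Table(this, 'Table', {
      partitionKey: { name: 'PK', type: dynamodb.AttributeType.STRING },
    });
  }
}

class ConsumerStack extends Stack {
  constructor(scope: Construct, id: string, props: ConsumerStackProps) {
    super(scope, id, props);
    // Referencing props.table here makes CDK generate the cross-stack
    // export/import wiring for you.
    const tableName = props.table.tableName; // e.g. pass to a Lambda environment variable
  }
}

const app = new App();
const db = new DatabaseStack(app, 'DatabaseStack');
new ConsumerStack(app, 'ConsumerStack', { table: db.table });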

Use the CfnOutput construct to define an output.

import { Stack, StackProps, CfnOutput } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';

export class ExportStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const table = new dynamodb.Table(this, 'Table', { ... });

    new CfnOutput(this, 'TableNameOutput', {
      value: table.tableName,
      exportName: 'ExportedTableName',
    });
  }
}

AWS SAM

You define your outputs in the Outputs section of your template.

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: SAM Export

Resources:
  Table:
    Type: AWS::DynamoDB::Table
    Properties: ...

Outputs:
  TableName:
    Description: 'DynamoDB Table name'
    Value: !Ref Table
    Export:
      Name: ExportedTableName

Serverless Framework

Serverless Framework tip

If you have multiple services in the same Serverless project, you can use Serverless compose to share resources between services.
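A rough sketch of what that could look like, assuming a monorepo with database and api folders and a CloudFormation output named tableName in the database service:

# serverless-compose.yml (hypothetical layout)
services:
  database:
    path: database # its serverless.yml declares a "tableName" stack output
  api:
    path: api
    params:
      # Resolved from the "tableName" output of the database service's stack
      tableName: ${database.tableName}

The api service can then read the value with ${param:tableName} in its own serverless.yml.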

Non-function resources in Serverless Framework are defined using CloudFormation syntax in the resources section. Thus, it looks very similar to the SAM example above.

service: sls-export

provider:
  name: aws

resources:
  Resources:
    Table:
      Type: AWS::DynamoDB::Table
      Properties: ...
  Outputs:
    TableName:
      Description: 'DynamoDB Table name'
      Value: !Ref Table
      Export:
        Name: ExportedTableName

Terraform

Terraform differs from the other frameworks and does not use CloudFormation as an engine to manage your infrastructure. It is thus not possible to create CloudFormation outputs when using Terraform.

1.2. Importing outputs

AWS CDK

You can use Fn.importValue to import the value of an output exported by another stack.

import { Stack, StackProps, Fn } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as lambda from 'aws-cdk-lib/aws-lambda-nodejs';

export class ImportStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const tableName = Fn.importValue('ExportedTableName');

    new lambda.NodejsFunction(this, 'MyFunction', {
      ...
      environment: {
        TABLE_NAME: tableName,
      },
    });
  }
}

AWS SAM

You can use the intrinsic function ImportValue to import the value of an output exported by another stack.

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: SAM Import

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      ...
      Environment:
        Variables:
          TABLE_NAME: !ImportValue 'ExportedTableName'

Serverless Framework

You can import exported outputs with the !ImportValue intrinsic function. For non-exported outputs, you can use the ${cf:stackName.outputName} syntax instead (see the sketch after the example below).

service: sls-import

provider:
  name: aws

functions:
  hello:
    handler: src/functions/app.lambdaHandler
    environment:
      TABLE_NAME: !ImportValue ExportedTableName
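For a non-exported output, a sketch using the cf: variable could look like the following (the stack name sls-export-dev and the output key TableName are assumptions):

service: sls-import

provider:
  name: aws

functions:
  hello:
    handler: src/functions/app.lambdaHandler
    environment:
      TABLE_NAME: ${cf:sls-export-dev.TableName}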

Terraform

Fetching an exported CloudFormation output is done with the aws_cloudformation_export data source.

data "aws_cloudformation_export" "table_name" {
name = "ExportedTableName"
}
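The imported value can then be referenced elsewhere in your configuration, for example in a Lambda function's environment (a sketch; the IAM role and deployment package below are hypothetical and assumed to be defined elsewhere):

resource "aws_lambda_function" "hello" {
  function_name = "hello"
  role          = aws_iam_role.lambda_role.arn # hypothetical role defined elsewhere
  handler       = "app.lambdaHandler"
  runtime       = "nodejs18.x"
  filename      = "lambda.zip"

  environment {
    variables = {
      TABLE_NAME = data.aws_cloudformation_export.table_name.value
    }
  }
}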

2. SSM Parameters

You can also use the Systems Manager Parameter Store to pass values between stacks by creating a parameter and dynamically referencing it in a CloudFormation stack or Terraform configuration.

Compared to CloudFormation outputs, SSM parameters do not give you the same guardrails. Nothing stops you from deleting a parameter, even if another stack dynamically references it. If you do, the next update of the referencing stack will fail. You will most likely also break the service if you remove the resources that the parameter points to.

So why would you use SSM parameters over CloudFormation outputs? One reason is that they let you mix Infrastructure-as-Code tools: you can, for example, export values from Terraform and use them in a CloudFormation-based tool.

You can also resolve SSM parameters at runtime instead of at deploy time by fetching them with the AWS SDK in your Lambda functions. This makes it possible to update parameters without redeploying the consuming functions. However, it makes it harder to follow the principle of least privilege: you will have to use less granular permissions to account for changing parameter values (if you, for example, have restored a new DynamoDB table from a snapshot and intend to switch the parameter to point to the restored table).

Be aware that reading parameters at runtime adds some overhead and longer execution times during cold starts. It will also increase the execution time of warm invocations if you do the fetch inside the handler code.

2.1. Exporting parameters

AWS CDK

Create an SSM parameter with the StringParameter construct.

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as dynamodb from 'aws-cdk-lib/aws-dynamodb';
import * as ssm from 'aws-cdk-lib/aws-ssm';

export class ExportStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const table = new dynamodb.Table(this, 'Table', { ... });

    new ssm.StringParameter(this, 'SSMParam', {
      parameterName: '/some/path/tableName',
      type: ssm.ParameterType.STRING,
      stringValue: table.tableName,
    });
  }
}

AWS SAM

Create an SSM parameter with the AWS::SSM::Parameter resource.

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: SAM Export

Resources:
  Table:
    Type: AWS::DynamoDB::Table
    Properties: ...
  SSMParam:
    Type: AWS::SSM::Parameter
    Properties:
      Type: String
      Name: '/some/path/tableName'
      Value: !Ref Table

Serverless Framework

Creating a parameter in Serverless Framework is done the same way as in AWS SAM.

service: sls-export

provider:
  name: aws

resources:
  Resources:
    Table:
      Type: AWS::DynamoDB::Table
      Properties: ...
    SSMParam:
      Type: AWS::SSM::Parameter
      Properties:
        Type: String
        Name: '/some/path/tableName'
        Value: !Ref Table

Terraform

Use the aws_ssm_parameter resource in the AWS provider to create a parameter.

resource "aws_dynamodb_table" "dynamo_table" {
name = "dynamodb-table"
...
}
resource "aws_ssm_parameter" "ssm_dynamo_table_name" {
name = "/some/path/tableName"
type = "String"
value = aws_dynamodb_table.dynamo_table.name
}

2.2. Importing parameters

AWS CDK

Use StringParameter.valueForStringParameter to fetch the value of a parameter.

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import * as ssm from 'aws-cdk-lib/aws-ssm';
import * as lambda from 'aws-cdk-lib/aws-lambda-nodejs';

export class ImportStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    const tableName = ssm.StringParameter.valueForStringParameter(
      this,
      '/some/path/tableName',
    );

    new lambda.NodejsFunction(this, 'MyFunction', {
      ...
      environment: {
        TABLE_NAME: tableName,
      },
    });
  }
}

AWS SAM

You can use SSM parameters in your template with the '{{resolve:ssm:parameter-name:version}}' pattern, where version is optional. If you do not specify a version, CloudFormation will use the latest version of the parameter whenever you create or update the stack.

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: SAM Import

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      ...
      Environment:
        Variables:
          TABLE_NAME: '{{resolve:ssm:/some/path/tableName}}'
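If you want to pin the reference to a specific parameter version (version 3 here is just an example), the variable would instead look like this:

TABLE_NAME: '{{resolve:ssm:/some/path/tableName:3}}'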

Serverless Framework

Reading parameters in Serverless Framework is done with the ${ssm:/path/to/param} syntax:

service: sls-import

provider:
  name: aws

functions:
  hello:
    handler: src/functions/app.lambdaHandler
    environment:
      TABLE_NAME: ${ssm:/some/path/tableName}

Terraform

You can use the data source aws_ssm_parameter to fetch a parameter value.

data "aws_ssm_parameter" "table_name" {
name = "/some/path/tableName"
}

2.3. Reading at Runtime in a Lambda function

The following code uses the AWS SDK for JavaScript (v3) in TypeScript to fetch the parameter value. Be aware that the function's IAM role will need permission to fetch the parameter; an example policy follows the code.

import { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';
import { SSMClient, GetParameterCommand } from '@aws-sdk/client-ssm';

// Start fetching the parameter outside the handler so it is only read once per cold start.
const ssmClient = new SSMClient({});
const input = { Name: '/some/path/tableName' };
const command = new GetParameterCommand(input);
const parameterPromise = ssmClient.send(command);

export const lambdaHandler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
  const parameter = await parameterPromise;
  const tableName = parameter.Parameter?.Value;
  ...
};
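How you grant that permission depends on your tooling. As a hedged sketch in a SAM template, an inline policy on the function could look like this (the function name and parameter path mirror the examples above):

HelloWorldFunction:
  Type: AWS::Serverless::Function
  Properties:
    # ...
    Policies:
      - Statement:
          - Effect: Allow
            Action: ssm:GetParameter
            Resource: !Sub 'arn:aws:ssm:${AWS::Region}:${AWS::AccountId}:parameter/some/path/tableName'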

3. Summary

You have now learned how to separate your infrastructure stacks into more manageable chunks with the use of CloudFormation Outputs and/or SSM Parameters. You’ve seen examples for AWS CDK, AWS SAM, Serverless Framework and Terraform.

Now go build something awesome!


About the author

I'm Elias Brange, a Cloud Consultant and AWS Community Builder in the Serverless category. I'm on a mission to drive Serverless adoption and help others on their Serverless AWS journey.

Are you looking for more content like this? Follow me on LinkedIn & Twitter!