Oct 07, 2024

How to Customise the AWS CDK Pipeline

Morten Jensen
6 minute read

Background

Have you ever wanted the power of the AWS CodePipeline mixed with the convenience of the AWS CDK pipeline and supporting services in a CDK app?

The AWS CDK (Cloud Development Kit) Pipeline is a high-level construct that simplifies the creation and management of CI/CD pipelines using AWS CodePipeline. It abstracts away much of the complexity involved in setting up a pipeline by providing an opinionated approach to defining the stages and actions within the pipeline. This makes it easier to implement best practices and integrate with other AWS services like CodeBuild and CloudFormation.

Using the AWS CDK Pipeline over fully implementing CodePipeline in the CDK offers several advantages. It reduces the amount of boilerplate code you need to write, allowing you to focus on the core logic of your application. Additionally, the CDK Pipeline construct includes built-in support for common tasks such as source control integration, build and test stages and deployment strategies (e.g. Waves), which can save time and reduce errors. This makes it an excellent choice for developers who want to quickly set up robust, scalable CI/CD pipelines without getting bogged down in the details of individual pipeline components.

At Virtuability, we have often customised the CDK pipeline to do things that would otherwise have required the extra heavy lifting of implementing an AWS CodePipeline from scratch in the CDK. We have done so to deliver value more quickly and to make maintenance easier for our customers.

This blog post demonstrates some of the types of customisations that are possible with the CDK pipeline. The supporting example code for a fully functioning, customised CDK pipeline is provided in the GitHub repository virtuability/cdk-custom-pipeline.

Use new CodePipeline V2 features

Although AWS has released CodePipeline V2, the CDK pipeline has not yet been upgraded to take advantage of the new features, and the current recommendation is to drop down to the aws-codepipeline construct library directly if you need V2 features.

The V2 pipeline is backwards compatible with the V1 pipeline. In other words, upgrading to V2 is as simple as overriding the pipeline type.

This can be achieved in code with:

# Materialize the pipeline to allow us to override pipeline configuration
cdk_pipeline.build_pipeline()

cfn_pipeline = cdk_pipeline.pipeline.node.default_child

# Override the pipeline type to a V2
cfn_pipeline.pipeline_type = "V2"

See pipeline/pipeline_stack.py in the repository for the full example.

The V2 pipeline offers a different pricing model: where an active V1 pipeline costs $1 per month, the V2 pipeline is charged by action execution time (excluding manual approval and custom action types). If you have many CodePipelines (such as for landing zones) that execute infrequently, this can make a cost difference.

Use the pipeline V2 variables

The V2 pipeline can take variables as input to each pipeline execution. The code demonstrates how a pipeline variable is defined and then used to prefix a resource name in the CloudFormation stack that the pipeline deploys.

Pipeline variables can be used for a variety of purposes. For instance, a variable can carry a reference to artifacts such as container images or software packages that are built outside of the pipeline or by third parties.

Adding a pipeline variable means that a value can be set per pipeline execution. The value remains the same for all stages and actions deployed by the pipeline in that execution.

For simplicity we use a somewhat contrived example that simply adds a prefix to a resource name in a stack. The following code demonstrates the definition of the pipeline variable:

        cfn_pipeline.add_property_override(
            "Variables",
            [
                {
                    "Name": "resource_prefix",
                    "Description": "Resource prefix",
                    "DefaultValue": "test",
                },
            ],
        )

This code demonstrates the use of the pipeline variable in the ShellStep, i.e. the synth CodeBuild project:

        synth_step = pipelines.ShellStep(
            "Synth",
            input=self._input_source,
            env={
                "RESOURCE_PREFIX": "#{variables.resource_prefix}",
                "PIPELINE_EXECUTION_ID": "#{codepipeline.PipelineExecutionId}",
            },
            commands=[
                "echo RESOURCE_PREFIX: $RESOURCE_PREFIX",
                "echo PIPELINE_EXECUTION_ID: $PIPELINE_EXECUTION_ID",
                "python3 -m pip install -r requirements.txt",
                "python3 -m pytest",
                # Carry over the cdk.context.json settings as the file is not checked in
                f"cdk synth -c pipeline_repo={self._pipeline_conf.repo} -c pipeline_branch={self._pipeline_conf.branch} -c pipeline_connection_arn={self._pipeline_conf.connection_arn} -c resource_prefix=$RESOURCE_PREFIX",
            ],
        )

The pipeline also has predefined variables such as #{codepipeline.PipelineExecutionId}, which we output for demonstration purposes.
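
To supply a value for resource_prefix on a particular run, the pipeline execution can be started with the variable set, for example from Python with boto3. The sketch below is illustrative; the pipeline name is an assumption and should be replaced with the name of the deployed pipeline:

import boto3

codepipeline = boto3.client("codepipeline")

# Start an execution with an explicit value for the resource_prefix variable
# ("CustomPipeline" is a placeholder pipeline name)
response = codepipeline.start_pipeline_execution(
    name="CustomPipeline",
    variables=[
        {"name": "resource_prefix", "value": "dev"},
    ],
)

print(response["pipelineExecutionId"])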

More information on pipeline variables can be found in the documentation.

Add a custom CDK pipeline action

The CDK pipeline enables users to define custom pipeline actions. The documentation on how to implement custom actions is vague, but it involves implementing the ICodePipelineActionFactory interface. An example can be found in the Step documentation.

In this example we implement a CloudFormationDeleteStackStep action in the pipeline/pipeline_stack.py file, which deletes a previously deployed CloudFormation stack. The purpose is to let a pipeline create a stack temporarily, run tests against it and then delete the stack on completion in order to save cost.

import jsii
from constructs import Construct

from aws_cdk import aws_codepipeline_actions as codepipeline_actions
from aws_cdk import aws_iam as iam
from aws_cdk import pipelines


@jsii.implements(pipelines.ICodePipelineActionFactory)
class CloudFormationDeleteStackStep(pipelines.Step):
    """
    Custom pipeline action that deletes a CloudFormation stack
    """

    def __init__(
        self,
        scope: Construct,
        id: str,
        stack_name: str,
    ):
        super().__init__(id)

        # Let the Step base class track any referenced stack outputs
        # (none are used by this step)
        self._discover_referenced_outputs({"env": {}})

        self._action_name = id
        self._stack_name = stack_name

        # Reuse the deploy and CloudFormation execution roles provisioned by
        # the CDK bootstrap stack; scope is expected to be the pipeline stack
        # so that partition, account, region and the bootstrap qualifier
        # resolve correctly
        self._destroy_role = iam.Role.from_role_arn(
            scope,
            f"{id}DestroyRole",
            f"arn:{scope.partition}:iam::{scope.account}:role/cdk-{scope.synthesizer.bootstrap_qualifier}-deploy-role-{scope.account}-{scope.region}",
        )
        self._exec_role = iam.Role.from_role_arn(
            scope,
            f"{id}ExecRole",
            f"arn:{scope.partition}:iam::{scope.account}:role/cdk-{scope.synthesizer.bootstrap_qualifier}-cfn-exec-role-{scope.account}-{scope.region}",
        )

    # Called by the CDK pipeline when the step is materialised into a
    # CodePipeline action; the second positional argument carries the
    # ProduceActionOptions struct (action name, run order, artifacts, etc.)
    def produce_action(
        self,
        stage,
        scope: pipelines.ProduceActionOptions,
        action_name=None,
        run_order=None,
        variables_namespace=None,
        artifacts=None,
        fallback_artifact=None,
        pipeline=None,
        code_build_defaults=None,
        before_self_mutation=None,
        stack_outputs_map=None,
    ):
        # Add a native CloudFormationDeleteStackAction that deletes the
        # target stack using the bootstrap roles resolved in the constructor
        action = codepipeline_actions.CloudFormationDeleteStackAction(
            action_name=scope.action_name,
            run_order=scope.run_order,
            admin_permissions=False,
            stack_name=self._stack_name,
            role=self._destroy_role,
            deployment_role=self._exec_role,
        )

        stage.add_action(action)

        # One run order slot is consumed by this single action
        return pipelines.CodePipelineActionFactoryResult(run_orders_consumed=1)
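
Once defined, the custom step can be attached to a stage like any other step, for instance as a post step so that the temporary stack is deleted after the stage has been deployed and tested. The stage variable and stack name below are illustrative:

        cdk_pipeline.add_stage(
            test_stage,
            post=[
                CloudFormationDeleteStackStep(
                    self,
                    "DeleteTestStack",
                    stack_name="test-application",
                ),
            ],
        )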

Finally, note that jsii is a framework that enables code written in other languages to interact with JavaScript and TypeScript classes; see the jsii user documentation for more details.

Override CDK pipeline artifact S3 bucket properties

By default the CDK pipeline creates an S3 bucket for artifacts built during the pipeline execution. While an S3 bucket can be passed to the pipeline on creation, another option is to use escape hatches to override the S3 bucket properties and resource policies:

        cfn_bucket = cdk_pipeline.pipeline.node.find_child(
            "ArtifactsBucket"
        ).node.default_child

        # Include a lifecycle configuration for the pipeline artifacts bucket
        cfn_bucket.add_property_override(
            "LifecycleConfiguration",
            {
                "Rules": [
                    {
                        "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
                        "ExpirationInDays": 365,
                        "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
                        "Status": "Enabled",
                    }
                ]
            },
        )

        cfn_bucket.add_override(
            "UpdateReplacePolicy",
            "Delete",
        )
        cfn_bucket.add_override(
            "DeletionPolicy",
            "Delete",
        )
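
Alternatively, if you prefer to define the artifact bucket yourself rather than overriding the generated one, a bucket can be passed to the pipeline at construction time. The following is a minimal sketch under the assumption of a same-account pipeline; the construct IDs and the synth_step variable are illustrative:

        # Assumes: from aws_cdk import Duration, RemovalPolicy
        # and:     from aws_cdk import aws_s3 as s3
        artifact_bucket = s3.Bucket(
            self,
            "PipelineArtifacts",
            lifecycle_rules=[
                s3.LifecycleRule(
                    abort_incomplete_multipart_upload_after=Duration.days(7),
                    expiration=Duration.days(365),
                    noncurrent_version_expiration=Duration.days(365),
                )
            ],
            removal_policy=RemovalPolicy.DESTROY,
            auto_delete_objects=True,
        )

        cdk_pipeline = pipelines.CodePipeline(
            self,
            "Pipeline",
            synth=synth_step,
            artifact_bucket=artifact_bucket,
        )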

Override ShellStep/CodeBuild project defaults

Another option is to override the CodeBuild project defaults (e.g. for the ShellStep). For instance, we may require a particular Python version such as 3.12 and want to use the ARM/Graviton architecture instead of x86. In addition, we may want to install additional tools in the install phase.

The following example demonstrates this:

            code_build_defaults=pipelines.CodeBuildOptions(
                build_environment=codebuild.BuildEnvironment(
                    build_image=codebuild.LinuxBuildImage.AMAZON_LINUX_2_ARM_3,
                    compute_type=codebuild.ComputeType.LARGE,
                ),
                partial_build_spec=codebuild.BuildSpec.from_object(
                    {
                        "phases": {
                            "install": {
                                "runtime-versions": {"python": "3.12"},
                                "commands": [
                                    "pip3 install pytest",
                                    "npm install -g aws-cdk",
                                ],
                            }
                        }
                    }
                ),
            ),

Note that the CodeBuild project defaults can be overridden for all CodeBuild projects or individually for the asset publishing, self mutation or synth projects. This can be achieved by setting the respective properties at pipelines.CodePipeline instantiation.
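
For example, the synth project can be given a larger compute type than the other projects by combining the overall defaults with a synth-specific override when the pipeline is created. The following is a sketch; the compute types are illustrative, and the asset_publishing_code_build_defaults and self_mutation_code_build_defaults properties work in the same way:

        cdk_pipeline = pipelines.CodePipeline(
            self,
            "Pipeline",
            synth=synth_step,
            # Defaults applied to every CodeBuild project the pipeline creates
            code_build_defaults=pipelines.CodeBuildOptions(
                build_environment=codebuild.BuildEnvironment(
                    compute_type=codebuild.ComputeType.SMALL,
                ),
            ),
            # Merged on top of the defaults for the synth project only
            synth_code_build_defaults=pipelines.CodeBuildOptions(
                build_environment=codebuild.BuildEnvironment(
                    compute_type=codebuild.ComputeType.LARGE,
                ),
            ),
        )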

Conclusion

We have demonstrated some of the ways that the CDK pipeline (pipelines.CodePipeline) construct library can be customised to meet the requirements of most pipeline scenarios without the need to drop down to the more laborious aws_codepipeline library.

If your organisation is using AWS CodePipeline and needs assistance with it, then please feel free to reach out by visiting our contact page or by sending an email to team@virtuability.com.
