To stay competitive, organizations must innovate faster and operate more efficiently, and dynamic, highly scalable cloud resources can help. IT teams can move past the mindset that business agility and governance control are a binary choice, and instead pursue both speed and governance over cost, security, compliance, and more. The following common scenarios demonstrate the business impact of lacking governance controls:

  • Unencrypted storage leads to inadequate data security for valuable data, such as personal information and payment card data.
  • Cloud resources that are not in use but are kept running result in unnecessary cost.
  • A lack of database snapshots exposes the business to data loss.
  • Excessive permissions do not comply with the least-privilege principle and weaken the control environment.
  • Not enabling multiple Availability Zones poses an availability risk to business services.

These are merely a few examples. Ensuring compliance with internal technical specifications and external regulations is critical to achieving goals such as security, cost optimization, and highly available services.

Overview of solution

The solution described in this post achieves compliance by default through an Open Policy Agent (OPA) policy check integrated with a CI/CD pipeline, and makes near-continuous compliance possible with Cloud Custodian performing real-time scanning and auto-remediation. The high-level architecture looks like the following:

Illustration of high level architecture

Before demonstrating the solution, let’s review a few considerations from the preceding diagram. Policy sources for the policy control library could be laws and regulations, external frameworks such as Center for Internet Security (CIS) Benchmark, internal control standards, and so on. The policy control library drives the implementation of preventive controls and detective controls.

Infrastructure as code (IaC) is a prerequisite for the subsequent static policy check module, which could be fulfilled with AWS CloudFormation, AWS Cloud Development Kit (AWS CDK), and partner solutions such as Terraform or Pulumi. The static policy check should be integrated with a CI/CD pipeline of infrastructure code and follow GitOps best practices to prevent deploying misconfigurations and to correct violations early.

Detective controls check for non-compliant changes to resources caused by factors outside the pipeline, such as manual changes or deployments that bypass the standard IaC process. The dynamic policy check performs real-time infrastructure scanning to validate the running state.

Responsive controls are driven by non-compliance events and perform auto-remediation through serverless functions.

In this post, we will implement preventive controls using Open Policy Agent (OPA), an open source, general-purpose policy engine. The Cloud Native Computing Foundation (CNCF) accepted OPA as an incubation project in April 2019. OPA can validate any JSON-formatted file against policies. If you haven’t used OPA before, learn more from my blog post Realize policy as code with AWS Cloud Development Kit through Open Policy Agent.

The open source tool for detective controls used in this post is Cloud Custodian, a cloud resource-management rule engine that is a CNCF sandbox project with hundreds of open source contributors. To learn more about Cloud Custodian, read my blog post Compliance as code and auto-remediation with Cloud Custodian.
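Cloud Custodian can also implement the responsive controls mentioned earlier: giving a policy an event-driven mode deploys it as an AWS Lambda function that fires on matching CloudTrail events. The following is a minimal sketch, not part of this walkthrough; the policy name, the placeholder IAM role ARN, and the choice of the CreateVolume event are illustrative assumptions:

```shell
# Sketch of an event-driven (responsive) Cloud Custodian policy.
# The role ARN is a placeholder; the policy name and event are illustrative.
cat > responsive_ebs_policy.yml <<'EOF'
policies:
  - name: mark-unencrypted-ebs-on-create
    resource: aws.ebs
    mode:
      type: cloudtrail
      role: arn:aws:iam::123456789012:role/custodian-lambda-role
      events:
        - source: ec2.amazonaws.com
          event: CreateVolume
          ids: responseElements.volumeId
    filters:
      - Encrypted: false
EOF
```

Running custodian run against a policy with a cloudtrail mode provisions the Lambda function and the matching event rule, so new unencrypted volumes are caught as they are created rather than at the next scan.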

Now, to show how the solution works, let’s use one policy as an example to implement both a static policy check and a dynamic policy check.

Walkthrough

Assume we get the following requirement:

User story: As a cloud administrator, I would like to have EBS volumes encrypted to ensure data security.

The workflow of our solution will look like the following:

Solution workflow with IaC down to static policy check to cloud infrastructure

  • Implement static policy check with OPA and integrate the check with IaC pipeline.
  • Implement dynamic policy check including auto remediation with Cloud Custodian.

Prerequisites

For this walkthrough, you should have some AWS knowledge, enough to set up a pipeline with AWS CodePipeline.

Static policy check with OPA

1. Create an AWS CodeCommit repo with AWS CloudFormation template and OPA policy file.

Create the following CloudFormation template named ebs-stack.json, which is used to create three Amazon Elastic Block Store (Amazon EBS) volumes. One is configured to be encrypted; the other two are not.

{
  "Resources": {
    "EncryptedVolume01": {
      "Type": "AWS::EC2::Volume",
      "Properties": {
        "Size": "20",
        "Encrypted": "true",
        "AvailabilityZone": "us-east-2a"
      }
    },
    "UnencryptedVolume02": {
      "Type": "AWS::EC2::Volume",
      "Properties": {
        "Size": "10",
        "Encrypted": "false",
        "AvailabilityZone": "us-east-2c"
      }
    },
    "UnencryptedVolume03": {
      "Type": "AWS::EC2::Volume",
      "Properties": {
        "Size": "10",
        "AvailabilityZone": "us-east-2c"
      }
    }
  }
}

Create the OPA policy file named opa_ebs_policies.rego to check whether the Amazon EBS volume is encrypted. Integrate a policy check with the CI/CD pipeline by conducting the policy check right before deploying the resources.

package opa_policies

default allow = false

# deny if EBS volumes are not encrypted or not explicitly defined
ebs_not_encrypted[name] {
    res := input.Resources[name]
    res.Type == "AWS::EC2::Volume"
    object.get(res.Properties, "Encrypted", "false") != "true"
}

allow = true {
    count(ebs_not_encrypted) == 0
}

Create the following file named buildspec.yaml, which is used in the AWS CodeBuild project of CI/CD pipeline.

version: 0.2
phases:
  install:
    commands:
      - echo install OPA
      - curl -L -o opa https://openpolicyagent.org/downloads/latest/opa_linux_amd64
      - chmod 755 ./opa
      - |
        cat >verify.sh <
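The heredoc body of verify.sh was truncated in the listing above. A script along the following lines would serve the purpose; this is a sketch under the assumption that the build phase simply executes ./verify.sh, and the exact script in the original buildspec may differ:

```shell
# Sketch of verify.sh: fail the build when the OPA policy denies the template.
# Assumes ./opa was downloaded in the install phase as shown above.
cat > verify.sh <<'EOF'
#!/bin/bash
allow=$(./opa eval --format pretty -i ebs-stack.json -d opa_ebs_policies.rego 'data.opa_policies.allow')
if [ "$allow" != "true" ]; then
  echo "Policy check failed. Unencrypted EBS volumes:"
  ./opa eval --format pretty -i ebs-stack.json -d opa_ebs_policies.rego 'data.opa_policies.ebs_not_encrypted'
  exit 1
fi
echo "Policy check passed."
EOF
chmod +x verify.sh
```

Exiting nonzero from verify.sh is what makes the CodeBuild stage, and therefore the pipeline, fail when an unencrypted volume is present.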

Commit all of the preceding files into an AWS CodeCommit repo.

repository screenshot showing preventative policies

2. Create a CI/CD pipeline.

Create a pipeline with the previously mentioned AWS CodeCommit repository as the source.

screenshot of pipeline settings showing opa_preventative_policies

screenshot showing add source stage with AWS CodeCommit as source provider and preventative_policies as repo name

Create a CodeBuild project as the pipeline’s build stage using the buildspec.yaml in the repo, configure the environment image as aws/codebuild/amazonlinux2-x86_64-standard:2.0, and keep the other configurations as default.

add build stage screenshot showing AWS CodeBuild as the build provider and region selected

screenshot with project configuration and project name is preventative_policies_build

screenshot of adding build stage showing successfully created preventative_policies_build in CodeBuild

screenshot showing button to skip deploy stage

Skip the deploy stage and create the pipeline.

3. Run the pipeline and review the policy check results.

After we save the pipeline, it will start its first run. The pipeline will fail because the preceding CloudFormation template creates three Amazon EBS volumes, two of which are not configured to be encrypted. This violates our policy that all Amazon EBS volumes must be encrypted.

screenshot shows build failed for 2 unencrypted volumes

Select Details, then view the execution details to review logs of the AWS CodeBuild project execution.

screenshot showing failed build status and what initiated it

The logs show that allow is false and list the logical IDs of the unencrypted EBS volumes.

Let’s remove the two unencrypted Amazon EBS volumes in the CloudFormation template, and commit the ebs-stack.json file:

{
  "Resources": {
    "EncryptedVolume01": {
      "Type": "AWS::EC2::Volume",
      "Properties": {
        "Size": "20",
        "Encrypted": "true",
        "AvailabilityZone": "us-east-2a"
      }
    }
  }
}

The code changes will automatically launch the pipeline, and the pipeline initiation shows as successful.


Let’s check the AWS CodeBuild project logs.

project logs show build state succeeded and entering phase POST_BUILD

The pipeline works; the CloudFormation template passes the policy check.

Dynamic policy check with Cloud Custodian

1. Create an Amazon EC2 environment in AWS Cloud9.

We use the AWS Cloud9 environment for the rest of the post. Follow the user guide instructions to create an AWS Cloud9 EC2 environment as the workspace.

Note: Cloud Custodian policy use may alter AWS resources. For the purpose of this blog post and tutorial, do not try this in production. Use a test or sandbox account.

To begin, install Cloud Custodian:

$ python3 -m venv custodian
$ source custodian/bin/activate
(custodian) $ pip install c7n #Install AWS package

2. Create the Cloud Custodian policy.

Run the following command to create the Cloud Custodian policy:

cat > custodian_ebs_encryption.yml <
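The heredoc content of this command was truncated above. A policy consistent with the rest of the walkthrough might look like the following sketch; the policy name matches the report command used later, while the encrypt-instance-volumes action and the KMS key alias are assumptions for illustration:

```shell
# Reconstructed sketch of the Cloud Custodian policy (heredoc was truncated).
# The action and the KMS key alias are illustrative assumptions.
cat > custodian_ebs_encryption.yml <<'EOF'
policies:
  - name: encrypt-unencrypted-ebs-volumes
    resource: aws.ebs
    filters:
      - Encrypted: false
    actions:
      - type: encrypt-instance-volumes
        key: alias/ebs-key
EOF
```

With --dryrun, only the filter runs and matching volumes are recorded under the output directory; any actions execute only on a real run.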

3. Validate the Cloud Custodian policy.

Validate Cloud Custodian policies:

custodian validate custodian_ebs_encryption.yml

screenshot of code running and configuration valid

4. Dry run the Cloud Custodian policy.

For testing, we will dry run the policy:

custodian run --dryrun custodian_ebs_encryption.yml -s out

policy successfully runs

Get Custodian rule report:

custodian report custodian_ebs_encryption.yml -s out --format grid -p encrypt-unencrypted-ebs-volumes

From the following report, we can see which EBS volumes are not encrypted.

(custodian) zxuejiao:~/environment/custodian/demo $ custodian report custodian_ebs_encryption.yml -s out --format grid -p encrypt-unencrypted-ebs-volumes
+-------------------------+-----------------------------+--------+--------------+------------+
| VolumeId                | Attachments[0].InstanceId   |   Size | VolumeType   | KmsKeyId   |
+=========================+=============================+========+==============+============+
| vol-xxxxxxxxxxxxxxxxxxx | i-xxxxxxxxxxxxxxxxxxx       |     20 | gp2          |            |
+-------------------------+-----------------------------+--------+--------------+------------+
| vol-0123456789012345    | i-0123456789012345          |      8 | gp2          |            |
+-------------------------+-----------------------------+--------+--------------+------------+

As expected, the Cloud Custodian policy performs real-time scanning of the encryption state of EBS volumes.

Cleaning up

To avoid incurring future charges, manually delete the resources:

  • Amazon EBS volumes
  • AWS CodeBuild project
  • AWS CodePipeline pipeline

Conclusion

We introduced a solution that combines directive, preventive, detective, and responsive controls so that organizations can simplify end-to-end IT lifecycle management. The same solution helps builders be more agile while getting the benefits of cloud platforms. Aligning the software development life cycle (SDLC) with business goals requires an updated governance model that defines the policies to ensure the SDLC process operates within guardrails.

With the solution described in this article, organizations can use AWS to innovate quickly and achieve their governance and compliance goals. If you need help with the design and implementation of such a solution, feel free to reach out to me or your local AWS Professional Services team.