AWS Developer Tools Blog

DevOps Meets Security: Security Testing Your AWS Application: Part III – Continuous Testing

This is part III of a blog post series in which we take a deep dive into automated security testing for AWS applications. In part I, we discussed how AWS Java developers can create security unit tests to verify the correctness of their AWS applications by testing individual units of code in isolation. In part II, we went one step further and showed how developers can create integration tests that, unlike unit tests, interact with real software components and AWS resources. In this last post in the series, we'll walk you through how to incorporate the provided security tests into a CI/CD pipeline (created in AWS CodePipeline) to automate security verification when new changes are pushed to the code repository.

Security Tests

In parts I and II of this series, we created a suite of unit and integration tests for a simple S3 wrapper Java class. The unit tests focused on testing the class in isolation by using mock objects instead of real Amazon S3 objects and resources. The integration tests complement the unit tests and provide an additional layer of verification that uses real objects and resources such as S3 buckets, objects, and versions. In this last post in the series (part III), we'll show how these unit and integration security tests can be incorporated into a CI/CD pipeline to automatically verify the security behavior of code pushed through the pipeline.
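
To recap the style of tests from the earlier posts, the sketch below shows what one of the security unit tests might look like using JUnit and Mockito, with the Amazon S3 client mocked out. The S3ArtifactManager constructor, the upload method, and the versioning expectation shown here are illustrative assumptions, not the exact code from parts I and II:

```java
import static org.mockito.Mockito.*;

import java.io.File;

import org.junit.Test;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.SetBucketVersioningConfigurationRequest;

public class S3ArtifactManagerUnitTest {

    @Test
    public void uploadShouldRequestVersioningOnTheTargetBucket() throws Exception {
        // Mock the S3 client so the test runs in isolation, touching no real AWS resources
        AmazonS3 mockS3 = mock(AmazonS3.class);
        S3ArtifactManager artifactManager = new S3ArtifactManager(mockS3);

        artifactManager.upload("my-bucket", "my-key", new File("artifact.jar"));

        // Security expectation: the wrapper must enable versioning on the target bucket
        verify(mockS3).setBucketVersioningConfiguration(
                any(SetBucketVersioningConfigurationRequest.class));
    }
}
```

The integration tests from part II exercise the same kinds of scenarios against a real S3 bucket instead of a mock, which is why the pipeline below runs them in a separate action.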

Incorporating Security Tests into a CI/CD Pipeline

Setting Up

Git and CodeCommit 

Follow the steps in the Integrating AWS CodeCommit with Jenkins blog post to install Git and create an AWS CodeCommit repo. Download the source code and push it to the AWS CodeCommit repo you created.

Jenkins and plugins on EC2

Follow the steps in the Building Continuous Deployment on AWS with AWS CodePipeline, Jenkins and AWS Elastic Beanstalk blog post to install and configure Jenkins on Amazon EC2. Make sure you install the AWS CodePipeline Jenkins plugin to enable integration between AWS CodePipeline and Jenkins. In addition, create three Jenkins Maven jobs by following the steps described in the "Create a Jenkins Build Job" section of that blog post, but use the job parameter values shown in the following table instead.

| Jenkins Project Name | SecTestsOnAWS (Maven project) | SecUnitTestsOnAWS (Maven project) | SecIntegTestsOnAWS (Maven project) |
| --- | --- | --- | --- |
| AWS Region | choose an AWS region | choose an AWS region | choose an AWS region |
| Source Code Management: Category | Build | Test | Test |
| Source Code Management: Provider | SecTestsBuildProvider | SecUnitTestsProvider | SecIntegTestsProvider |
| Build: Goals and options | package -DskipUnitTests=true -DskipIntegrationTests=true | verify -DskipIntegrationTests=true | verify -DskipUnitTests=true |
| Post-build Actions: AWS CodePipeline Publisher: Output Locations: Location | target/ | target/ | target/ |

Make sure you pick an AWS region where AWS CodePipeline is available.

Here’s an example of the configuration options in the Jenkins UI for project SecTestsOnAWS:

(Figure: Setting up Jenkins to build S3ArtifactManager using Maven)

AWS CodePipeline

In the AWS CodePipeline console, create a pipeline with three stages, as shown here.

(Figure: AWS CodePipeline CI/CD pipeline with security unit and integration test actions)

Stage #1: Source

  • Choose AWS CodeCommit as your source provider and enter your repo and branch names where indicated.

Stage #2: Build

Create a build action with the following parameters:

  • Action category: Build
  • Action name: Build
  • Build provider: SecTestsBuildProvider (must match the corresponding Jenkins entry in project SecTestsOnAWS)
  • Project name: SecTestsOnAWS
  • Input Artifact #1: MyApp
  • Output Artifact #1: MyAppBuild

Stage #3: Security-Tests

Create two pipeline actions as follows:

Action #1: Unit-Tests

  • Action category: Test
  • Action name: Unit-Tests
  • Build provider: SecUnitTestsProvider (must match the corresponding Jenkins entry in project SecUnitTestsOnAWS)
  • Project name: SecUnitTestsOnAWS
  • Input Artifact #1: MyApp
  • Output Artifact #1: MyUnitTestedBuild

Action #2: Integration-Tests

  • Action category: Test
  • Action name: Integration-Tests
  • Build provider: SecIntegTestsProvider (must match the corresponding Jenkins entry in project SecIntegTestsOnAWS)
  • Project name: SecIntegTestsOnAWS
  • Input Artifact #1: MyApp
  • Output Artifact #1: MyIntegTestedBuild

We are not adding a pipeline stage/action for application deployment because we have built a software component (S3ArtifactManager), not a full-fledged application. However, we encourage the reader to create a simple web or standalone application that uses the S3ArtifactManager class and then add a deployment action to the pipeline targeting an AWS Elastic Beanstalk environment, as described in this blog post.
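
If you decide to try that, a minimal standalone application that uses the class might look like the following sketch. It assumes an S3ArtifactManager constructor that takes an AmazonS3 client and an upload method that returns the version ID of the uploaded object; adjust the names and signatures to match your implementation from parts I and II:

```java
import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

// Hypothetical standalone app: pass the bucket name, object key, and file path as arguments.
public class UploadApp {

    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        S3ArtifactManager artifactManager = new S3ArtifactManager(s3);

        // upload is assumed to return the S3 version ID of the newly stored object
        String versionId = artifactManager.upload(args[0], args[1], new File(args[2]));
        System.out.println("Uploaded object version: " + versionId);
    }
}
```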

Triggering the Pipeline

After the pipeline has been created, choose the Release Change button and watch the pipeline build and test the S3ArtifactManager component.

If you are looking for a more hands-on experience in writing security tests, we suggest that you extend the S3ArtifactManager API to allow clients to retrieve versioned objects from an S3 bucket (for example, getObject(String bucketName, String key, String versionId)) and write security tests for the new API. 
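
As a starting point, here is a minimal sketch of what the extended API might look like, assuming the AWS SDK for Java 1.x used throughout this series; the constructor and field shown here are illustrative rather than the exact implementation from the earlier posts:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;

public class S3ArtifactManager {

    private final AmazonS3 s3;

    public S3ArtifactManager(AmazonS3 s3) {
        this.s3 = s3;
    }

    /**
     * Returns the specified version of an object stored in the given bucket.
     */
    public S3Object getObject(String bucketName, String key, String versionId) {
        // GetObjectRequest accepts an optional version ID for versioned buckets
        return s3.getObject(new GetObjectRequest(bucketName, key, versionId));
    }
}
```

Unit tests can mock the AmazonS3 client and assert that the expected version ID is requested, while integration tests can verify the behavior against a real versioned bucket, mirroring the structure used in parts I and II.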

Final Remarks

In this last post of the series, we showed how to automate the building and testing of our S3ArtifactManager component by creating a pipeline using AWS CodePipeline, AWS CodeCommit, and Jenkins. As a result, any code changes pushed to the repo are now automatically verified by the pipeline and rejected if security tests fail.

We hope you found this series helpful. Feel free to leave your feedback in the comments.

Happy security testing!