AWS Developer Tools Blog

DevOps Meets Security: Security Testing Your AWS Application: Part II – Integration Testing

This is part II of a blog post series in which we do a deep dive on automated security testing for AWS applications. In part I, we discussed how AWS Java developers can create security unit tests to verify the correctness of their AWS applications by testing individual units of code in isolation. In this post, we go one step further and show how developers can create integration tests that, unlike unit tests, interact with real software components and AWS resources. In part III of this series, we’ll walk through how to incorporate the provided security tests into a CI/CD pipeline (created in AWS CodePipeline) to enforce security verification when new code is pushed into the code repository. Even though we focus on security, the tests provided can be easily generalized to other domains.

S3 Artifact Manager

In part I of this post, we introduced a simple S3 wrapper component built to illustrate the security tests discussed in this post series. The wrapper, represented by a Java class named S3ArtifactManager (full source code can be accessed here), uses the AWS SDK for Java to provide a more secure way to store objects in Amazon S3.

Here we illustrate the method upload(), implemented as part of class S3ArtifactManager, which can be used to securely upload objects to an S3 bucket. For details about this method, see part I of this series. In the next section, we’ll write integration tests for method upload().

public String upload(String s3Bucket, String s3Key, File file) 
   throws AmazonServiceException, AmazonClientException {
   if (!s3.doesBucketExist(s3Bucket)) {
      s3.createBucket(s3Bucket);
   }

   // enable bucket versioning
   SetBucketVersioningConfigurationRequest configRequest = 
      new SetBucketVersioningConfigurationRequest(s3Bucket, 
         new BucketVersioningConfiguration(BucketVersioningConfiguration.ENABLED));
   s3.setBucketVersioningConfiguration(configRequest);

   // enable server-side encryption (SSE-S3)
   PutObjectRequest request = new PutObjectRequest(s3Bucket, s3Key, file);
   ObjectMetadata objectMetadata = new ObjectMetadata();
   objectMetadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
   request.setMetadata(objectMetadata);

   // upload object to S3
   PutObjectResult putObjectResult = s3.putObject(request);

   return putObjectResult.getVersionId();
}
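To give a sense of how the wrapper is used, here is a minimal usage sketch. It assumes S3ArtifactManager is constructed with an AmazonS3 client; see the linked source code for the actual constructor and configuration options.

import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class UploadExample {
   public static void main(String[] args) {
      // the client picks up local credentials and region configuration
      AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

      // assumed constructor; see the linked source for the actual signature
      S3ArtifactManager artifactManager = new S3ArtifactManager(s3);

      // upload() versions the bucket, encrypts the object with SSE-S3,
      // and returns the version ID assigned to the uploaded object
      String versionId = artifactManager.upload(
         "my-artifact-bucket", "builds/app.zip", new File("app.zip"));
      System.out.println("Uploaded object version: " + versionId);
   }
}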

Security Integration Tests

Security integration tests complement security unit tests by making use of real objects and resources instead of mocking behavior. The integration tests use a real S3 client object and perform security verifications by inspecting real S3 resources, such as S3 buckets, objects, and versions. For each unit test discussed in part I of this series, we have created a corresponding integration test using JUnit, a popular Java test framework.

Verifying bucket versioning enablement on S3 buckets

The first security integration test verifies that upon calling method upload(), versioning is enabled for buckets. We use a real S3 client object configured with local IAM credentials and AWS region settings. In the code that follows, prior to uploading an object, we create an unversioned S3 bucket by calling the s3Client object directly. Our assumption is that the bucket will be versioned later by the call to method upload(). The first assertion then verifies that the versionId returned by upload() matches the versionId of the object retrieved from the bucket (that is, the state of the bucket has changed as expected after we called method upload()).

The second assertion checks the bucket’s current versioning configuration against the expected value (BucketVersioningConfiguration.ENABLED). If this security assertion fails, it means the bucket is not versioned. The test should fail and, as in the case of unit tests, block the CI/CD pipeline until developers can figure out the problem.

After each test is performed, the bucket and all versions of all objects are deleted (see the @After annotation in the source code).

Important: Each security integration test is fully self-contained and will leave S3 in the state it was in before the test was run. To make this possible, we append UUIDs to bucket and object names. This also allows us to run integration tests in parallel because each test operates in a different bucket.
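The test snippets below reference s3Client, s3ArtifactManager, s3Bucket, and s3Key fields that are initialized before each test. As a rough sketch of what that fixture might look like (the field names follow the snippets; everything else is an assumption, and the linked source code has the actual implementation):

import java.util.UUID;

import org.junit.After;
import org.junit.Before;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3VersionSummary;
import com.amazonaws.services.s3.model.VersionListing;

public class S3ArtifactManagerIntegrationTest {

   private AmazonS3 s3Client;
   private S3ArtifactManager s3ArtifactManager;
   private String s3Bucket;
   private String s3Key;

   @Before
   public void setUp() {
      // a real S3 client built from local credentials and region configuration
      s3Client = AmazonS3ClientBuilder.defaultClient();
      s3ArtifactManager = new S3ArtifactManager(s3Client);

      // UUID suffixes give each test its own bucket and key, so tests are
      // self-contained and can run in parallel
      s3Bucket = "artifact-manager-it-" + UUID.randomUUID();
      s3Key = "artifact-" + UUID.randomUUID();
   }

   @After
   public void tearDown() {
      // delete every version of every object, then the bucket itself,
      // leaving S3 in the state it was in before the test ran
      if (s3Client.doesBucketExist(s3Bucket)) {
         // a single listing is enough for these tests; production code
         // would paginate with listNextBatchOfVersions()
         VersionListing versions = s3Client.listVersions(s3Bucket, null);
         for (S3VersionSummary version : versions.getVersionSummaries()) {
            s3Client.deleteVersion(s3Bucket, version.getKey(), version.getVersionId());
         }
         s3Client.deleteBucket(s3Bucket);
      }
   }

   // @Test methods shown in this post go here
}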

There is a similar test that checks bucket versioning enablement on newly created buckets; a sketch appears after the test below, and the source code has the full details.

@Test
public void testUploadWillEnableVersioningOnExistingS3Bucket() 
   throws Exception {   
   
   // bucket must exist prior to uploading object for this test
   s3Client.createBucket(s3Bucket);
   
   // call object under test
   String versionId = s3ArtifactManager.upload(s3Bucket, s3Key, 
      createSampleFile(s3Key));
   
   // assert object's version exists in the bucket
   S3Object s3Object = s3Client.getObject(
      new GetObjectRequest(s3Bucket, s3Key));
   assertEquals("Uploaded S3 object's versionId does not 
      match expected value", versionId, 
         s3Object.getObjectMetadata().getVersionId());
 
   // assert bucket versioning is enabled
   BucketVersioningConfiguration bucketConfig = s3Client
      .getBucketVersioningConfiguration(s3Bucket);
   assertEquals(BucketVersioningConfiguration.ENABLED, 
      bucketConfig.getStatus());
}
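The companion test for newly created buckets, mentioned above, might look roughly like this; here the bucket is not created beforehand, so upload() is expected to create and version it (see the linked source code for the actual test):

@Test
public void testUploadWillEnableVersioningOnNewlyCreatedS3Bucket() 
   throws Exception {

   // bucket does not exist yet; upload() is expected to create it
   s3ArtifactManager.upload(s3Bucket, s3Key, createSampleFile(s3Key));

   // assert the bucket was created and versioning is enabled
   assertTrue("Bucket was not created", s3Client.doesBucketExist(s3Bucket));
   BucketVersioningConfiguration bucketConfig = s3Client
      .getBucketVersioningConfiguration(s3Bucket);
   assertEquals(BucketVersioningConfiguration.ENABLED, 
      bucketConfig.getStatus());
}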

Verifying server-side encryption of uploaded S3 objects

The last security integration test verifies that an object uploaded to S3 is encrypted using SSE-S3. The test calls method upload() to upload a new object, then retrieves the object’s metadata and checks that AES256 is set as the object’s encryption algorithm, a simple but effective verification. If this assertion fails, it means SSE-S3 has not been used to encrypt the S3 object, which invalidates our initial assumptions.

@Test
public void testUploadAddsSSE_S3EncryptedObjectToBucket() 
   throws Exception {
   
   // call object under test
   s3ArtifactManager.upload(s3Bucket, s3Key, createSampleFile(s3Key));
   
   // verify uploaded object is encrypted (SSE-S3)
   ObjectMetadata s3ObjectMetadata = s3Client.getObjectMetadata(
      new GetObjectMetadataRequest(s3Bucket, s3Key));
   assertEquals("Object has not been encrypted using SSE-S3 
      (AES256 encryption algorithm)", AES256, 
         s3ObjectMetadata.getSSEAlgorithm());      
}
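Both tests use a createSampleFile() helper to generate the file being uploaded. A minimal sketch of what such a helper might look like (hypothetical; the linked source code contains the actual implementation):

// requires java.io.File, java.io.FileWriter, java.io.IOException, java.io.Writer
private File createSampleFile(String fileName) throws IOException {
   // create a small throwaway file to upload in the tests
   File file = File.createTempFile(fileName, ".txt");
   file.deleteOnExit();
   try (Writer writer = new FileWriter(file)) {
      writer.write("sample content for the security integration tests\n");
   }
   return file;
}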

Running the Security Tests Locally

Setting Up

Follow these steps on your local workstation to run the integration tests locally:

  • Install the AWS Command Line Interface (AWS CLI)
  • Run the AWS CLI tool to configure your environment: aws configure

    • Make sure your AWS credentials under ~/.aws/credentials allow the user running the tests to create S3 buckets and upload S3 objects in your account
    • Make sure you specify a valid AWS region (for example, us-east-1) in the ~/.aws/config file (both files are sketched below)
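For reference, the two files typically look like this (placeholder values shown):

# ~/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>

# ~/.aws/config
[default]
region = us-east-1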


Running the Integration Tests

You can use Maven to run the provided security tests locally.

  • Navigate to the root directory where you installed the source code. (This is where the pom.xml file resides.)
  • Type mvn verify -DskipUnitTests=true to run the security integration tests.

Expected output:

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Concurrency config is parallel='none', perCoreThreadCount=true, threadCount=2, useUnlimitedThreads=false

Running com.amazonaws.samples.s3.artifactmanager.integrationtests.S3ArtifactManagerIntegrationTest

Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.802 sec

You’ll see in the output that all three security integration tests passed. Integration tests take longer to run (7.8 seconds) than the unit tests (0.15 seconds) covered in part I. We recommend that these two types of tests be run separately, as we enforced here, and have different triggers, as we’ll discuss in part III of this series.

Final Remarks

In this blog post, we discussed how AWS Java developers can create integration tests that verify the integrated behavior of software components in their AWS applications. We used real S3 resources (buckets, objects, and versions) and triggered test execution using Maven.

In the third part of this series, we’ll walk through how the provided security tests can be incorporated into a CI/CD pipeline created in AWS CodePipeline to enforce security verification whenever new code changes are pushed into the code repository.