DevSecOps: AWS (Amazon Web Services) CodePipeline

Posted: 02/07/2023

Understanding CodePipeline

Amazon Web Services (AWS) GovCloud supports AWS CodePipeline, which enables continuous integration and continuous delivery (CI/CD) for code deployed to development, QA, stage, and production environments. These pipelines integrate security checks, static code analysis, and unit test artifacts to ensure that deployed builds meet the desired security and quality standards. 

These code pipelines can be used to deploy APIs, front-end code, Robotic Process Automation (RPA) bots, AI/ML models, and anything else you may want to deploy on either server-based or serverless technology. 

This article discusses a multi-account deployment strategy. 

Multi-Account Deployments 

The AWS cloud allows an organization to have multiple accounts that are managed and built according to its needs. Organizations can register separate accounts to isolate teams and even environments. We will discuss deployment across multiple environments in depth. 

Let us assume a NodeJS microservice API must be deployed on AWS Fargate, which supports a serverless architecture. Following a proper development lifecycle, developers, QA, business users, and the actual users of the API verify artifacts in their respective environments. Code begins in the Dev environment. Upon certification, the build is promoted to a QA environment and then, after the necessary approvals, to the staging and production environments.  

AWS can host multiple environments in the same account across multiple regions, in the same region, or even in multiple accounts. Choose a deployment architecture based upon the organization's needs. Our focus will be on multi-account deployment for better separation. 

Benefits

Using multiple accounts has the following benefits:

  • Cost
    • Track and check the cost of an individual account while using multiple services from AWS in that account. 

    • Utilize the AWS service quotas and request rate limits available in every environment, since one account holds one environment. For example, AWS charges a storage fee for Amazon Keyspaces of $0.30 per GB per month.  

    • Key Space Feature                             | Provides Benefit To
      Targeted environments (in different accounts) | Quotas per environment
      Account per environment                       | Lambda functions, SNS notifications, S3, Secrets Manager, RDS, CloudWatch, and many more

    • Helps distribute and track budgets per account, and cost decisions can be assessed without having to manage the impact on production or any other environment.
  • Security
    • With separate accounts, ownership and privileges for a set of users can be set up so that they cannot interfere with other environments. For example, developers can have ownership of their environment for experimentation while other environments are not affected. Tweaking of other environments may occasionally be needed, but that is handled by the DevOps team. 

    • Identity and Access Management (IAM) roles can be created according to user requirements with proper privileges to protect sensitive data. 

    • Tracking and monitoring become easier. 

  • Process separation
    • An organization can have multiple teams and different processes. Each team can apply the applicable processes for governance, security, operations, and compliance purposes.  

    • Separation limits the scope of impact of adverse events such as misconfigurations, malicious actions, or application-related issues.  

  • Code pipeline setup
    • Create a Tool account and keep all code in a CodeCommit repository there, along with other common services needed by the different accounts. Let us assume the NodeJS code repository is available in the Tool account.  

    • Establish four different accounts, one each for Dev, QA, Stage, and Production environments. 

    • Set up the code pipeline from the Tool account to the Dev account first; the same steps are then repeated for the other accounts. 

    • In the Dev account, set up an AWS KMS customer managed encryption key, which is needed to encrypt the pipeline artifacts. 

    • In the Dev account, create an IAM user and give it administrative permission for the KMS key created above. 

    • In the KMS key policy, grant usage permission to the Tool account so that a cross-account trust relationship is set up (see the key policy sketch after this list). 

    • In the Dev account, create an S3 bucket to hold the NodeJS code artifacts, which are downloaded from the Tool account as soon as the code is refreshed. Set up a suitable bucket policy to allow cross-account access (a bucket policy sketch follows this list). 

    • Create a CodePipeline service role in the Dev account and allow that role access to the CodeCommit repository that lives in the Tool account. 

    • In the Tool account, create a policy allowing access to the S3 bucket created in the Dev account. 

    • In the Tool account, create an IAM role that the Dev account can assume by supplying the Dev account ID in its trust policy, in addition to granting CodeCommit access (the role sketch after this list shows both pieces). 

    • Create the pipeline in the Dev account using the AWS CLI by creating a JSON definition file. This must be done through the CLI because a CodeCommit repository in another account cannot be selected from the console (see the pipeline definition sketch after this list).  

    • Create a CodeBuild project that the pipeline references to build and test the code (a sample buildspec follows this list). 

    • Set up EventBridge rules in both the Tool and Dev accounts if you want to kick off the pipeline as soon as the code is refreshed in the repository (see the EventBridge sketch after this list). 

    • Set up Fargate containers for deployment from ECR as a deploy stage in the pipeline (shown as the Deploy stage in the pipeline definition sketch). 

    • Include security scans and unit test scripts in the pipeline (both appear in the sample buildspec after this list). 

    • The pipeline is now set up. Repeat the same steps for the rest of the environments. 
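To make the KMS steps concrete, here is a minimal AWS CLI sketch. The account IDs (111111111111 for the Tool account, 222222222222 for the Dev account), the region, the key description, and the file names are hypothetical placeholders, and GovCloud ARNs use the aws-us-gov partition; adjust everything to your own accounts.

# Run with Dev-account credentials: create a customer managed key for pipeline artifacts
aws kms create-key --description "Cross-account CodePipeline artifact key" --region us-gov-west-1

# key-policy.json: the Dev account administers the key; the Tool account may use it
cat > key-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DevAccountAdmin",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws-us-gov:iam::222222222222:root" },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Sid": "ToolAccountUsage",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws-us-gov:iam::111111111111:root" },
      "Action": ["kms:Encrypt", "kms:Decrypt", "kms:ReEncrypt*", "kms:GenerateDataKey*", "kms:DescribeKey"],
      "Resource": "*"
    }
  ]
}
EOF

# Attach the policy to the key (replace <key-id> with the KeyId returned by create-key)
aws kms put-key-policy --key-id <key-id> --policy-name default --policy file://key-policy.json --region us-gov-west-1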
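The artifact bucket and its cross-account bucket policy can be sketched the same way; the bucket name dev-nodejs-pipeline-artifacts is an assumed placeholder, and the policy simply lets the Tool account read and write pipeline artifacts.

# Run with Dev-account credentials: create the artifact bucket
aws s3api create-bucket --bucket dev-nodejs-pipeline-artifacts --region us-gov-west-1 \
  --create-bucket-configuration LocationConstraint=us-gov-west-1

# bucket-policy.json: grant the Tool account access to the artifacts
cat > bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ToolAccountArtifactAccess",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws-us-gov:iam::111111111111:root" },
      "Action": ["s3:GetObject", "s3:PutObject", "s3:GetBucketLocation", "s3:ListBucket"],
      "Resource": [
        "arn:aws-us-gov:s3:::dev-nodejs-pipeline-artifacts",
        "arn:aws-us-gov:s3:::dev-nodejs-pipeline-artifacts/*"
      ]
    }
  ]
}
EOF

aws s3api put-bucket-policy --bucket dev-nodejs-pipeline-artifacts --policy file://bucket-policy.json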
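The cross-account role in the Tool account might look like the following sketch. The role name CrossAccountCodeCommitRole and repository name nodejs-api-repo are assumptions; the role trusts the Dev account and grants CodeCommit pull access plus access to the Dev artifact bucket and KMS key.

# Run with Tool-account credentials: role that the Dev-account pipeline source action assumes
cat > trust-dev-account.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws-us-gov:iam::222222222222:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

aws iam create-role --role-name CrossAccountCodeCommitRole \
  --assume-role-policy-document file://trust-dev-account.json

# Inline policy: CodeCommit pull access plus the Dev artifact bucket and KMS key
cat > codecommit-cross-account.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["codecommit:GetBranch", "codecommit:GetCommit",
                 "codecommit:UploadArchive", "codecommit:GetUploadArchiveStatus"],
      "Resource": "arn:aws-us-gov:codecommit:us-gov-west-1:111111111111:nodejs-api-repo"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws-us-gov:s3:::dev-nodejs-pipeline-artifacts/*"
    },
    {
      "Effect": "Allow",
      "Action": ["kms:Encrypt", "kms:Decrypt", "kms:GenerateDataKey*"],
      "Resource": "arn:aws-us-gov:kms:us-gov-west-1:222222222222:key/<key-id>"
    }
  ]
}
EOF

aws iam put-role-policy --role-name CrossAccountCodeCommitRole \
  --policy-name codecommit-cross-account --policy-document file://codecommit-cross-account.json

# The CodePipeline service role in the Dev account additionally needs sts:AssumeRole on this role's ARN.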
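A trimmed pipeline definition, again with placeholder names (pipeline, roles, cluster, and service are assumptions), shows how the cross-account source action points its roleArn at the Tool-account role and how the CodeBuild build/test and ECS (Fargate) deploy stages are referenced. It is fed to the CLI because the console cannot select a repository in another account.

# pipeline.json: cross-account CodeCommit source, CodeBuild build/test, ECS (Fargate) deploy
cat > pipeline.json <<'EOF'
{
  "pipeline": {
    "name": "nodejs-api-dev-pipeline",
    "roleArn": "arn:aws-us-gov:iam::222222222222:role/CodePipelineServiceRole",
    "artifactStore": {
      "type": "S3",
      "location": "dev-nodejs-pipeline-artifacts",
      "encryptionKey": {
        "id": "arn:aws-us-gov:kms:us-gov-west-1:222222222222:key/<key-id>",
        "type": "KMS"
      }
    },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "CodeCommitSource",
          "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "nodejs-api-repo", "BranchName": "main", "PollForSourceChanges": "false" },
          "outputArtifacts": [{ "name": "SourceOutput" }],
          "roleArn": "arn:aws-us-gov:iam::111111111111:role/CrossAccountCodeCommitRole"
        }]
      },
      {
        "name": "Build",
        "actions": [{
          "name": "BuildAndTest",
          "actionTypeId": { "category": "Build", "owner": "AWS", "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "nodejs-api-build" },
          "inputArtifacts": [{ "name": "SourceOutput" }],
          "outputArtifacts": [{ "name": "BuildOutput" }]
        }]
      },
      {
        "name": "Deploy",
        "actions": [{
          "name": "DeployToFargate",
          "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "ECS", "version": "1" },
          "configuration": { "ClusterName": "nodejs-api-cluster", "ServiceName": "nodejs-api-service", "FileName": "imagedefinitions.json" },
          "inputArtifacts": [{ "name": "BuildOutput" }]
        }]
      }
    ]
  }
}
EOF

# Create the pipeline in the Dev account
aws codepipeline create-pipeline --cli-input-json file://pipeline.json --region us-gov-west-1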
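The EventBridge wiring can be sketched as two rules: the Tool-account rule forwards CodeCommit repository change events to the Dev account's default event bus, and the Dev-account rule starts the pipeline. The rule names and target role ARNs are placeholders, and the Dev default event bus must also permit events from the Tool account.

# Tool account: forward repository state-change events to the Dev account's default event bus
aws events put-rule --name nodejs-repo-change --region us-gov-west-1 --event-pattern '{
  "source": ["aws.codecommit"],
  "detail-type": ["CodeCommit Repository State Change"],
  "resources": ["arn:aws-us-gov:codecommit:us-gov-west-1:111111111111:nodejs-api-repo"],
  "detail": { "referenceType": ["branch"], "referenceName": ["main"] }
}'

aws events put-targets --rule nodejs-repo-change --region us-gov-west-1 --targets \
  'Id=dev-event-bus,Arn=arn:aws-us-gov:events:us-gov-west-1:222222222222:event-bus/default,RoleArn=arn:aws-us-gov:iam::111111111111:role/EventBusForwardRole'

# Dev account: match the forwarded event and start the pipeline
# (allow the Tool account on the Dev default event bus first, e.g. with aws events put-permission)
aws events put-rule --name nodejs-repo-change --region us-gov-west-1 --event-pattern \
  '{ "source": ["aws.codecommit"], "detail-type": ["CodeCommit Repository State Change"] }'

aws events put-targets --rule nodejs-repo-change --region us-gov-west-1 --targets \
  'Id=start-pipeline,Arn=arn:aws-us-gov:codepipeline:us-gov-west-1:222222222222:nodejs-api-dev-pipeline,RoleArn=arn:aws-us-gov:iam::222222222222:role/EventBridgeStartPipelineRole'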
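Finally, a sample buildspec for the CodeBuild project illustrates where the unit tests and a dependency security scan fit, and how the image definition file consumed by the ECS deploy stage is produced. The ECR registry URI, image name, and container name are placeholders, and your project may use different scanning tools.

# buildspec.yml for the nodejs-api-build CodeBuild project
cat > buildspec.yml <<'EOF'
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 18
    commands:
      - npm ci
  pre_build:
    commands:
      - npm test                          # unit tests
      - npm audit --audit-level=high      # dependency security scan
      - aws ecr get-login-password --region us-gov-west-1 | docker login --username AWS --password-stdin 222222222222.dkr.ecr.us-gov-west-1.amazonaws.com
  build:
    commands:
      - docker build -t nodejs-api:latest .
      - docker tag nodejs-api:latest 222222222222.dkr.ecr.us-gov-west-1.amazonaws.com/nodejs-api:latest
      - docker push 222222222222.dkr.ecr.us-gov-west-1.amazonaws.com/nodejs-api:latest
  post_build:
    commands:
      - printf '[{"name":"nodejs-api-container","imageUri":"%s"}]' 222222222222.dkr.ecr.us-gov-west-1.amazonaws.com/nodejs-api:latest > imagedefinitions.json

artifacts:
  files:
    - imagedefinitions.json
EOF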

APV (AP Ventures) Expertise with DevSecOps

APV regularly partners with federal agencies, taking a DevSecOps-first approach on every project. APV has helped agencies such as Health & Human Services (HHS), federal civilian (Fed CIV) agencies, and the Department of Defense, just to name a few.

 

Reach out to us at digital@apvit.com for more details on how we can help you!