CI/CD with Bitbucket, AWS SNS and S3 Bucket - The Legacy Pipeline


Writing code is interesting and fun until the day your application is production-ready and you have to deliver your little creation to millions of users around the world. Suddenly you face bottleneck after bottleneck setting up your production environment, and perhaps you have even forgotten how you fixed that dependency issue while setting up your development environment; that is another round of trouble and more productive time spent on deployment. Maybe you do deploy successfully on the first try, but then your users start growing and more ideas start flowing, so you need to release features as fast as the Falcon X. If you would really rather focus all your energy on development instead of spending time on ops work, this article is for you.

In this article, I will introduce a concept that will solve these problems and make your solution delivery seamless.

To better understand the concepts and get hands-on experience, you will need an AWS account and a Bitbucket account. If you don't have them, you can sign up for AWS here and for Bitbucket here.

First, you had to set up your production environment with the same dependencies that were running on your localhost. Luckily, you are running a PHP application :). On your localhost you were probably using the LAMP/WAMP/XAMPP stack, so you never felt the stress that package absorbed for you until you had to set up the web server yourself; by the time you have spent most of your productive hours setting up PHP with Apache or NGINX, you are already rethinking your career. Yes, every dev has been through this stage, but the question is: must we always go through it?… Which is why some people (I don't know who they are; you can look that up) came up with a process that eases the pain of code delivery and deployment, and ever since this process was created, different people have come up with different flows and tools to make these processes easier for their devs.


The example service we will be creating is a simple email-sender service written in Node.js. You can get the source code here; pull it and let's begin our journey to wonderland. To make this work, you need to add these files to the project:

  • bitbucket-pipelines.yml -> This file is used by Bitbucket to manage our continuous integration

  • s3_upload.py -> You should also see this file in the project directory. It is a Python script that our bitbucket-pipelines.yml configuration runs; all it does is upload a zipped folder to AWS S3

In my case, I used two environments: staging and production. We will get to the bitbucket-pipelines.yml file shortly, but first, some context.

The Problem?

Before we started using this flow, we had some constraints with the deployment strategy we had been implementing. The constraints were:

  • It was overkill for deploying small services like a single HTML page
  • Deployment was manual (build, publish/promote, release and deploy)
  • The release pipeline took a long time to push to production
  • No notifications at any stage from build to release to keep devs informed of the progress of their deployment
  • High coupling between the operations and dev teams
  • A lot of release specificities, hard to maintain and operate
  • No release management in place
  • No artifacts were created and packaged

The Legacy Solution

[Figure: Diagrammatic representation of the legacy pipeline]

Objectives

  • Significantly reduce manual effort
  • Simplify and streamline build & deploy process
  • Solve dependency setup issues among devs

What will we get?

A Bitbucket repo following a CI/CD flow: whenever a push is made to the staging or master branch, it will trigger the Bitbucket pipeline.

Step One - Enable Bitbucket Pipelines

(a) Click the Pipelines icon to start the wizard. If you're viewing Pipelines for the first time, click Enable Pipelines on the welcome screen.


(b) Pick a Template: There are templates available for almost any programming language, ranging from PHP to Scala to even using Docker, but for the sake of this tutorial, I will be using the Python template. You can choose whichever language you are most comfortable with.

(c) Customize your Configuration: The configuration lives in the bitbucket-pipelines.yml file. This is where the Bitbucket build is orchestrated; everything follows a sequence of commands that the Bitbucket build server executes to build your code and run anything extra you need. See my bitbucket-pipelines.yml file as an example:

image: python:3.5.1
pipelines:
  branches:
    master:
      - step:
          script:
            - apt-get update                # refresh package lists; required to install zip
            - apt-get install -y zip        # zip is used to package the build artifact
            - apt install ssh rsync -y      # ssh/rsync, in case remote sync is needed
            - pip install boto3==1.3.0      # AWS SDK used by the upload script
            - sh env.sh                     # project-specific environment/artifact setup
            - python s3_upload.py           # upload the zipped code to the production bucket
    staging:
      - step:
          script:                           # same steps as master, except the final upload
            - apt-get update
            - apt-get install -y zip
            - apt install ssh rsync -y
            - pip install boto3==1.3.0
            - sh env.sh
            - python s3_upload_staging.py   # uploads to the staging bucket instead

If you have issues understanding how a YAML file works and how to properly write your commands in sequence, you can read this: understanding YAML.

In the bitbucket-pipelines.yml file above, you can write your commands to work on any branch you want; we will be building from the staging and master branches. You can read the comments at the end of the commands to understand what each one does. After everything has been prepared, the last command runs a simple Python script (generated from a Bitbucket template) that uploads your zipped code to AWS S3; you can modify it as you like.
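One detail the YAML glosses over: s3_upload.py expects the artifact to already exist at /tmp/<APPLICATION_NAME>.zip, and in this setup env.sh is responsible for producing it. Since env.sh is project-specific, the following is only a hypothetical sketch of that packaging step, written in Python for consistency with the rest of the pipeline (the idea of zipping the whole checkout is my assumption, not part of the original project):

"""
Hypothetical packaging step (stand-in for env.sh): zips the cloned repository
into /tmp/<APPLICATION_NAME>.zip so that s3_upload.py can find it.
"""
import os
import shutil

appname = os.environ['APPLICATION_NAME']

# shutil.make_archive appends the '.zip' extension itself,
# so pass the base name without it.
archive_base = os.path.join('/tmp', appname)

# Zip the current working directory (the repository checkout).
shutil.make_archive(archive_base, 'zip', root_dir=os.getcwd())

print('Created artifact: ' + archive_base + '.zip')

With the artifact in place, here is the upload script itself: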

"""
A Bitbucket Builds template for deploying to S3
v1.0.0
"""
from __future__ import print_function
import os
import sys
from time import strftime, sleep
import boto3
from botocore.exceptions import ClientError

accesskey = os.environ['accesskey']
secretkey = os.environ['secretkey']
appname = os.environ['APPLICATION_NAME']

VERSION_LABEL = appname
BUCKET_KEY = VERSION_LABEL + '.zip'
print (appname)

def upload_to_s3(artifact):
    """
    Uploads an app to Amazon S3
    """
    try:
        client = boto3.client('s3', aws_access_key_id=accesskey, aws_secret_access_key=secretkey)
    except ClientError as err:
        print("Failed to create boto3 client.\n" + str(err))
        return False

    try:
        client.put_object(
            Body=open(artifact, 'rb'),
            Bucket=os.getenv('S3_BUCKET'),
            Key=BUCKET_KEY
        )
    except ClientError as err:
        print("Failed to upload app to S3.\n" + str(err))
        return False
    except IOError as err:
        print("Failed to access" + appname + "in this directory.\n" + str(err))
        return False

    return True     

def main():
    " Your favorite wrapper's favorite wrapper "
    if not upload_to_s3('/tmp/'+appname+'.zip'):
        sys.exit(1)

if __name__ == "__main__":
    main()
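To confirm the artifact actually landed in the bucket after a build, a quick boto3 check can help with debugging. This is a sketch that assumes the same environment variables as the script above; it is not part of the original pipeline:

import os
import boto3

# Reuse the same credentials and bucket the upload script relies on.
client = boto3.client(
    's3',
    aws_access_key_id=os.environ['ACCESS_KEY'],
    aws_secret_access_key=os.environ['SECRET_KEY'],
)

# head_object raises botocore.exceptions.ClientError with a 404 if the key is missing.
response = client.head_object(
    Bucket=os.environ['S3_BUCKET'],
    Key=os.environ['APPLICATION_NAME'] + '.zip',
)
print('Artifact found, size: ' + str(response['ContentLength']) + ' bytes')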

To add your environment variables, click Environment variables in the left tab and add the following:

  • APPLICATION_NAME
  • AWS_DEFAULT_REGION
  • S3_BUCKET
  • STAGING_BUCKET
  • ACCESS_KEY
  • SECRET_KEY
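A missing variable will only surface as a KeyError halfway through the build, so it can be worth failing fast. A small addition at the top of s3_upload.py (my suggestion, not part of the Bitbucket template) could look like this:

import os
import sys

# The variables the pipeline expects to find in Bitbucket's settings.
REQUIRED_VARS = [
    'APPLICATION_NAME', 'AWS_DEFAULT_REGION',
    'S3_BUCKET', 'STAGING_BUCKET',
    'ACCESS_KEY', 'SECRET_KEY',
]

# Report every missing variable at once instead of dying on the first KeyError.
missing = [name for name in REQUIRED_VARS if name not in os.environ]
if missing:
    print('Missing environment variables: ' + ', '.join(missing))
    sys.exit(1)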