Docker Image for Bamboo Build Agent

A basic Dockerfile for a Bamboo build agent:

FROM atlassian/default-image

# Download Packer and verify it against its published SHA-256 checksum
RUN curl -sO https://releases.hashicorp.com/packer/1.2.3/packer_1.2.3_linux_amd64.zip
RUN echo 822fe76c2dfe699f187ef8c44537d10453a1545db620e40b345cf6991a690f7d packer_1.2.3_linux_amd64.zip | sha256sum --status -c -
RUN unzip packer_1.2.3_linux_amd64.zip -d /usr/local/bin/

# Install pip and the AWS CLI
RUN apt-get -q update && apt-get -q -y install python-pip rubygems-integration
RUN pip install awscli

# Download jq, verify its checksum, and install it; builds run as root, so sudo is unnecessary
RUN wget https://github.com/stedolan/jq/releases/download/jq-1.5/jq-linux64
RUN echo c6b3a7d7d3e7b70c6f51b706a3b90bd01833846c54d32ca32f0027f00226ff6d jq-linux64 | sha256sum --status -c -
RUN chmod +x jq-linux64 && mv jq-linux64 /usr/local/bin/jq
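
The image can then be built and the installed tools spot-checked; the bamboo-agent tag below is just an example:

docker build -t bamboo-agent .
docker run --rm bamboo-agent packer version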

Alert on a new EC2 instance creation...

I created a new notification system in AWS to alert us when a new EC2 instance is created in the account.

Infrastructure details

A new CloudWatch rule is created:

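The screenshot of the rule is not preserved here; below is a minimal boto3 sketch of an equivalent rule, assuming it matches EC2 instances entering the running state (the rule name and ARN values are placeholders):

import json
import boto3

events = boto3.client('events')

# Match EC2 instances entering the "running" state (assumed trigger condition)
events.put_rule(
    Name='NewEC2InstanceRule',  # placeholder name
    EventPattern=json.dumps({
        'source': ['aws.ec2'],
        'detail-type': ['EC2 Instance State-change Notification'],
        'detail': {'state': ['running']},
    }),
    State='ENABLED',
)

# Point the rule at the notifier function (the ARN is a placeholder); the function
# also needs a resource policy allowing events.amazonaws.com to invoke it
events.put_targets(
    Rule='NewEC2InstanceRule',
    Targets=[{'Id': '1', 'Arn': 'arn:aws:lambda:REGION:ACCOUNT_ID:function:InvokeCreationNotifier'}],
)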

This invokes a Lambda function InvokeCreationNotifier.

The Lambda function parses the event and sends an email through SNS:
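
The screenshots of the function are not preserved here; a minimal sketch of such a handler, assuming the state-change event shape above and an existing SNS topic with email subscribers (the topic ARN is a placeholder):

import boto3

sns = boto3.client('sns')
TOPIC_ARN = 'arn:aws:sns:REGION:ACCOUNT_ID:instance-creation-alerts'  # placeholder

def lambda_handler(event, context):
    # Pull the instance id and state out of the CloudWatch event
    detail = event.get('detail', {})
    instance_id = detail.get('instance-id', 'unknown')
    state = detail.get('state', 'unknown')

    # SNS delivers the message as an email to the topic's subscribers
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject='New EC2 instance detected',
        Message='Instance %s entered state %s in account %s.'
                % (instance_id, state, event.get('account', 'unknown')),
    )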

Octopus Deploy is a neat product...

I really like Octopus Deploy. I have used it at a client for deploying .NET applications, and I like the visibility it provides into the CD side of the DevOps process.

I am leaving Kloud...

Kloud is one of the companies with a soul ... it will be missed.

Cloud Team

How do you assemble a Cloud Team? This is one way:

Sync a Local directory with S3

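This Python 3 script walks a local directory and uploads every file that is new or whose content differs from the copy in S3, comparing local MD5 hashes against S3 ETags:
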
import os
import sys
import boto3
import hashlib
from datetime import datetime
from botocore.exceptions import ClientError

# Use the default AWS credentials profile for all clients
boto3.setup_default_session(profile_name='default')

if len(sys.argv) < 3:
    print("Not enough arguments.")
    print("Usage: python3 py-sync.py [SOURCE_DIRECTORY] [DESTINATION_BUCKET_NAME]")
    sys.exit(1)

# Init objects
s3_client = boto3.client('s3')

SOURCE_DIR = sys.argv[1]
DESTINATION_BUCKET = sys.argv[2]

def check_file_exists(bucket, key):
    # Returns True if the object exists; re-raises anything other than a 404
    try:
        s3_client.head_object(Bucket=bucket, Key=key)
    except ClientError as e:
        if int(e.response['Error']['Code']) == 404:
            return False
        raise
    return True

def md5(fname):
    # Stream the file in 4 KB chunks so large files are not read into memory at once
    hash_md5 = hashlib.md5()
    with open(fname, "rb") as f:
        for chunk in iter(lambda: f.read(4096), b""):
            hash_md5.update(chunk)
    return hash_md5.hexdigest()

print("Filename-Local", end=', ')
print("Filename-S3", end=', ')
print("File-Status", end=', ')
print("Action")

print("--------------", end=', ')
print("-----------", end=', ')
print("-----------", end=', ')
print("------")

for subdir, dirs, files in os.walk(SOURCE_DIR):
    for file in files:
        file_path_full = os.path.join(subdir, file)
        # Derive the S3 key from the path relative to the source directory,
        # normalising Windows back-slashes to the '/' separator S3 expects
        file_path_relative = os.path.relpath(file_path_full, SOURCE_DIR)
        file_key = file_path_relative.replace('\\', '/')

        print(file_path_full, end=', ')
        print('s3://' + DESTINATION_BUCKET + '/' + file_key, end=', ')

        if not check_file_exists(DESTINATION_BUCKET, file_key):  # File doesn't exist yet; upload it
            s3_client.upload_file(file_path_full, DESTINATION_BUCKET, file_key)
            print("New", end=', ')
            print("Uploading")

        else:
            response = s3_client.head_object(Bucket=DESTINATION_BUCKET, Key=file_key)
            # The ETag equals the object's MD5 for single-part uploads (upload_file
            # only switches to multipart for large files), so compare it to the local hash
            md5_s3 = response['ETag'].strip('"')
            md5_local = md5(file_path_full)

            if md5_local != md5_s3:
                s3_client.upload_file(file_path_full, DESTINATION_BUCKET, file_key)
                print("Modified", end=', ')
                print("Uploading")

            else:
                print("No-Change", end=', ')
                print("Skipping")