Brené Brown on Empathy
I really like this short video on empathy and use it to teach new hires in the course: "Providing Good Customer Experience NHT"
Home of Shariq Mustaquim on the Internet!
Quick and dirty CodeBuild project with Terraform
resource "aws_s3_bucket" "example" {
  bucket = "shariqexampletestingterrastartup"
  acl    = "private"

  tags = {
    Name = "shariqexampletestingterrastartup"
  }
}
resource "aws_iam_role" "example" {
  name = "example"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codebuild.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
}
resource "aws_iam_policy" "policy" {
  name        = "test-policy"
  description = "A test policy"

  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "CloudWatchLogsPolicy",
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": ["*"]
    },
    {
      "Sid": "CodeCommitPolicy",
      "Effect": "Allow",
      "Action": ["codecommit:GitPull"],
      "Resource": ["*"]
    },
    {
      "Sid": "S3GetObjectPolicy",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion"
      ],
      "Resource": ["*"]
    },
    {
      "Sid": "S3PutObjectPolicy",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": ["*"]
    },
    {
      "Sid": "S3BucketIdentity",
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketAcl",
        "s3:GetBucketLocation"
      ],
      "Resource": ["*"]
    }
  ]
}
EOF
}
resource "aws_iam_role_policy_attachment" "test-attach" {
  role       = aws_iam_role.example.name
  policy_arn = aws_iam_policy.policy.arn
}
resource "aws_codebuild_project" "example" {
  name          = "terraform-cb-project" # var.DOMAIN_NAME
  description   = "A terrastartup codebuild project."
  build_timeout = "5"
  service_role  = aws_iam_role.example.arn

  artifacts {
    type = "CODEPIPELINE"
  }

  environment {
    compute_type                = "BUILD_GENERAL1_SMALL"
    image                       = "aws/codebuild/standard:1.0"
    type                        = "LINUX_CONTAINER"
    image_pull_credentials_type = "CODEBUILD"
  }

  logs_config {
    cloudwatch_logs {
      group_name  = "log-group"
      stream_name = "log-stream"
    }

    s3_logs {
      status   = "ENABLED"
      location = "${aws_s3_bucket.example.id}/build-log"
    }
  }

  # Note: git_clone_depth only applies to Git-based sources (e.g. CODECOMMIT,
  # GITHUB), so it has been dropped here for the CODEPIPELINE source type.
  source {
    type = "CODEPIPELINE"
  }

  tags = {
    Environment = "Test"
  }
}
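One thing the project above leaves implicit is the buildspec: with a CODEPIPELINE source, CodeBuild expects a buildspec.yml at the root of the input artifact. If you'd rather keep it in Terraform, the source block also accepts an inline buildspec via a heredoc. A minimal sketch (the build command is a placeholder, not a real build):

```hcl
  source {
    type      = "CODEPIPELINE"
    buildspec = <<EOF
version: 0.2
phases:
  build:
    commands:
      - echo "replace with your real build commands"
EOF
  }
```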
One of my career ambitions...
Multiple Pipelines
What we often see is customers setting up one pipeline per environment. This sounds sensible, as it respects the boundaries that accounts are designed for! Who has a design like this?
The issue is that it also means you get drift in the definition of these pipelines. Using CloudFormation to define your pipeline in code is a good practice, but we still see customers creating different templates for different environments. The problem is that when you define the change process differently for each environment, you're going to get different results! As you move across environments, you aren't building confidence as you promote code changes, you're using different build artifacts for each environment, and you're slowed down by having to repeat the process once per environment.
We’re starting to consider this a bit of an anti-pattern as we see customers struggling with this.
Single Pipeline
A better way to set things up is to have a pipeline that resides in your Dev or Prod environment, with different stages for different environments. Who has this sort of design?
This means that any change made to your app/service/microservice is visible in one place – you can visualize the entire SDLC. Of course you want to store it in code, and you want to ensure that the right people have the right permissions to change it.
Setting things up this way will increase your release velocity, as everyone can easily see what’s happening to code changes as they pass through the environments. This means that you don’t need to coordinate across team members and don’t need to manage multiple pipelines.
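As a rough sketch of what "one pipeline, many stages" can look like in Terraform (the resource names, repo name, stack names, and the CodeCommit/CloudFormation providers here are illustrative assumptions, and the IAM role from the CodeBuild example above is reused for brevity; in practice you'd use dedicated roles):

```hcl
resource "aws_codepipeline" "single" {
  name     = "single-pipeline"        # hypothetical name
  role_arn = aws_iam_role.example.arn # use a dedicated pipeline role in practice

  artifact_store {
    location = aws_s3_bucket.example.bucket
    type     = "S3"
  }

  stage {
    name = "Source"
    action {
      name             = "Source"
      category         = "Source"
      owner            = "AWS"
      provider         = "CodeCommit"
      version          = "1"
      output_artifacts = ["source"]
      configuration = {
        RepositoryName = "my-repo" # assumption
        BranchName     = "main"
      }
    }
  }

  stage {
    name = "Build"
    action {
      name             = "Build"
      category         = "Build"
      owner            = "AWS"
      provider         = "CodeBuild"
      version          = "1"
      input_artifacts  = ["source"]
      output_artifacts = ["build_output"]
      configuration = {
        ProjectName = aws_codebuild_project.example.name
      }
    }
  }

  # One deploy stage per environment keeps the whole SDLC visible in one place.
  stage {
    name = "DeployDev"
    action {
      name            = "DeployDev"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "CloudFormation"
      version         = "1"
      input_artifacts = ["build_output"]
      configuration = {
        ActionMode   = "CREATE_UPDATE"
        StackName    = "myapp-dev" # assumption
        TemplatePath = "build_output::template.yml"
        RoleArn      = aws_iam_role.example.arn
      }
    }
  }

  # A manual gate before Production.
  stage {
    name = "ApproveProd"
    action {
      name     = "Approve"
      category = "Approval"
      owner    = "AWS"
      provider = "Manual"
      version  = "1"
    }
  }

  stage {
    name = "DeployProd"
    action {
      name            = "DeployProd"
      category        = "Deploy"
      owner           = "AWS"
      provider        = "CloudFormation"
      version         = "1"
      input_artifacts = ["build_output"]
      configuration = {
        ActionMode   = "CREATE_UPDATE"
        StackName    = "myapp-prod" # assumption
        TemplatePath = "build_output::template.yml"
        RoleArn      = aws_iam_role.example.arn
      }
    }
  }
}
```

Note that every environment deploys the same build_output artifact, which is exactly the "build once, promote everywhere" property the multiple-pipelines setup loses.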
The downside of this approach is the lack of environmental isolation. If your pipeline is defined in a Dev account, that account tends to be a little lighter on security, so you run the risk of a bad actor introducing malicious code into the pipeline that then gets pushed into Production. And if the pipeline is defined in Prod, it's hard to get access when you need to change the earlier stages of the pipeline. So although this pattern will increase your release velocity, it isn't ideal from a security perspective.
So what is the best practice?
Pipeline in a "Tools" or "Shared" Account
Well, we’re seeing customers have a lot of success putting DevOps tooling like CD pipelines in an account of its own. The pipelines in this account can assume cross-account roles to access resources in the Dev, Test and Production accounts. This enforces good security and operational practices in how you define your pipelines in code and how you manage the end-to-end flow of changes to your application.
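The cross-account wiring can be sketched in Terraform like this. The account IDs and role names are placeholders, and the pipeline role from the examples above stands in for whatever role your pipeline actually runs as:

```hcl
# In the target account (Dev/Test/Prod): a role the tools-account
# pipeline is allowed to assume. 111111111111 is the (placeholder)
# tools account ID.
resource "aws_iam_role" "deploy_from_tools" {
  name = "deploy-from-tools"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF
}

# In the tools account: let the pipeline's role assume the target role.
# 222222222222 is the (placeholder) target account ID.
resource "aws_iam_role_policy" "assume_target" {
  name = "assume-target-accounts"
  role = aws_iam_role.example.id

  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::222222222222:role/deploy-from-tools"
    }
  ]
}
EOF
}
```

With this in place, the deploy actions in the pipeline assume deploy-from-tools in each target account, so the environments never have to trust each other, only the tools account.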