Tutorial Highlights & Transcript
00:00 - Why AWS CodePipeline Custom Actions
01:27 - Problem - Building Windows Container in CodePipeline
02:13 - Solution - Custom Action
This is how it works. You have your normal CodePipeline: a source stage, a deployment stage, and in the middle you would usually have a CodeBuild stage to build your container, package, or whatever. In this case, we introduce a custom build action in the middle instead. It uses Lambda functions and Step Functions to orchestrate the whole thing, EC2 instances to run the actual job, and Systems Manager to run the commands inside those machines. If I had to build this from scratch, I would probably just build another pipeline on something else for this specific service. But since this was already a packaged solution, it was easy to set up.
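To make that shape concrete, the build stage of such a pipeline would look roughly like the sketch below in the definition you pass to aws codepipeline create-pipeline. The provider name, configuration key, and artifact names are assumptions for illustration, not the packaged solution's exact values.

```powershell
# Hypothetical build-stage fragment of a pipeline definition; the
# "owner": "Custom" part is the essential bit, the rest is placeholder.
$buildStage = @'
{
  "name": "Build",
  "actions": [{
    "name": "BuildOnWindowsEC2",
    "actionTypeId": {
      "category": "Build",
      "owner": "Custom",
      "provider": "CustomEC2Builder",
      "version": "1"
    },
    "configuration": { "Command": "powershell.exe -File .\\ci.ps1" },
    "inputArtifacts":  [{ "name": "SourceOutput" }],
    "outputArtifacts": [{ "name": "BuildOutput" }]
  }]
}
'@
```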
03:54 - Demo - Creating Custom EC2 CodePipeline Builder
How do you deploy that? This repo right here has everything you need. The only thing you need to do to set this up in an account is to run two commands. First, aws cloudformation package, which you point at a bucket to store the sources. Second, aws cloudformation deploy, which deploys a CloudFormation stack for you. It's already set up here; that's why I have the option in CodePipeline. This is the stack. As I mentioned, it deploys a bunch of stuff: Lambda functions, CloudWatch Events rules, the Step Functions state machine, the roles for the instances, and all that. This is just the default that comes in the repo; you deploy it, and it becomes an option in your CodePipeline. Once you have that, under build provider you get this custom EC2 CodePipeline builder, and you can use it.
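For reference, the two commands look roughly like this; the template file, bucket, and stack names are placeholders, so check the repo's README for the exact invocation.

```powershell
# Package the sources into an S3 bucket, then deploy the resulting stack.
aws cloudformation package `
    --template-file template.yml `
    --s3-bucket my-artifact-bucket `
    --output-template-file packaged.yml

# CAPABILITY_IAM is needed because the stack creates IAM roles.
aws cloudformation deploy `
    --template-file packaged.yml `
    --stack-name codepipeline-custom-action `
    --capabilities CAPABILITY_IAM
```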
07:29 - Demo - Building Windows Containers in CodePipeline
And for those of you familiar with CodePipeline, when you use the ECS CodeDeploy action, you have to pass it a JSON file with the changes to your task definition. That is what these lines are doing: they create that JSON file and call it build.json. Why am I mentioning this if we're not adding a deploy stage yet? Because of one of the options here: the output artifact. Again, if you have used CodePipeline before, you know that to pass files between actions, say from CodeBuild to CodeDeploy, you need to define them as output artifacts. Because this is a custom action, the way to do it is to tell it here which files should become build outputs, and then you will be able to use them in the next stage of your pipeline. It would take a while to finish creating this one, so I am just going to show you the one that is already in place. Let's take a look at that one.
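Those lines in the build script amount to something like this sketch, assuming the file follows the image-definitions format the ECS deploy action consumes; the registry value and the container name are placeholders, not the demo's actual values.

```powershell
# Write build.json, the artifact a later ECS deploy stage would consume.
$registry = "123456789012.dkr.ecr.us-east-1.amazonaws.com"  # placeholder
$imageUri = "$registry/friday-demo:latest"
Set-Content -Path build.json -Value "[{""name"": ""web"", ""imageUri"": ""$imageUri""}]"
```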
It has a source stage that pulls the code from CodeCommit, and then it has a build stage that uses the custom action right here. You can see the last build succeeded. I am going to trigger one more so we can see what happens. So let's go back here and make a change in the Dockerfile; change this to demo. Okay, so we have our change, and because the pipeline is set up to watch that branch, it should be starting right now. Okay, the source stage has already been triggered. Now we are building. Once this one starts, it takes a few seconds, but we will get a details link here. There we go. And the custom action is set up so that this link will take us to the Step Functions execution.
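The trigger itself is just an ordinary commit pushed to the watched branch, something like the following; the branch name here is an assumption.

```powershell
# Push the Dockerfile change; the pipeline watches this branch for commits.
git add Dockerfile
git commit -m "Change tag to demo"
git push origin main
```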
This Step Functions state machine is the one from the custom action. Here we can see what it's doing. The first part creates an EC2 instance, in this case a Windows machine. Then it waits until the machine is ready, meaning it reports as ready in Systems Manager. This can take a couple of minutes. Then we move on to start command execution. This is the command that we gave it in our pipeline; for this one, I am executing the script I showed you.
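A rough equivalent of that wait step, if you were doing it by hand, is polling Systems Manager until the new instance reports in; the instance ID here is a placeholder.

```powershell
# Poll Systems Manager until the new instance's SSM agent is Online.
$instanceId = "i-0123456789abcdef0"  # placeholder
do {
    Start-Sleep -Seconds 15
    $ping = aws ssm describe-instance-information `
        --filters "Key=InstanceIds,Values=$instanceId" `
        --query 'InstanceInformationList[0].PingStatus' --output text
} until ($ping -eq 'Online')
```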
On our build action, we can click edit. This is our configuration: an AMI, an instance type, the output artifact. And here's the important one, the command. It's just executing that ci.ps1 PowerShell script, passing the region as a parameter along with the repository name; it's an ECR repository in this account called friday-demo. That's the command execution that it's going to start here. It looks like it's already running. It starts, then it waits, and it's going to be stuck here for a while while that Docker build is happening. Once that completes, it moves on to destroying the EC2 machine it was using, and then it sends a report back to CodePipeline. Now, one of the drawbacks of this being Windows, and of this whole EC2, Systems Manager, and Lambda setup, is that it takes longer. For example, let's go back to the previous execution. This one took 20 minutes, and if you remember the Dockerfile that I showed you, it's not doing anything; it just takes 20 minutes to bootstrap everything. The Docker build part itself happens in two or three minutes tops. That's one of the cons of this solution. But again, if you really need to build Windows containers in CodePipeline, this is the easiest way to go.
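Putting that together with the Run Command output we'll look at next, the script boils down to something like this sketch. The parameter names, the registry variables, and the invocation line are assumptions; the actual ci.ps1 ships with the demo code.

```powershell
# Invoked by the custom action roughly as:
#   powershell.exe -File .\ci.ps1 -Region us-east-1 -RepositoryName friday-demo
param([string]$Region, [string]$RepositoryName)

$accountId = aws sts get-caller-identity --query Account --output text
$registry  = "$accountId.dkr.ecr.$Region.amazonaws.com"

# Authenticate Docker to ECR ("Login Succeeded" in the command output).
aws ecr get-login-password --region $Region |
    docker login --username AWS --password-stdin $registry

# Build the Windows container image and push it to the repository.
docker build -t "$registry/${RepositoryName}:latest" .
docker push "$registry/${RepositoryName}:latest"
```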
One other thing we can look at while this is happening: you notice here it's waiting. What is it waiting for? It's waiting for a Run Command execution in Systems Manager. That is under Systems Manager, here on the left: Run Command. We can see it here, in progress, with one target because it's just one instance. But if we go back to the command history, this is the successful one from before. We can take a look at it; this is the output of the PowerShell script. Here it's authenticating to ECR, and the login succeeded. Then it's doing the Docker build, the working directory, the copy, and then the push. That's it. If we go back to ECR, my friday-demo repository is here, and you can see that we have one image right here. As soon as this run is done, we will have another Docker image. As I mentioned, it took 20 minutes to complete, and I'm sure nobody wants to sit here and wait for it to finish. That is what I have for today. Again, it's a really specific use case, but if you ever need to build Windows containers in CodePipeline, let me know and I can help you out.
Carlos Rodríguez
DevOps Team Lead
nClouds
Carlos has been a Senior DevOps Engineer at nClouds since 2017 and works with customers to build modern, well-architected infrastructure on AWS. He has a long list of technical certifications, including AWS Certified DevOps Engineer - Professional, AWS Certified Solutions Architect - Professional, and AWS Certified SysOps Administrator - Associate.