Categories
AWS CodeDeploy Continuous Delivery Jenkins

Jenkins Plugin – AWS CodeDeploy

Our company uses Jenkins for continuous integration. It is responsible for building our .NET projects and sometimes deploying them. One of the problems we are trying to solve is deploying Windows services. Currently, we deploy these services manually: we copy files to the right server and then run a batch file to install the service. This process is time-consuming and error-prone. We decided to replace these manual deployments with a Jenkins plugin called AWS CodeDeploy.

The first thing we need to do is install the Jenkins plugin. Go to your Jenkins site, install the AWS CodeDeploy plugin, and restart Jenkins.

Now that we have the plugin up and running, let’s update a job and add our CodeDeploy settings. The plugin needs to know the application name, group name, region, revision info (S3 bucket and prefix), and other settings. Once we have our job set up with the CodeDeploy settings, we can start modifying our applications to instruct AWS CodeDeploy what to deploy and how to deploy it. We need to add a YAML file that lists the files to be deployed. The file needs to be named appspec.yml and it needs to be in the root of the project. Take a look at this page to see appspec examples.
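As a minimal sketch, an appspec.yml for a Windows service might look like this (the destination path and script name are placeholders, not values from our actual project):

version: 0.0
os: windows
files:
  - source: \
    destination: C:\Services\MyWindowsService
hooks:
  AfterInstall:
    - location: scripts\install-service.ps1
      timeout: 300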

In addition to the appspec file, we also need to install the AWS CodeDeploy agent on our on-premises servers or AWS instances. In our case, we will install the agent on our on-premises servers. After installing the agent, we need to register the servers with AWS CodeDeploy. In a nutshell, the agent listens for requests to deploy applications.
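As a rough sketch, registering an on-premises server and tagging it can be done with the AWS CLI (the instance name, account ID, and IAM user below are placeholders):

aws deploy register-on-premises-instance --instance-name MyOnPremServer --iam-user-arn arn:aws:iam::123456789012:user/CodeDeployUser
aws deploy add-tags-to-on-premises-instances --instance-names MyOnPremServer --tags Key=Name,Value=MyOnPremServer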

We now have our application and server ready, and it is time to deploy our Windows services. After Jenkins builds our .NET Windows service, the CodeDeploy plugin will register a new revision and start the deployment process. With the help of appspec hooks, we can use PowerShell to install and start the service. CodeDeploy provides hooks so that developers can integrate with the different lifecycle events: BeforeInstall, AfterInstall, ApplicationStart, and ValidateService. For our example, we can use the AfterInstall hook to install the service and then start it.
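For illustration, the AfterInstall script referenced in the appspec.yml sketch above could look something like this (the service name and binary path are hypothetical):

# install-service.ps1 - runs during the AfterInstall lifecycle event
$serviceName = "MyWindowsService"

# Remove the previous version of the service if it exists
if (Get-Service $serviceName -ErrorAction SilentlyContinue) {
    Stop-Service $serviceName
    sc.exe delete $serviceName
}

# Register the new binaries as a Windows service and start it
New-Service -Name $serviceName -BinaryPathName "C:\Services\MyWindowsService\MyWindowsService.exe"
Start-Service $serviceName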

Now that we have fully automated our .NET deployments, our developers can spend more time adding new features.

Categories
.Net ASP.NET ASP.NET MVC AWS CI Cloud

.NET Build Server using Visual Studio Community 2017

In a previous post, I wrote about building a .NET continuous integration server using Windows Server 2016, Jenkins, and Build Tools. For that setup, I avoided installing any Visual Studio software, and in my opinion it was a simple process to get everything installed and building correctly. Now that Visual Studio 2017 has been released, I want to set up a build server using the Community edition. Here are the 7 steps I took to get a .NET build server using Visual Studio 2017 Community edition, Jenkins, and Git running on Windows Server 2016.

1. Launch a new Windows Server 2016 instance. I’m currently investing in learning AWS, so I’ll use it as my cloud provider. Feel free to use Azure, Google Cloud, or your own server.

2. Download and install Visual Studio 2017 Community edition. For this tutorial, I selected the ASP.NET and web development workload during the installation.

3. Download and install Jenkins. I selected version 2.46.1 for this tutorial. After installing Jenkins, you have the option to install the recommended plugins or select them one by one. For this tutorial, I went with the option to install the recommended plugins.

4. Download and install Git. If you have a different source control tool, go ahead and install it. After installing Git, I went to the Global Tool Configuration section and updated the path to C:\Program Files\Git\bin\git.exe.

5. Install the MSBuild plugin. Go to the Manage Jenkins section and select Manage Plugins. From the Available tab, find MSBuild and install it. I also updated the path in the Jenkins settings to C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\msbuild.exe.

6. Create a new Jenkins job. I used the freestyle project option. For the settings, I used Git as my source control tool. Since I’m building a .NET solution, I added the MSBuild build step and gave it the solution or project name (the sketch after this list shows the equivalent command line).

7. Trigger a new build. If all of the steps above were done correctly, you should have a successful build.
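Under the hood, that MSBuild build step runs something equivalent to the following command (the solution name is a placeholder):

"C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\msbuild.exe" MySolution.sln /p:Configuration=Release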

Installing the Visual Studio Community edition makes it easier to set up a build server. After installing it, you get the required .NET Framework plus the MSBuild executable. In my previous setup, I had to install more software: Build Tools 2015 and the .NET Framework 4.5.2 and 4.6.2 developer packs. Hopefully this post helps you set up a reliable build server for your .NET applications.

Categories
AWS Cloud

What resources get created by AWS Elastic Beanstalk?

Creating a new Elastic Beanstalk application is a simple process. Give it a name and select a platform. That’s it. Simple. However, it is important that we understand what resources get created when we launch a new Elastic Beanstalk app. In this post, we will explore the resources created by AWS Elastic Beanstalk.

What is AWS Elastic Beanstalk?

Before we dive into the details, let’s take a moment and explain what Beanstalk is. This is how AWS defines Elastic Beanstalk, “AWS Elastic Beanstalk is an easy-to-use service for deploying and scaling web applications and services developed with Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS. You can simply upload your code and Elastic Beanstalk automatically handles the deployment, from capacity provisioning, load balancing, auto-scaling to application health monitoring. At the same time, you retain full control over the AWS resources powering your application and can access the underlying resources at any time. There is no additional charge for Elastic Beanstalk – you pay only for the AWS resources needed to store and run your applications.”

Load Balancers

When you launch a new Beanstalk app, you have 2 options for your scaling needs: “single instance” and “load balancing, auto scaling”. If you select single instance, your environment will always run with only 1 instance. You might use this setting to test a simple web app or service. If your app needs to scale out, you can create your environment with load balancing and auto scaling. By default, the minimum instance count is set to 1 and the maximum to 4. You can change these values by going to the configuration section.
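Besides the console, these values can also be set with an .ebextensions configuration file inside your application bundle. A minimal sketch (the values are illustrative):

# .ebextensions/autoscaling.config
option_settings:
  aws:autoscaling:asg:
    MinSize: 2
    MaxSize: 6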

To see the load balancer created, go to the EC2 console and click on the Load Balancers section in the left side menu. Here you can see the load balancer name, DNS name, VPC ID, availability zones, type, and creation date. Most of these settings can be controlled from the configuration settings in your Beanstalk console.

Auto Scaling

For the auto scaling settings, we find that a launch configuration and an auto scaling group were created. The launch configuration contains information about the type of instance required by the auto scaling group. In my example, I see AMI ID ami-365ae520, instance type t1.micro, and other settings.

In the Auto Scaling Groups section of the EC2 page, you will see a link to the launch configuration, min and max instances, availability zones, default cooldown, and other sections. Most of these settings can be changed from your Beanstalk configuration.

EC2 Instance

If you go to the Instances section, you will see that a new instance was created. In my case, I see the instance name, instance ID, public DNS, and other settings as well. If you have a scaling trigger set, additional instances can be created. The most common option is to scale based on CPU usage. For example, you can specify that CPU utilization above 75% triggers a scale-out (add instances) and utilization below 25% triggers a scale-in (terminate instances).
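That trigger can also be expressed in an .ebextensions file; a sketch using the thresholds mentioned above:

# .ebextensions/scaling-trigger.config
option_settings:
  aws:autoscaling:trigger:
    MeasureName: CPUUtilization
    Unit: Percent
    UpperThreshold: 75
    LowerThreshold: 25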

In addition to these resources, Beanstalk also creates IAM roles, S3 objects, security groups, and others. Creating a new AWS Elastic Beanstalk application is a very simple process, but understanding what gets created behind the scenes is crucial.


Categories
AWS Cloud General

How to migrate your WordPress blog from Bluehost to Amazon Lightsail

My first blog post was published back on April 25, 2014. I had used GoDaddy before, but I wanted to try a different host. After my research, I decided to host a WordPress blog on Bluehost. Bluehost is a great host and I highly recommend them. One of the great features they offer is automatic backups of your WordPress site and MySQL database.

Why migrate?

Lately, I’ve been studying to take the AWS Certified Developer Associate exam. To learn AWS in depth, you have to use it frequently, so I decided to migrate my blog from Bluehost to Amazon Lightsail. With Lightsail, it is very simple to create a WordPress site: select WordPress as your instance image, select a plan, and finally name your instance.

Here are the steps I took to migrate my WordPress blog from Bluehost to Amazon Lightsail:

Website Backup

Since Bluehost provides daily backups of my WordPress site, the only thing I had to do was download the public_html folder. This folder contains all the files needed to run WordPress. If you are using a different host, you can find your WordPress folder by searching for the wp-admin, wp-content, and wp-includes folders. Once you find these folders, back up their parent folder.

MySQL Database Backup

For the database backup, I also downloaded the backup files provided by Bluehost. There is one SQL script that creates the database and another that contains the actual data we want to migrate to Lightsail. In my case, I only kept the file that contains the insert statements.

Create Lightsail Instance

This is a simple process. Go to the Amazon Lightsail console and click on Create Instance. Then select Apps + OS, WordPress, select a plan, and name your instance. It is that simple.

Setup FTP Access

By default, Amazon Lightsail has ports 22, 80, and 443 open. However, if you try to connect to your instance over FTP, it will not work right away. To get FTP access to your instance, download your default private key. Once you have the .pem file, load it into your FTP client. In my case, I used FileZilla and was able to connect and copy files.

Copy files and Migrate Data

Since Amazon Lightsail created a fresh WordPress site, I chose not to restore every file from my Bluehost backup. I compared both sites and noticed files that were only used by Bluehost, so I decided to copy only the files needed for this migration to work. The first file was wp-config.php, which contains the database username, password, and other settings; I modified this file to contain Lightsail’s database information. I also copied over the SQL script with the data. After copying that file, I was able to connect over SSH and run the script against the MySQL database.
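For reference, the relevant wp-config.php entries look like this (the database name, user, and password below are placeholders, not my actual values):

// wp-config.php - point WordPress at the Lightsail MySQL database
define('DB_NAME', 'wordpress');
define('DB_USER', 'wp_user');
define('DB_PASSWORD', 'your-password-here');
define('DB_HOST', 'localhost');

And the import over SSH boils down to a single command (the file name is a placeholder):

mysql -u wp_user -p wordpress < backup-data.sql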

DNS Changes

With the data migrated from my old Bluehost database to the new one, it was time to make the DNS changes. I first went to Bluehost and updated the nameservers to point to Amazon. After doing that, I created a DNS zone in the Lightsail console and A records for solutionsbyraymond.com and www.solutionsbyraymond.com.

As you can see, it was a very simple process to migrate from Bluehost to Amazon Lightsail. It took a few hours of reading documentation up front, but in the end I had no issues with the migration. I hope this helps other developers trying to migrate to Lightsail.

Categories
AWS

Resources to Help you Prepare for the AWS Certified Developer Exam

For the last few months, I’ve been preparing to take the AWS Certified Developer Associate exam. At my job, we use S3, SES, DynamoDB, EC2, and CodeDeploy to develop our products. I’m planning to take the practice exam in March and the real exam in April. In this post, I want to share the top 3 resources that have helped me prepare for this certification.

  • Play with the SDK

AWS provides SDKs in different languages like .NET, Java, JavaScript, Ruby, and others. Download and set up the SDK; AWS provides sample projects to get you up and running quickly. I highly recommend creating your own projects and calling different methods. For example, the S3 API has a PutObject method to upload new objects and a GetObject method to retrieve them. Play with the different services and methods to gain experience.
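As a small sketch using the AWS SDK for .NET (the bucket name, key, and file paths are placeholders; the synchronous calls shown here are from the .NET Framework flavor of the SDK):

// Upload and download an S3 object with the AWS SDK for .NET
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

class S3Example
{
    static void Main()
    {
        var client = new AmazonS3Client(RegionEndpoint.USEast1);

        // PutObject: upload a local file as a new object
        client.PutObject(new PutObjectRequest
        {
            BucketName = "my-example-bucket",
            Key = "notes/hello.txt",
            FilePath = @"C:\temp\hello.txt"
        });

        // GetObject: retrieve the same object and save it locally
        using (var response = client.GetObject(new GetObjectRequest
        {
            BucketName = "my-example-bucket",
            Key = "notes/hello.txt"
        }))
        {
            response.WriteResponseStreamToFile(@"C:\temp\hello-copy.txt");
        }
    }
}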

  • Amazon Web Services YouTube Channel

This is one of my favorite resources for studying for the AWS Certified Developer exam. The AWS YouTube channel has videos on re:Invent 2016, architecture, startups, and training. When I needed to learn more about VPC and EC2, I watched the re:Invent videos from last year, and they gave me a solid foundation in those services.

  • AWS Service FAQs

Lately, I’ve been reading the FAQs for the different services so I can take notes for future reference. For example, the FAQ for Amazon VPC has given me a strong foundation in how the service works, including the limitations on what the service can and cannot do.

This was a short post, but hopefully it will help other developers pass the certification exam. Wish me luck.

Categories
.Net ASP.NET ASP.NET MVC AWS CI

5 Easy Steps to Setup a .Net Build Server on Windows 2016


.NET developers have the luxury of using Visual Studio to write code. In my opinion, Visual Studio is one of the best IDEs on the market today. When you build your applications inside Visual Studio, you are using MSBuild to compile your code.

In the past, setting up a .NET build server without installing Visual Studio was challenging. Now that Microsoft is releasing more software as open source, setting up a .NET build server can be accomplished with no issues.

In this post, I want to share the steps I took to set up a .NET continuous integration server running on Windows 2016.

First, launch a new instance with Windows 2016. I’m using AWS, but any cloud provider will work as well.

After launching the new Windows 2016 server, it is time to install the necessary software to create our build server.

  1. Install Microsoft .NET Framework 4.5.2 Developer Pack.
  2. Install Microsoft .NET Framework 4.6.2 Developer Pack.
  3. Install Microsoft Build Tools 2015.
  4. Install Jenkins 2.19.3.
  5. Copy files from developers’ machine located at C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\WebApplications to the same exact location on the new Windows 2016 server.

So far I have only tested this build server with an MVC website targeting .NET Framework 4.5.2. Let me know if you run into any issues with this setup.

I hope to test different apps and also add support for .NET Core.

If you are using AWS, you can launch the same instance I have created by searching for “Jenkins Build Server 2.19.3”.

Categories
AWS Git Source Control

Getting started with AWS CodeCommit

Amazon is always creating new products and services. In November 2014, AWS announced new tools to help developers manage their source code and deployments. One of those tools was CodeCommit, a managed revision control service that hosts Git repositories.

In July 2015, AWS announced through its blog that CodeCommit was available for production use.

Let’s create a new repository to see CodeCommit in action:

  1. Create new repository
    – Sign in to the AWS Console, and go to Developer Tools > CodeCommit
    – Click on the Get Started link
    – Enter a repository name and description
  2. Share the new repository
    – To connect to your new Git repository, you have 2 options: SSH and HTTPS. To keep it simple, let’s go with HTTPS.
    – Create a customer managed policy for your repository (see the policy sketch after this list). Follow this article for complete instructions.
    – Create an IAM group and assign users to that group. Follow this article for complete instructions.
  3. Clone CodeCommit repo locally
    – If you are on Windows, you can open the command prompt and type:
    $ git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/yourRepoName
  4. Now you are ready to start making changes.
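As a sketch, the customer managed policy from step 2 could look something like this (the account ID is a placeholder, and you may want to allow additional codecommit actions):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["codecommit:GitPull", "codecommit:GitPush"],
      "Resource": "arn:aws:codecommit:us-east-1:123456789012:yourRepoName"
    }
  ]
}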

If you have used Git before, you can continue using the same tools as before. GitHub and Bitbucket also allow you to host public and private repositories. With AWS, all repositories are private and need specific permissions to be set up with IAM policies and groups.

I’m still playing with this new AWS service and plan to write more in-depth articles on it.


Categories
AWS Code Deployment CodeDeploy

Introduction to AWS CodeDeploy


It is 2015, and I still see organizations doing software deployments in ways that are error-prone. I have seen deployment instructions that span multiple pages. Those documents look like a TV manual. Do you know what we do with those manuals? We don’t read them. That’s right.

No one reads those long boring manuals.

I have also seen manual deployments. In manual deployments, a developer will take the artifacts and copy and paste those files to a production environment. Human errors are likely to occur with these manual deployments.

Don’t worry. I have good news for you.

I want to introduce a tool by Amazon Web Services that solves this issue. The tool is called AWS CodeDeploy. CodeDeploy allows you to practice continuous deployment. No more manual deployments and no more long documents with instructions on how to deploy your code.

The nice thing about CodeDeploy is that it allows you to deploy your applications to AWS instances and also to on-premises servers. Whether your code runs on Windows servers or Linux servers, AWS CodeDeploy can handle it for you.

I’m a .NET developer, so I have experience deploying .NET applications to Windows servers. In a nutshell, you have to follow these steps to start deploying your apps (a CLI sketch of the last step follows the list):

1. Set up an IAM user, instance profile, and service roles
2. Install the CodeDeploy agent on your AWS instances
3. Add an appspec.yml file to your application
4. Set up a new CodeDeploy application along with a deployment group
5. Finally, create a deployment for your application
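As a rough sketch, that last step can be done from the AWS CLI (the application, group, bucket, and key names below are placeholders):

aws deploy create-deployment --application-name MyApp --deployment-group-name Production --s3-location bucket=my-deploy-bucket,key=MyApp.zip,bundleType=zip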

I hope this brief introduction will help your software development team consider AWS CodeDeploy to automate your deployments. In future posts, I will go into more details about the setup. Enjoy!


Categories
AWS Cloud python

What I learned by reading the AWS CLI codebase


I’m a big fan of Amazon, especially Amazon Web Services. In this post, I want to share what I learned by reading the AWS CLI code base. For those of you not familiar with the acronyms, it stands for Amazon Web Services Command Line Interface. The AWS CLI allows you to manage your AWS services through a command line interface. You can download the CLI at http://aws.amazon.com/cli/. If you are using Windows, there is an installer available. If you are using Mac or Linux, you can install it with pip by running this command:

pip install awscli

After installing and configuring it, you can use the different services available to manage your resources. For example, to list all of my S3 buckets, I can run the command “aws s3 ls”, which prints the creation date and name of each bucket.


Now that you know more about the AWS CLI, let’s dive into the code base. The project is hosted on GitHub, and you can read the code at https://github.com/aws/aws-cli. It is written in Python and has unit, functional, and integration tests, plus an extensive set of examples of how to run commands.

The code is integrated with Travis CI and is tested against four versions of Python: 2.6, 2.7, 3.3, and 3.4. The Travis configuration file also runs the installation script and test scripts.
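A minimal sketch of what such a Travis configuration looks like (this is illustrative, not the project’s actual file):

# .travis.yml - illustrative sketch, not the aws-cli project's actual file
language: python
python:
  - "2.6"
  - "2.7"
  - "3.3"
  - "3.4"
install:
  - pip install -e .
script:
  - python -m pytest tests/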

There is also a CLI Dev Version section on the home page of the project. It gives you enough information to set up your environment and start contributing back to the project.

And finally, the project has documentation available. I hope this post gives you a brief introduction to the Amazon Web Services Command Line Interface’s code base.