2018 Year In Review

The year is coming to an end, and I want to take this opportunity to review what I accomplished. I also want to share the ups and downs along the way.


December 15, 2017 was my last day at MD Buyline, where I worked for two and a half years. For a very long time, I had tried to convince my wife that I needed to take a couple of months off from work, but she never agreed to a sabbatical. This was the perfect opportunity to take time off and spend it with the family. During this month, my wife became very sick with a cold/flu that refused to go away, and I was her nurse. She would get better one week and get sick again the next. Her immune system was very weak, which is why she couldn't recover sooner. I took my son to school every day and had a great time doing it. After coming back from school, I would prepare breakfast and clean the house. Some days I would go to the park to walk and relax. This month I wrote a couple of posts on AWS CodeDeploy, a continuous deployment service.

Creating AWS CodeDeploy Application Using .NET SDK


Same story with my wife's situation: the cold/flu refused to go away. She visited different doctors and nothing seemed to cure her. She continued resting and taking her medications. I began to worry because she was not getting better; she had been sick since December 2017, which is a long time. I'm glad I didn't have a full-time job during this period, because I had the opportunity to take care of my wife and kids. My wife's aunt lives with us and was able to help as well. She is an excellent cook, so food was not a problem for us. With more time on my hands, I was able to write two blog posts and continue gaining hands-on experience with AWS. The last week of February, I updated my resume and planned to look for a job the following month. I also wrote some Python code and created the following blog post:

Sending emails with AWS Python SDK


My wife was feeling better now. Thank God. I started looking for a job using LinkedIn, Dice, Indeed, and my network. I contacted Shawn about any job openings he might know of. There was no response for a couple of days, but then I was contacted by Justin, owner of JBHWorks, a call center company based in Lewisville, Texas. Justin explained that Shawn had given him my contact information, and we spoke about technology and software for 30 minutes over the phone. The following day we had lunch, and both parties agreed to work together. I was excited because it was a part-time job working from home. Who likes to deal with traffic?

Before I applied to any jobs, my wife told me to get a remote job. She was feeling better but was worried that she might get sick again, and if I worked remotely I would be able to assist her if needed. Since I'm an obedient husband, I applied to remote-only jobs through LinkedIn. After securing the part-time contract with Justin, Ryan, owner of TheWarehouseCo, replied to my application. I had a 15-minute Hangouts call with Ryan and, the next week, a one-hour technical interview. We agreed to work together and started in mid-March.

This was my first time working remotely. In other jobs, I was only allowed to work from home one day per week. This time it was just me and my Mac. It took me a couple of weeks to get used to working by myself without human interaction. I really missed those five-minute coffee breaks.

How to deal with 2 part-time jobs?

During the mornings, I wrote Web Forms code for JBHWorks. After lunch, I wrote .NET Core code for TheWarehouseCo. March 6 was my birthday, but I didn't celebrate it because my wife was not fully recovered. Sometimes we have to make sacrifices for the people we love.

I also wrote a blog post on using the AWS Python SDK with Simple Queue Service.

Using AWS Python SDK to Read Simple Queue Service Messages


April is one of my favorite months of the year; the weather is perfect. I took advantage of it and used my breaks from work to go to Kiest Park. It was an awesome experience to walk and listen to the birds. I continued working for both companies during this time and learned more about .NET Core, React, Docker, and Twilio.


I continued working for both companies. My daughter's birthday was coming up soon. On a Friday, we were playing outside and she had an accident: the rope on her swing broke and she fell hard on the concrete, landing on her right cheek. She hugged me and didn't want to let go. The next day we took Abby to see a doctor, and the x-ray showed a broken collarbone. We were devastated by the news. The doctor suggested seeing a specialist to determine whether surgery was needed. We went to Medical City, where the doctor took a new set of x-rays and reached the same conclusion: a broken collarbone. He explained that no surgery was needed since kids' bones are still growing.

I wrote about a very useful Jenkins plugin that integrates with AWS CodeDeploy.

Jenkins Plugin – AWS CodeDeploy


We went to see Abby's doctor again. The new x-rays showed the bones growing and coming together. Great news! As far as work is concerned, I was able to integrate two CRM providers: ConnectWise and AutoTask. I had a very difficult time with ConnectWise's upload-documents API, but after many tries I was able to integrate it with JBHWorks' services. I wrote a post with sample code to help other developers.

How to Upload Documents using ConnectWise API


During this month, both of my contracts ended. I spoke with my wife, and we decided I would take another break from work. Working as a contractor has advantages and disadvantages. One of the advantages is that you can plan how much you want to work: for example, commit to a three-month contract and then take a month off after it ends. I took this time to be with the family. My son was on vacation, so I used the opportunity to spend time with him. I also continued learning more about AWS. In 2017, I took the AWS Certified Developer – Associate exam and failed, but this time I had more time to study and prepare for a retake. I created my Azure account, began using different services, and wrote a blog post titled Azure Resource Group.

Azure Resource Group


I purchased all the supplies for my kids' school. My daughter started pre-K 4. She cried a little at home for the first two weeks. It's normal, right? Now she is having fun learning at school. My son, Samuel, helped his sister during this time and encouraged her to go to school.

I have only used two recruiters in my career: Robert Half and Prestige Staffing. Robert Half assisted me when I moved from PrintPlace to TailLight, and Prestige helped with my transition from Verizon Cloud to MD Buyline. For this transition, I worked exclusively with Prestige and went to a couple of interviews. I was offered a job, but Plano was too far of a commute for me, and I didn't perform well in the other interviews.

Once you start writing, it's difficult to stop. This month, I wrote three blog posts:

Understanding IAM policies

Host a website using AWS S3

How I landed my first job in IT


In addition to working with Prestige, I decided to get more help with my job search. I contacted Robert Half and worked with Ashley. After our initial talk, we agreed on salary expectations and a comfortable commute. She gave me a list with two companies: Elevate and Code America. With Code America, I went through this process: a non-technical phone interview, a coding problem through HackerRank, and a final face-to-face interview. With Elevate, the process was similar: a technical phone interview and a final face-to-face interview. During the same week, I received job offers from both companies, and in the end I decided to go with Elevate. To push myself to learn more about AWS services, I wrote a blog post about Lambda using .NET Core.

My First AWS Lambda Using .NET Core

This month I also became an AWS Certified Developer – Associate. It was my second attempt, and I passed with an 86 out of 100. On my first attempt, I received a 55 since I wasn't prepared to take the exam.


Now, with a full-time job, things have changed. I'm no longer able to work in my pajamas; I have to get up early and get dressed. At work, everyone has been very helpful so far. We use Git, Azure, Visual Studio 2017, Angular, and C#. Elevate has chosen Azure for its cloud needs, so I decided to learn more about it.

Resources to get you started with Azure


I wrote a blog post about failures. Even the cloud providers, with their extensive teams and infrastructure, have failed to keep their services running. I think software developers don't spend enough time designing for failures. What do you think?

Thinking About Failures


For the last few months, I have been studying for the AWS Certified Solutions Architect – Associate exam. I'm using A Cloud Guru, reading AWS whitepapers, and completing labs. I have my own AWS account, so I work through real-life scenarios to gain hands-on experience. I'm planning to take the exam next month.

Overall I think 2018 was a great year for me and my family. Happy holidays.



Thinking About Failures

Recently, I was listening to a re:Invent talk where Werner Vogels, AWS CTO, mentioned that everything fails all the time. As I was getting close to home, I started thinking about failures. Power failures. Hardware failures. Software failures. Software developers encounter failures while building applications, too. One day it might be a database issue; another day it might be a hardware issue that is preventing you from getting things done. One of the main problems is that developers are not wired to think about failures. We are builders. We get paid to create new software programs. If you look at job descriptions, you will not find many references to failures. In many instances there is no planning for failures at all.

We can learn a lot from Netflix, which pioneered chaos engineering. Netflix relies on AWS infrastructure to run all of its operations. At the beginning, every outage was an opportunity to ask questions without blaming anybody, and to improve the system. By asking the difficult questions, Netflix learned more about its strengths and weaknesses. Soon Netflix realized it needed to test these failures under complete control, so a dedicated team was created to bring chaos to Netflix's systems and processes. Without notice, an availability zone would be removed; on another day, an entire AWS region. When these failures occurred, Netflix's systems would stop sending traffic to the failing region and re-route it to a healthy one.
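That re-routing idea can be sketched in a few lines of C#. This is illustrative only: the region names are examples, and the service call is injected as a hypothetical delegate standing in for whatever client your system actually uses.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public class RegionFailoverClient
{
    // Hypothetical service call; in a real system this would hit an
    // endpoint in the given region.
    private readonly Func<string, Task<string>> callRegion;

    public RegionFailoverClient(Func<string, Task<string>> callRegion)
    {
        this.callRegion = callRegion;
    }

    // Try the primary region first; if it fails, re-route to a secondary one.
    public async Task<string> GetDataAsync()
    {
        try
        {
            return await callRegion("us-east-1");   // primary
        }
        catch (HttpRequestException)
        {
            return await callRegion("us-west-2");   // secondary
        }
    }
}
```

The point is not the two lines of failover logic but the mindset: the failure path is designed and tested up front instead of being an afterthought.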

We, as software developers, need to spend more time thinking about failures because they will happen sooner or later.

What do you think? Do software developers think enough about failures?


Resources to get you started with Azure

The cloud is here to stay. And with that in mind, we, as software engineers, have to keep our cloud skills up-to-date. AWS and Azure lead the cloud computing space in terms of services and revenue. For the last few years, I have gained hands-on experience with AWS and was able to get certified as a Developer Associate. Now is the perfect time to gain deeper knowledge of Azure services. In this post, I'm going to share resources to get you up to speed with Azure.

First, open up an account with Azure by visiting the Azure home page. The home page has a lot of resources to learn more about Azure solutions, products, documentation, pricing, training, marketplace, partners, support, blog, and more.

One of my favorite resources is the "Get started guide for Azure developers" page. It contains quickstarts, tutorials, samples, concepts, how-to guides, references, and other resources. I highly recommend downloading the SDK and building small apps. Nothing beats getting your hands dirty with code that calls Azure services. Currently Azure has SDKs for .NET, Node.js, Java, PHP, Python, Ruby, and Go.
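To give you an idea of how small such a first app can be, here is a sketch that uploads a text blob using the .NET storage SDK. The container and blob names are just examples, the connection string is assumed to come from an environment variable, and the package/class names are from the WindowsAzure.Storage NuGet package, so double-check them against the current SDK docs.

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;        // WindowsAzure.Storage NuGet package
using Microsoft.WindowsAzure.Storage.Blob;

class Program
{
    static async Task Main()
    {
        // The connection string comes from your storage account in the Azure portal.
        var account = CloudStorageAccount.Parse(
            Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING"));

        var client = account.CreateCloudBlobClient();
        var container = client.GetContainerReference("samples");
        await container.CreateIfNotExistsAsync();

        // Upload a tiny text blob to confirm the SDK is wired up correctly.
        var blob = container.GetBlockBlobReference("hello.txt");
        await blob.UploadTextAsync("Hello from the Azure SDK!");
        Console.WriteLine("Uploaded hello.txt");
    }
}
```

Even a toy app like this forces you through the real workflow: creating an account, grabbing credentials, and calling a service.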

Another resource I use frequently is reading Azure applications hosted on GitHub. When I'm unable to come up with a solution, I search GitHub for existing ones.

Azure Friday is another great resource to learn more about Azure services and offerings. Scott Hanselman and company have created high-quality videos showing new features. On average, these videos are 15 minutes long.

A Cloud Guru has courses to help you get started with Azure. They have an introduction course as well as courses that help you achieve certifications.

That’s it for this post. In future posts, I will target specific services and share my adventures learning Azure.

AWS Lambda

My First AWS Lambda Using .NET Core

As I prepare for the AWS Certified Solutions Architect – Associate exam, I need to get hands-on with more services. It's crucial to gain real experience with these AWS services; it's not enough to just read whitepapers and FAQs. I've heard good things about AWS Lambda, and now it's time to build something with it. In this post I want to share how I created my first Lambda function using .NET Core.

Before we dive into AWS Lambda, let’s understand what it is. Lambda is a service that allows you to run code without thinking about provisioning or managing servers. You upload your code and AWS handles the rest. Nice! Here is the official summary, “AWS Lambda lets you run code without provisioning or managing servers. With Lambda, you can run code for virtually any type of application or backend service – all with zero administration. Just upload your code and Lambda takes care of everything required to run and scale your code with high availability. You can set up your code to automatically trigger from other AWS services or call it directly from any web or mobile app.”

Now that we know what Lambda is, let's install the required software to create a Lambda function using .NET Core. First, install the Lambda templates from NuGet: using a terminal or command prompt, type "dotnet new -i Amazon.Lambda.Templates". This installs the Lambda templates so you can get up and running quickly. To test it, type "dotnet new" and press Enter. You should see the following templates:

As you can see from the screenshot above, there are two categories of templates: Lambda functions and Lambda serverless. To keep it simple, I'm going to use a simple Lambda function that integrates with S3. Now we need to install the AWS Lambda global tool: using a terminal/command prompt, type "dotnet tool install -g Amazon.Lambda.Tools".

With the required software installed, it's time to create our first Lambda using .NET Core. Using a terminal/command prompt, create a new directory called "firstLambda" and cd into it. Now type "dotnet new lambda.S3" to create a new function from the AWS Lambda templates. After creating the function, we need to update a config file with a profile and region. Using a text editor or IDE, open the new project and update the profile and region settings in aws-lambda-tools-defaults.json.

AWS Lambda will use these settings to deploy and run your function. Let’s take a look at the Function.cs file.
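The screenshot of Function.cs didn't survive here, but the lambda.S3 template generates code along these lines (trimmed slightly; check your generated project for the exact version):

```csharp
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.Lambda.S3Events;
using Amazon.S3;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.Json.JsonSerializer))]

namespace FirstLambda
{
    public class Function
    {
        IAmazonS3 S3Client { get; set; }

        public Function()
        {
            S3Client = new AmazonS3Client();
        }

        // Constructor used by the tests to pass in a preconfigured client.
        public Function(IAmazonS3 s3Client)
        {
            S3Client = s3Client;
        }

        // Triggered by an S3 event; returns the content type of the object.
        public async Task<string> FunctionHandler(S3Event evnt, ILambdaContext context)
        {
            var s3Event = evnt.Records?[0].S3;
            if (s3Event == null)
            {
                return null;
            }

            var response = await S3Client.GetObjectMetadataAsync(
                s3Event.Bucket.Name, s3Event.Object.Key);
            return response.Headers.ContentType;
        }
    }
}
```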

The constructor takes an IAmazonS3 object, and the async method FunctionHandler is where our main logic lives. Our Lambda function is triggered by an S3 event, like a delete-object or put-object event. Using the event information, we retrieve the object's metadata with GetObjectMetadataAsync and finally return the content type.

Let's deploy our first Lambda function to AWS using the CLI. Using a terminal/command window, type "dotnet lambda deploy-function agileraymond-1st-lambda" (I'm using agileraymond-1st-lambda as my function name). This command uses the profile and region in our config file, so you have to make sure permissions are set correctly; otherwise you will get errors. The command will also ask you to provide a role, or give you the option to create a new one. If you want to verify that your function made it to AWS, check the AWS Lambda console.

To test our new lambda function locally, we can use the test project that was created along with our new function.

Go back to the terminal window and type "dotnet test" to run the integration test. If everything is set up correctly, you will see one passing test. That's it for this post. In a future post, I'm going to test the function from the AWS console.


Understanding IAM policies

One of the most critical components of any system is security. In AWS, security is at the top of the list. With Identity and Access Management (IAM), you can create users, roles, policies, and groups to secure your AWS resources. In this post, I'm going to share how to secure an S3 bucket by creating a new user with limited access. Let's get started.

Create a new user

To create a new user, sign in to the AWS console and select IAM. Select Users from the left menu and click Add user. Enter a user name and select Programmatic access in the access type section.

Click Next. Since we don’t have a policy in place, click Next again.

Now it's time to review our new user. Notice that AWS displays a warning that this user has no permissions. Click Next.

We're at the final step of creating our new user. Click the Download .csv button. The file contains the access key id and secret access key; we'll use them with the AWS CLI to access S3 buckets. You can also click the Show link below the secret access key header.

Now that we have our user ready, it's time to create a new policy with limited permissions to an S3 bucket. Click the Policies link in the left-side menu, then click Create policy.

There are two ways to create your policy: using the visual editor or using a JSON document. For this exercise, I'm going to specify the policy in JSON. Click the JSON tab next to the Visual editor tab and paste the JSON below.
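The policy JSON didn't make it into this export. Based on the description that follows (allow PutObject on the agileraymond-s3 bucket), it looked like this; treat the Sid and exact formatting as a reconstruction:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPutObject",
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::agileraymond-s3/*"
        }
    ]
}
```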

This simple policy allows the S3 PutObject action on a bucket named agileraymond-s3. As you can see, the policy is limited in what it can do. AWS recommends that you follow the principle of least privilege: only give access to the resources your application needs. Click Next and finally create your new policy.

With our new user and policy in place, we have to attach the policy to the user. Select your user and click the Add permissions button.

Click the "Attach existing policies directly" tab and filter policies by selecting "Customer managed" from the filter menu next to the search input.

Click Next, review your changes, and finally add the permissions. We're ready to test our new user and its permissions using the AWS CLI. In a terminal/command prompt, type "aws configure" and enter the access key, secret access key, region, and output format. Make sure you select the same region where your resources live; in my case, I selected us-east-1 because that's where my bucket resides.

Now, type "aws s3 ls" in your terminal window. You should see an error, since this user doesn't have permission to list buckets; we only granted PutObject on one bucket. To upload a file to our S3 bucket, type "aws s3 cp myfile.txt s3://yourbucketname". If you go back to the AWS console, you should see myfile.txt inside your bucket.

In conclusion, secure your resources by default: create new users with limited permissions and give them access only to the resources they need. See you next time.

AWS General

Host a website using AWS S3

Simple Storage Service (S3) was one of the first services offered by AWS. With S3 you can store your files in the cloud. In addition to storing files, S3 lets you host a static website. In this post, I will share how to accomplish this task using the S3 console.

First, log in to the AWS console. Then go to the S3 console and create a bucket. To keep it simple, a bucket is like a folder or directory on your computer. For this example, I'm using agileraymond-web for my bucket name and US East (N. Virginia) for my region. Click the Create button to create your bucket. With our bucket in place, we can enable it to host a static site: select your bucket and click on the Properties tab.

Now click anywhere in the "Static website hosting" section and select "Use this bucket to host a website". I'm going to use index.html for my index page and error.html for my error page. Click Save. Go ahead and create these two HTML files. To upload them, click on the Overview tab and click Upload.

Add your files and click the Upload button. In the overview section of your bucket, you will see the two files. Currently the bucket and these files are private. Since we are hosting a static website and other people need access to it, we have to update the bucket permissions. Go to the bucket's Permissions tab and select Bucket policy. Copy and paste the policy below, making sure to update the resource name; in my case, the bucket name is agileraymond-web, but yours will be different.
"Principal": "*",

Click Save. After saving your policy, you will see the following message: "This bucket has public access. You have provided public access to this bucket. We highly recommend that you never grant any kind of public access to your S3 bucket." For now, ignore this warning, since the bucket is intentionally acting as a public website. This policy grants read access to every object placed in the bucket. It's time to test our new website. To get the URL, go to the bucket's properties and click on Static website hosting; next to "Endpoint" you will find the URL. Copy and paste it into a new browser window and add /index.html to the end. If everything is set up correctly, you will see the index.html page.

To test the error page, go ahead and delete index.html, then try to browse to it again. Now you should see the error page, since index.html doesn't exist anymore. As you can see, it's very easy to create a static website using S3. See you soon!


How I landed my first job in IT

Before I tell you about my first job in IT, let me give you some background. During my last year at Southern Methodist University, I got my resume ready and started applying for different IT jobs. I attended a couple of job interviews, but none of them resulted in job offers. I graduated in May of 2001 and decided to take a break from my job search, continuing to work with my parents in their small furniture store. From 2001 to 2008, I devoted my time to improving the store and increasing sales. However, the store was in a bad financial position. My brother, JR, secured a job with the City of Dallas as a code inspector, and after he left the store, I also started applying for IT jobs. I was desperate to get into IT, so I applied to dozens of places and went to dozens of interviews. Most of the hiring managers told me they were looking for more experienced developers; my only experience at that time was school projects and applications I had built for the furniture store. I was very disappointed and almost gave up my job search again, but this time I was determined to get a job as a software developer, or any position in IT. I posted my resume on different job sites like Dice, Monster, and others.

I received a call from James Paul, co-founder of PrintPlace. I couldn't believe that someone was calling me about a job in IT. He gave me a brief description of the job and asked me to come to their offices for a face-to-face interview. The next morning I met James and Nic. The interview went well, and the next step in the process was to speak with John, the software architect; I answered most of his questions correctly. Finally, I spoke with Shawn, founder of PrintPlace, and he offered me the job. I was so happy: I was finally going to start my career as a software developer. In this role, I wore many hats: desktop support, setting up phones, setting up servers, and some .NET coding.

Now it's your turn. How did you land your first job in IT?

Azure Cloud

Azure Resource Group

AWS has been my cloud provider for many years. I have used it to host .NET applications, SQL Server databases, and other services like email, storage, and queues. I have gained valuable experience on that platform, but it's time to play with Azure. In this post, I want to share how to create resource groups and what their benefits are.

First, let's create a resource group. After you log in to the Azure portal, click Resource groups in the left menu. Now click the Add button, enter a valid resource group name, and select a subscription and location. For this example, I'm using dev-resource-group, Pay-As-You-Go, and South Central US.

A resource group is a container that holds related resources, like web sites, storage, and databases, so we can manage them as a single unit. I like to think of a resource group as a dinner plate: I use the plate to hold my food (meat, vegetables, dessert, etc.), and once I'm done eating I can throw away the plate along with any food that is left.

Now let's add a new app service. Click the App Services link in the left menu and click Add. In the web apps section, select WordPress on Linux and click the Create button. Enter the required fields and make sure you select the resource group created in the previous step.

Just to verify that our resource group is associated with our new WordPress site, click Resource groups again and select the resource group. I see three items associated with mine: the app service, an Azure Database for MySQL server, and the app service plan.

Let's create one more app service. Choose Web App, click Create, and fill in all the required fields, making sure you select the same resource group as in the previous step. In the OS section, I selected Docker and configured the container.

Now our resource group has a total of four items in it. These items belong to my dev environment, and I'm ready to delete this environment since I no longer need it. Select the resource group, click the three dots, and select Delete resource group.

Type the resource group name and click the Delete button. After a few minutes, the four items associated with our resource group will be deleted along with the resource group itself. As you can see, resource groups allow us to group resources in a cohesive way so we can manage them more easily. I have seen resource groups used for different environments, like dev, test, and production. Once you are done with your resources, just delete the resource group; it will save you a lot of time and effort.


How to Upload Documents using ConnectWise API

For the last couple of months, I have been using the ConnectWise API to integrate it with our custom software solution. It was fairly easy to add new companies, customers, tickets, and opportunities. Recently I was asked to add the ability to upload documents. After reading the documentation, I coded a solution, but it didn't work as I expected. After much trial and error, I was able to add documents using the ConnectWise API. In this post, I want to share my C# code to add system documents using the ConnectWise API.

Here is the C# code to upload a document.
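The original snippet was lost in this export. The sketch below reconstructs the general shape of a system/documents upload: a multipart/form-data POST carrying the record metadata and the file itself. The endpoint URL, form field names, and credential format are from memory of the ConnectWise REST API, so verify them against the current API documentation; the company id, keys, record type, and record id are all placeholders.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public class ConnectWiseDocumentUploader
{
    private static readonly HttpClient client = new HttpClient();

    public static async Task<string> UploadDocumentAsync(string filePath)
    {
        // Placeholder endpoint; use your region's API host and version.
        var url = "https://api-na.myconnectwise.net/v4_6_release/apis/3.0/system/documents";

        // Basic auth: "companyId+publicKey:privateKey", base64-encoded (placeholders here).
        var credentials = Convert.ToBase64String(
            Encoding.ASCII.GetBytes("mycompany+publicKey:privateKey"));
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Basic", credentials);

        using (var form = new MultipartFormDataContent())
        {
            // Metadata parts tell ConnectWise which record the document belongs to.
            form.Add(new StringContent("Ticket"), "recordType");
            form.Add(new StringContent("12345"), "recordId");
            form.Add(new StringContent("My Document"), "title");

            // The file itself goes in its own part.
            var fileContent = new ByteArrayContent(File.ReadAllBytes(filePath));
            fileContent.Headers.ContentType =
                MediaTypeHeaderValue.Parse("application/octet-stream");
            form.Add(fileContent, "file", Path.GetFileName(filePath));

            var response = await client.PostAsync(url, form);
            return await response.Content.ReadAsStringAsync();
        }
    }
}
```

The part that tripped me up originally was the multipart layout: the metadata fields and the file must be separate parts of the same request, not query-string parameters.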

I hope someone else can use this code to upload documents. There is still room for improvement in ConnectWise's system/upload documentation, and the C# SDK does not include upload sample code. I also want to mention that most of the C# code used to upload the document was taken from the internet. See you soon.



Last month I started working as an independent software developer. I found two contracts writing .NET code: the first through a referral, and the other through a listing on LinkedIn. Both of these jobs allow me to work remotely. In this post I want to provide a quick update on these two contracts.

CRM Integration

In this job, I'm integrating two CRM providers, ConnectWise and AutoTask. These CRM providers allow you to create companies, contacts, tickets, opportunities, etc. I'm using Visual Studio Team Services as our source code management tool. We're using Web Forms for this project, and on the JavaScript side we're using jQuery, Knockout, and Bootstrap.

SMS marketing platform

In this job, we're creating a new SMS marketing platform using Twilio, Docker, AWS, .NET Core 2, ServiceStack, and Git. One of the challenges I have faced is learning these technologies, since my knowledge in these areas was limited. Let me give you an example. Every time we commit code, our code base is built with Travis, and after the code is packaged it gets deployed to a Docker container. Since I'm new to Docker, I had no idea how to debug code running in a container. The strange thing was that my code worked locally, but the same code was not working in QA. After asking other developers, we came to the conclusion that the issue was not the code: during a git merge, a line had been removed that affected our deployment logic.

That’s it for now. See you next month.