Writing automation tools for Amazon Web Services


Amazon Web Services (AWS) is one of the leading public cloud providers and is used by a wide range of businesses, from fledgling startups to large companies like Netflix.

At codebender we use Amazon Web Services to host our infrastructure: EC2 for our servers, S3 for our user libraries, and RDS for our MySQL databases.

In this article, I am going to show how you can use Amazon's Python SDK to automate your routines in AWS and make your day more productive.


The AWS Management Console is the most common way to manage your resources, but it has its limitations: when you have to perform many actions across different services, the user interface slows you down.

This is where the command line comes in.

Amazon already provides a command-line management tool called awscli. It is handy for performing basic operations manually, but repetitive tasks with different parameters and options quickly become cumbersome, so you end up writing shell scripts. A small shell script built on awscli commands may be enough for a single task, but when you have to manage multiple services, you need more control and flexibility.

Enter Boto.

Boto is the Python SDK for Amazon Web Services. It provides an easy-to-use API for the majority of Amazon services, such as EC2 and S3.

The boto project consists of several modules. The most well-known are:

  • Boto, the original module used to write Python tools that interact with AWS services.
  • Boto 3, a rewrite of Boto and the next stable version of the SDK.
  • Botocore, a low-level interface to AWS. Both Boto 3 and awscli are built on top of Botocore.

Some basic cases where you can use Boto:

  • Start/Stop EC2 instances
  • Assign Elastic IPs
  • Register/de-register instances with an Elastic Load Balancer
  • Create snapshots for your instances
  • Check the state of your instances

The cases do not stop there. You can perform many more complex tasks with the SDK.
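The first case above, for example, reduces to a couple of API calls. Here is a minimal sketch, assuming boto3 is installed and your AWS credentials are configured; the region default and the example instance ID are placeholders:

```python
# A sketch of starting/stopping EC2 instances by ID.
# Assumes boto3 is installed and credentials are configured
# (e.g. via ~/.aws/credentials); region and IDs are placeholders.

def ec2_client(region='us-east-1'):
    import boto3  # deferred import so the helpers can be defined without boto3
    return boto3.client('ec2', region_name=region)

def stop_instances(instance_ids, region='us-east-1'):
    """Request a stop for the given instances (AWS completes it asynchronously)."""
    ec2_client(region).stop_instances(InstanceIds=list(instance_ids))

def start_instances(instance_ids, region='us-east-1'):
    """Request a start for the given instances."""
    ec2_client(region).start_instances(InstanceIds=list(instance_ids))

# Example usage (placeholder instance ID):
# stop_instances(['i-0123456789abcdef0'])
```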

Now, I am going to scratch the surface of boto3 and show you a code example. For the purposes of the article, I wrote a small tool that retrieves basic information from the running EC2 instances:
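A minimal sketch of such a tool, assuming boto3 is installed and AWS credentials are configured (e.g. in ~/.aws/credentials and ~/.aws/config), might look like this:

```python
# instance_info.py - print basic details for every running EC2 instance.
# Sketch only: assumes boto3 is installed and credentials/region are
# configured; instances are assumed to carry a "Name" tag.

def get_name(instance):
    """Return the instance's Name tag, or an empty string if it has none."""
    for tag in instance.tags or []:
        if tag['Key'] == 'Name':
            return tag['Value']
    return ''

def main():
    import boto3  # deferred so get_name() can be reused without boto3 installed
    ec2 = boto3.resource('ec2')
    running = ec2.instances.filter(
        Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])
    for instance in running:
        print('Name: %s' % get_name(instance))
        print('Type: %s' % instance.instance_type)
        print('State: %s' % instance.state['Name'])
        print('Private IP: %s' % (instance.private_ip_address or ''))
        print('Public IP: %s' % (instance.public_ip_address or ''))
        print('Launch Time: %s' % instance.launch_time)

if __name__ == '__main__':
    main()
```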

The expected output is:

Name: instance-name-0
Type: m3.medium
State: running
Private IP:
Public IP:
Launch Time: 2015-09-20 04:44:16+00:00
Name: instance-name-1
Type: m3.xlarge
State: running
Private IP:
Public IP:
Launch Time: 2015-10-01 08:27:00+00:00

Thanks to the extensibility of Python, you can integrate other modules into your tools to interact with external APIs. You could add, for instance, support for Slack notifications or email reports via Intercom. Your tools could also be used in conjunction with cron daemons to run regularly on your servers.
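As a sketch of the Slack idea, assuming you have created an incoming webhook in your workspace, the notification boils down to a single HTTP POST from the standard library. The webhook URL, channel, and message below are placeholders:

```python
# Posting a message to a Slack incoming webhook (sketch; the webhook
# URL and channel are placeholders you would create in your workspace).
import json
from urllib.request import Request, urlopen

def build_payload(text, channel='#ops'):
    """Build the JSON body that Slack incoming webhooks expect."""
    return json.dumps({'channel': channel, 'text': text})

def notify(webhook_url, text, channel='#ops'):
    """POST the message to the given Slack incoming webhook URL."""
    request = Request(webhook_url,
                      data=build_payload(text, channel).encode('utf-8'),
                      headers={'Content-Type': 'application/json'})
    urlopen(request)

# Example usage (placeholder URL):
# notify('https://hooks.slack.com/services/...', 'Backups finished')
```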


  1. While implementing a tool, keep security in mind. Rotate your AWS Access Keys on a regular basis or use temporary security credentials. If a machine is compromised, the access keys stored on it are compromised too.

  2. Always create strict IAM policies for your tools. For example, a custom tool that only extracts information from S3 buckets should not be able to delete or add objects.

  3. Although writing your own custom tools is great, do not reinvent the wheel. There are plenty of open source AWS tools on GitHub; s3cmd and aws-missing-tools are two examples.

  4. Do not try to replace other Amazon services (e.g., OpsWorks, Elastic Beanstalk, CloudFormation); it's more trouble than it's worth. Either use these services or add other mature alternatives to your stack.
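As a concrete sketch of tip 2, a least-privilege policy for an S3 information extractor would allow reads only. The bucket, user, and policy names below are placeholders, and the attach step assumes boto3 with IAM permissions:

```python
# A read-only IAM policy for a hypothetical S3 information-extractor
# tool. Bucket/user/policy names are placeholders.
import json

READ_ONLY_POLICY = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        # Only listing and reading; no PutObject/DeleteObject.
        'Action': ['s3:ListBucket', 's3:GetObject'],
        'Resource': [
            'arn:aws:s3:::my-bucket',
            'arn:aws:s3:::my-bucket/*',
        ],
    }],
}

def attach_policy(user_name, policy_name='s3-read-only'):
    """Attach the read-only policy inline to an IAM user."""
    import boto3  # deferred; requires credentials with IAM access
    iam = boto3.client('iam')
    iam.put_user_policy(UserName=user_name,
                        PolicyName=policy_name,
                        PolicyDocument=json.dumps(READ_ONLY_POLICY))
```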


Identify the patterns of your repetitive tasks on AWS Console and try to automate them if possible. Use Python's power and Boto's modular architecture to write awesome tools and share them with the open source community.

Now go automate your routines in AWS and focus on your primary tasks!