Continuous integration with Docker deployments: the players

Docker is a relatively new open platform for building, shipping, and running distributed applications. Initially it was used mainly to create development environments, allowing applications to be easily tested in controlled, reproducible environments. More recently, as people have gotten a better feel for what it can do, it has also been adopted for continuous integration, Platform as a Service (PaaS), and production deployments.

In this blog post I will discuss the ingredients needed for effective continuous integration and deployment using Jenkins and Docker. In a later post, we’ll talk about the process itself.

Continuous integration

Continuous integration (CI) is an organizational practice that aims to improve software quality and development speed by running regular, automated unit tests against new code. Using a version control system, development teams regularly push new code from a project's branches back to the main branch, allowing tested code to be quickly merged and verified as deployable. A popular unit-testing framework for Java code is JUnit.
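
As a rough sketch (the repository URL and the Maven/JUnit commands are illustrative assumptions, not from the original setup), the test step a CI server runs on each push can be as simple as:

    # Fetch the code that was just pushed and run the JUnit suite;
    # a non-zero exit code marks the build as failed.
    git clone https://github.com/example/myapp.git
    cd myapp
    mvn test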

Continuous deployment

Continuous deployment is an automated process that ensures your application is always ready to deploy to production or development environments. By using both continuous integration and continuous deployment, development teams can always be ready to quickly deploy reliable builds and patches.
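
As a hedged sketch of the deployment half (the image name, container name, and ports are illustrative), keeping an environment ready to deploy can boil down to replacing a running container with the latest image that passed CI:

    # Pull the most recent image that passed CI and swap it in.
    docker pull example/myapp:latest
    docker stop myapp || true     # stop the old container if one is running
    docker rm myapp || true       # remove it so the name can be reused
    docker run -d --name myapp -p 80:8080 example/myapp:latest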

Ingredients

  • Jenkins is an open source continuous integration server.
  • Docker containers allow developers and system administrators to quickly and easily port applications with all of their dependencies and get them running across systems and machines.
  • Dockerfiles are build scripts that Docker reads to assemble new images layer by layer; the resulting images are then used to launch containers (see the example Dockerfile just after this list).
  • Amazon EC2 Instances are used to host multiple Docker containers.
  • Amazon S3 buckets are used to store build artifacts.
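
As an example of the Dockerfile ingredient mentioned above (the base image, packages, and artifact path are illustrative assumptions), a minimal image for a Java web application might look like this:

    # Dockerfile: package a hypothetical Java web application and its dependencies.
    FROM ubuntu:14.04

    # Install the Java runtime the application needs.
    RUN apt-get update && apt-get install -y openjdk-7-jre-headless

    # Copy in the build artifact produced by the CI job.
    ADD target/myapp.jar /opt/myapp/myapp.jar

    # Expose the application port and define the startup command.
    EXPOSE 8080
    CMD ["java", "-jar", "/opt/myapp/myapp.jar"]

Running docker build -t myapp . against this file produces an image that any Docker host, such as the EC2 instances above, can launch with docker run.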

Jenkins

Jenkins is the open source continuous integration server at the heart of this setup: it runs the automated tests and build steps each time new code is pushed.
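
As a hedged sketch (the Maven commands, bucket name, and artifact path are illustrative, not prescriptive), an "Execute shell" build step in a Jenkins job might stitch these pieces together:

    # Jenkins "Execute shell" build step (illustrative).
    # 1. Run the unit tests; the build fails if the tests fail.
    mvn test

    # 2. Build a Docker image tagged with the Jenkins build number.
    docker build -t myapp:${BUILD_NUMBER} .

    # 3. Store the build artifact in an S3 bucket for later deployments.
    aws s3 cp target/myapp.jar s3://example-artifact-bucket/myapp-${BUILD_NUMBER}.jar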

How Docker works

Docker was originally built on Linux Containers (LXC), an operating-system-level virtualization technology for running multiple isolated systems (containers) on a single control host. The main difference between KVM-style virtualization and Linux containers is that virtual machines each require their own kernel instance, while containers share the host operating system's kernel. A container is similar to a chroot, but offers much stronger isolation. From the user's point of view, Docker behaves much like a virtual machine, wrapping everything an application needs (file system, process management, environment variables, etc.) into a container. Docker really does let you "build once, configure once, and run anywhere."
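
A quick way to see that kernel sharing in practice (the ubuntu image is just an example):

    # The host and a container report the same kernel, but different userlands.
    uname -r                                     # kernel version on the host
    docker run --rm ubuntu uname -r              # the same kernel version, reported from inside a container
    docker run --rm ubuntu cat /etc/os-release   # but the container's own distribution and userland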

[Figure: Docker container]

Docker Architecture Components

  • File system: a container can only access its own sandboxed file system.
  • User namespace: a container has its own user database, so a container's root user is not the same as the host's root account.
  • Process namespace: processes inside a container cannot see or access processes on the host machine or in other containers.
  • Network namespace: a container gets its own virtual network device and IP address. (The sketch after this list peeks at each of these boundaries from inside a container.)
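
A short, hedged sketch of how these boundaries look from inside a running container (the ubuntu image and commands are examples, not a prescribed setup):

    # Start a throwaway interactive container, then run the commands below inside it.
    docker run --rm -it ubuntu /bin/bash

    # File system: only the container's sandboxed root is visible.
    ls /

    # Users: this root account exists only in the container's own user database.
    id

    # Processes: only the container's processes appear, with the shell as PID 1.
    ps aux

    # Network: the container has its own interface and IP address
    # (install iproute2 first if the image does not ship the "ip" tool).
    ip addr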

Common use cases of Docker

  • Automating the packaging and deployment of applications.
  • Creation of lightweight, private PaaS environments.
  • Automated testing and continuous integration/deployment.
  • Deploying and scaling web apps, databases and backend services.
  • Sharing your images through the Docker Index, Docker's public registry (see the push example below).
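
For that last item, a minimal sketch of publishing an image (the account and image names are made up):

    # Tag the locally built image under your registry account and publish it
    # (log in first with: docker login).
    docker tag myapp exampleuser/myapp:1.0
    docker push exampleuser/myapp:1.0

    # Any other Docker host can then pull and run it.
    docker pull exampleuser/myapp:1.0
    docker run -d exampleuser/myapp:1.0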

Continuous integration: conclusion

So far we have covered the terminology and the players involved in continuous integration with Docker and Jenkins. In a future post, we will walk through the practical workflow and processes and show how to use them with Docker.

Nitheesh Poojary

My professional IT career began nine years ago, when I was fresh out of college. I then moved into infrastructure management, where I worked with a great team managing hundreds of enterprise application servers. My passion finally met opportunity when I got assignments on cloud technologies. I'm addicted to AWS cloud services and to the DevOps tools and practices that make engineers' lives easier. I currently work as a Technical Specialist at a reputable firm in Bangalore. I'm a certified AWS and SysOps engineer, happily helping fellow engineers across the globe through my blog (http://nitheeshp.blogspot.com) and by answering questions in various forums. When I'm not solving problems for my projects, I play sports, hang out with friends, or travel to places on my wish list.
