Top Docker best practices for container management
Many admins use Docker for container management, so they should explore best practices such as container backup procedures and Dockerfile commands.
Containerization in and of itself offers various benefits, such as reduced overhead, improved portability and better application development. Docker helps increase the advantages of containers while also providing repeatable development, build, test and production systems.
To ensure successful Docker deployments, admins should implement several Docker best practices that include the use of Dockerfile commands, container backup procedures, Nginx load balancer basics and cloud container deployments.
Containerize applications with Dockerfile commands
Though there are numerous Docker commands admins can use to help manage their containers, Dockerfiles are useful for defining those commands in a repeatable way, which supports efficient application development, limits resource contention and speeds up deployment. Some of the more commonly used Dockerfile commands include the following; a minimal example Dockerfile appears after the list:
- ADD. Copy files from a source on the host to a specific destination on a container's filesystem.
- CMD. Set the default command that executes when a container starts.
- ENTRYPOINT. Set the default application that the system will use every time admins create a container with the image.
- ENV. Set environment variables, in a similar fashion to a startup script.
- EXPOSE. Reveal a specific port for networking to and from a container.
- FROM. Define the base image from which the build process starts.
- RUN. Execute a command during the image build; RUN is the central executing directive of a Dockerfile.
- USER. Set the user ID under which the container runs.
- VOLUME. Enable access from a container to a path name on a host machine.
- WORKDIR. Set the working directory the system should use when executing subsequent commands.
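As a point of reference, here is a minimal, hypothetical Dockerfile that combines several of these commands; the base image, port and file names are assumptions chosen for illustration only.

# Hypothetical example: package a simple Python web application
FROM python:3.12-slim
# Set an environment variable available at build and run time
ENV APP_ENV=production
# Set the working directory for the commands that follow
WORKDIR /app
# Copy the dependency list and application code from the host
ADD requirements.txt .
ADD app.py .
# Execute a command during the image build
RUN pip install -r requirements.txt
# Document the port the application listens on
EXPOSE 8000
# Run the container as a non-root user
USER nobody
# Default executable and arguments used when the container starts
ENTRYPOINT ["python"]
CMD ["app.py"]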
Admins must account for memory usage as they build containers. If admins have memory limitations, they should use a minimal base image, such as Alpine, rather than a full OS image.
In addition, admins should familiarize themselves with Kubernetes. Though Docker offers Docker Swarm as an orchestration product, most major cloud vendors support Kubernetes rather than Docker Swarm for container orchestration.
5 container backup best practices for Docker
Without proper management techniques, admins risk complicating their container deployments. If backups are not handled correctly, admins risk restoring the wrong component versions and ending up with incompatible components. There are a few container backup best practices that admins can use to ensure their Docker containers remain organized and available.
Keep track of version numbers. Admins who track the version number of each Docker container component ensure that they do not restore a backup of an older component, which can lead to incompatibilities.
Back up a single manager node periodically. Backing up at least one manager node ensures that admins can keep track of restorations as they occur. Each manager node holds identical Docker Swarm and Universal Control Plane (UCP) data, so admins only have to back up a single manager node; a sketch of this procedure follows the list.
Check manager availability prior to backups. Because backups affect UCP, the UCP interface displays a warning any time admins execute a Swarm or UCP backup, and the UCP manager process pauses while the backup runs. As a result, admins should provision clusters with at least five managers so the cluster can continue to function if a manager stops.
Back everything up separately. Docker requires admins to separately back up each container component, such as Swarm, UCP and Docker Trusted Registry.
Restore a node backup as a last resort. Admins who use a high-availability deployment should only restore a node backup if no other options are available. Docker recommends replacing unhealthy nodes with new ones rather than restoring a node to ensure the node stays in a healthy state.
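The following is a minimal sketch of backing up Swarm data from a single manager node, assuming a systemd-based host and /tmp as the backup destination; admins should verify the exact procedure against the Docker documentation for the version in use.

# Run on one manager node only; stopping Docker pauses that manager.
sudo systemctl stop docker
# Archive the Swarm state directory, which holds certificates and Raft data.
sudo tar -czvf /tmp/swarm-backup.tar.gz -C /var/lib/docker swarm
# Restart Docker so the manager rejoins the cluster.
sudo systemctl start docker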
Use the Nginx load balancer to control Docker containers
The Nginx load balancer is a tool for admins to better control Docker containers and achieve both high availability and scalability. To set up this configuration, admins must first install two Ubuntu instances on separate hosts, then configure the hosts with static IP addresses.
Once that's complete, admins should use the sudo apt-get update, sudo apt-get upgrade and sudo apt-get install docker.io -y commands on each host to install Docker and prepare it to run an Nginx web server in a container.
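On each of the two web server hosts, those commands look like the following:

# Update package lists, upgrade installed packages and install Docker.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install docker.io -y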
Admins can then deploy a Docker instance on each host but should store any webpages outside of the container prior to deployment; this ensures that all Docker instances serve the same code.
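A minimal sketch of such a deployment, assuming the webpages live in /home/admin/www on the host and the official nginx image is used, might look like the following:

# Run Nginx in a container, bind-mounting the host's webpage directory
# over the image's default document root and publishing port 80.
sudo docker run -d --name web -p 80:80 \
    -v /home/admin/www:/usr/share/nginx/html:ro nginx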
Once the webpages are set, admins can configure the Nginx load balancer. To do this, admins must install a third Ubuntu server on a VM, then use the sudo apt-get update, sudo apt-get upgrade -y and sudo apt-get install nginx -y commands to install the necessary Nginx components. Afterward, admins should reboot the VM and reconfigure Nginx from a standard web server into a load balancer.
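On that third VM, the installation and reboot steps look like the following:

# Install Nginx on the load balancer VM, then reboot it.
sudo apt-get update
sudo apt-get upgrade -y
sudo apt-get install nginx -y
sudo reboot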
From there, admins should back up the configuration file, called nginx.conf, then delete the file's contents and replace them with the following code:
# Minimal nginx.conf for round-robin load balancing
# The empty events block is required for a valid configuration
events {}

http {
    # Pool of back-end Docker web server hosts
    upstream backend {
        server 192.168.0.97;
        server 192.168.0.98;
    }

    server {
        listen 80;

        # Forward all incoming requests to the upstream pool
        location / {
            proxy_pass http://backend;
        }
    }
}
To load the new configuration, admins can run sudo systemctl reload nginx.
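Before reloading, admins can validate the syntax with sudo nginx -t; after the reload, a few test requests should alternate between the two back-end hosts. The load balancer address below is an assumption for illustration.

# Responses should round-robin between the two web server hosts.
curl http://192.168.0.99
curl http://192.168.0.99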
Deploy a Docker container to the cloud
When admins deploy a Docker container to the cloud, they should know that AWS and Azure do not offer stand-alone instances of Docker-optimized hosts. Rather, both services offer Kubernetes clusters that perform much of the deployment process for admins.
Admins who decide to deploy single container instances should create a standard VM, such as an Ubuntu VM, with sufficient resources.
Next, admins should use public key authentication to secure the publicly exposed Secure Socket Shell (SSH) port and mitigate any security risks. Once the SSH client connects to the host, admins can use the sudo apt-get install docker.io -y command to install Docker.
Once admins install Docker, they can deploy the sample application with the docker run docker/whalesay cowsay command.
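Put together, the single-host sequence might look like the following sketch; the host address and the message passed to cowsay are assumptions for illustration.

# Connect to the VM over SSH with key-based authentication.
ssh -i ~/.ssh/id_rsa admin@203.0.113.10

# Install Docker, then run the sample image to confirm everything works.
sudo apt-get install docker.io -y
sudo docker run docker/whalesay cowsay "Hello from Docker"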
Once this process is done, admins can then deploy their containers to the cloud. For Azure, admins may find it easier to use the Azure portal's web interface rather than the Azure command-line interface.
For AWS, Amazon offers admins Amazon Elastic Container Registry (Amazon ECR), which provides Docker image storage.
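As a point of reference, pushing a local image to Amazon ECR typically follows a pattern like the one below; the repository name, account ID and Region are placeholders, and the AWS CLI must already be installed and configured.

# Create a repository, authenticate Docker to ECR, then tag and push the image.
aws ecr create-repository --repository-name my-app
aws ecr get-login-password --region us-east-1 | \
    docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
docker tag my-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest
docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app:latest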