Adopting Docker in Development and Production
The software development process has seen trends come and go over the years. More recently we have seen the rise of the “No Code” movement, which leverages advances in AI and machine learning to speed up and simplify development. No-code development is essentially drag-and-drop development, and it allows anyone to build applications. But we are not here to talk about how machines will take over everyone’s jobs; we are here to discuss the ever-growing adoption of Docker.
Docker
Over the past couple of years, Docker has seen more and more usage by companies and hosting services… and for good reason. It’s the quickest and easiest way to containerize your applications, which is a huge advantage when pushing apps directly to servers. It has proven very beneficial to developers, providing multiple advantages:
- Lightweight: running virtual machines is far more demanding than running multiple containers with Docker, since containers virtualize the OS rather than the hardware. As such, containers are faster to create and lighter to run.
- Portability: running your application in a container lets you test it in the same environment it will run in when deployed to a server. Because a container bundles its own dependencies, it behaves the same way on any machine.
- Isolation: containers run isolated from other processes, so you can run multiple containers with different versions of the same language or library side by side without conflicts.
- Scalability: you can always create new containers when you need to, and containers can communicate with each other as needed. This makes Docker a great fit for microservice architectures.
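To make containerizing an app concrete, here is a minimal sketch of a Dockerfile for a hypothetical Node.js service (the base image, file names, and port are illustrative assumptions, not a prescription):

```dockerfile
# Start from a lightweight official Node.js base image
FROM node:18-alpine
WORKDIR /app

# Copy dependency manifests first so this layer is cached between builds
COPY package*.json ./
RUN npm install

# Copy the rest of the application source
COPY . .

# Document the port the app listens on and define the startup command
EXPOSE 3000
CMD ["node", "server.js"]
```

Building this with `docker build -t myapp .` produces an image that runs identically on any machine with Docker installed.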
Of course, Docker is not without flaws. Keeping up with containers, especially at scale, is a pain: managing many containers that all run and communicate with each other quickly becomes overwhelming to do by hand. And that’s where container orchestration comes in.
So, what is container orchestration?
Container orchestration is the automation of the container life cycle: deployment, networking, health monitoring and much more. In large-scale systems where many containers interact with each other as microservices, orchestration keeps the complexity manageable.
Containers reach their full potential when paired with orchestration. The added benefits include:
- Simplified operations: by removing the need for manual work, orchestration dramatically reduces the complexity of running containers, making them easy to manage.
- Increased security: since the processes are isolated, the attack surface shrinks. Orchestration tools also control which resources are shared between containers, protecting the resources that should not be shared.
- Optimized microservices: orchestration lets containerized microservices work together efficiently, smoothing the flow of work within the system.
- Efficient resource allocation: orchestration schedules containers across the available machines, dividing resources between them so everything keeps working nicely.
There are multiple orchestration tools out there; the most widely used include Kubernetes and Docker Swarm, which run directly on your servers, as well as managed cloud services like Google Kubernetes Engine and Amazon ECS.
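As a sketch of what orchestration looks like in practice, a minimal Kubernetes Deployment manifest might look like the one below (the image name, label, and replica count are assumptions for illustration):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3                # Kubernetes keeps three copies of the container running
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: myapp:1.0   # hypothetical image name
          ports:
            - containerPort: 3000
```

If a container crashes or a node goes down, the orchestrator notices the replica count has dropped and starts a replacement automatically; scaling up is a one-line change to `replicas`.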
Developing with containers
Local development also benefits a lot from Docker with the help of docker-compose and minikube, which let you run container tooling locally and make full use of containers on your own machine. Experimenting with new technologies becomes easy and unlocks new ways of building proofs of concept: new ideas can be instantly tested the same way they will run on your server.
Docker creates a great development environment because it lets developers test their applications in the same environment that will run on the server, regardless of the developer’s machine. Removing the need for globally installed dependencies and machine-specific environment variables makes for an optimal testing setup.
Developing with containers offers a lot of benefits:
- “It works on my machine” is obliterated, since the application no longer depends on the host OS.
- We can easily try out different databases and technologies to figure out what’s best for us.
- Creating and shutting down development environments is a relatively quick process.
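For example, a small docker-compose.yml can describe a whole local environment, such as a hypothetical app plus a Postgres database (the service names, image tag, port, and credentials below are placeholders for illustration):

```yaml
version: "3.8"
services:
  app:
    build: .                 # built from the Dockerfile in the project root
    ports:
      - "3000:3000"
    environment:
      DATABASE_URL: postgres://dev:dev@db:5432/devdb
    depends_on:
      - db
  db:
    image: postgres:15
    environment:
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev
      POSTGRES_DB: devdb
```

The entire environment comes up with `docker-compose up` and is torn down again with `docker-compose down`, with nothing installed globally on the developer’s machine.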
But as with everything, issues still exist. Experience helps tremendously here, but a developer who hasn’t worked with Docker and Dockerfiles will have a hard time understanding how everything fits together at first. The process can get tedious, and it creates a barrier for newcomers.
How myceleum follows this trend
myceleum follows the same ideology of making your workflow easier. It brings together the multiple tools you need and gives you easy access to them through one interface. You click a few buttons, and your development environment is ready to use and containerized. This means it will not break, nor be affected by other environments or applications.
myceleum essentially recreates what orchestration tools do, in that it lets you manage all your containers seamlessly and effortlessly.
For developers, myceleum opens up new horizons with easy-to-set-up containerized development environments. All you need to do is install Docker and download a spore from the application, without having to write your own Dockerfiles. All the Docker images are managed by myceleum, so you don’t have to worry about that, which is especially great if you’re inexperienced with Docker.
Written by Serge