Should I be using docker?
Posted by vaeringjar to c/freepost — 5 votes, 9 comments

Over the last two years at my day job, my team has standardized tests and builds using docker containers. Overall this has helped quite a bit for me because it demonstrates that something can pass tests (or fail them) and run. In the past I’ve used other tools to harden builds (make/autotools/CMMI, jails, guix, nix). But a lot of newer developers don’t seem to want to do that, which I don’t particularly blame them for. So containers are here to stay at my day job, or at least the container files.

So all that said, for me, should I keep using docker or should I switch to something else for my containers? Does anyone have opinions on a better alternative to docker?

Podman :-)
https://podman.io/

why

I’ve never used podman, but its benefits look very interesting compared to docker: https://www.smarthomebeginner.com/podman-vs-docker/

Cool thanks, I’ve seen it but haven’t tried it yet. I’ve heard it’s daemonless, which is nice because I swear I have to restart the docker service on my system at least weekly. I’ll check out podman.

Aside from that (assuming I’ve got that right), are there any feature highlights you think I should know about?

Sorry, I’d like to give you a straight answer but I don’t know a whole lot about Docker (or the whole containers landscape, for that matter). All I wanted to say is that I think of containers as operational/deployment tools rather than development tools. For development, my mind goes to other CI/CD tools such as Jenkins, whose job is to detect new commits, trigger builds, run tests, and merge changes if there are no errors.

Can I ask you briefly what your build-test-deploy loop with Docker looks like? If I get this right, you are sharing a Dockerfile in the repository (or maybe more, for different configurations). This file contains a series of RUN steps, which execute make or other build commands. You build a new image every time, locally from your PC with docker build, and if there are no errors you push your code or deploy the image. Is this correct?
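For what it’s worth, a minimal Dockerfile along those lines might look something like this (base image, paths, and commands are just illustrative placeholders, not from any actual project):

```dockerfile
# Hypothetical sketch: base image and commands are illustrative
FROM debian:bookworm-slim

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /src
COPY . .

# Each RUN step is a build/test command; if any step exits non-zero,
# the whole image build fails
RUN make
RUN make test
```

Then building locally with `docker build -t myapp .` is what surfaces broken builds or failing tests, since a failed RUN step aborts the build.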

Basically, at the very least, I push changes, the CI runs the tests, and won’t deploy (the CD part) if the tests fail. I always run my tests before pushing changes, but that isn’t strictly necessary. It’s just embarrassing to have failed builds linked to the repo commit history.
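A rough sketch of that push → test → deploy gate, in GitHub-Actions-style YAML (the actual CI system isn’t named here, and the job/image names are made up):

```yaml
# Hypothetical pipeline: deploy only runs if the test job passes
name: ci
on: push

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # The image build runs the test suite via the Dockerfile's RUN steps
      - run: docker build -t app:${{ github.sha }} .

  deploy:
    needs: test                        # blocked entirely if tests fail
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - run: echo "push image / deploy here"
```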

But there is another step that’s important to do locally, especially with some of the old code (I’ve got things here and there that go back decades), which I do occasionally: kill the network and run the tests (sometimes w/o the container) so that if any IO is actually trying to run during a test, we know about it. This part is sometimes more like testing the tests.

If I understand correctly, the CI runs the tests with/inside the container, and then the container is deployed. Is this right?

Yes, that way it prevents something with failing tests from being deployed.

In this case I think the answer to “should I keep using docker or should I switch to something else” is yes, you should probably keep using docker unless the majority of your coworkers agree to change :)