
Docker - automated build for my Ubuntu image

After having played with Docker a little bit, it's now time for me to try using Docker Hub with automated build. The first image I'll push is my Ubuntu image, slightly modified so that it uses nano instead of emacs.

Beware: I'm still a naive beginner in Docker and Git...

GitHub repository

On GitHub, I create a new repository named docker-ubuntu. Configuration: the repository is initialized with a README and the Apache License 2.0.

Populating the repository

On my development machine, I create the files required for the automated build:

  • clone the repository:
git clone https://github.com/PascalBod/docker-ubuntu
  • in the docker-ubuntu directory, create the following Dockerfile, copied from the 14.04.1 tag of the official ubuntu image and modified to add the nano editor:
FROM scratch
ADD trusty-core-amd64.tar.gz /

# a few minor docker-specific tweaks
# see https://github.com/docker/docker/blob/master/contrib/mkimage/debootstrap
RUN echo '#!/bin/sh' > /usr/sbin/policy-rc.d \
	&& echo 'exit 101' >> /usr/sbin/policy-rc.d \
	&& chmod +x /usr/sbin/policy-rc.d \
	\
	&& dpkg-divert --local --rename --add /sbin/initctl \
	&& cp -a /usr/sbin/policy-rc.d /sbin/initctl \
	&& sed -i 's/^exit.*/exit 0/' /sbin/initctl \
	\
	&& echo 'force-unsafe-io' > /etc/dpkg/dpkg.cfg.d/docker-apt-speedup \
	\
	&& echo 'DPkg::Post-Invoke { "rm -f /var/cache/apt/archives/*.deb /var/cache/apt/archives/partial/*.deb /var/cache/apt/*.bin || true"; };' > /etc/apt/apt.conf.d/docker-clean \
	&& echo 'APT::Update::Post-Invoke { "rm -f /var/cache/apt/archives/*.deb /var/cache/apt/archives/partial/*.deb /var/cache/apt/*.bin || true"; };' >> /etc/apt/apt.conf.d/docker-clean \
	&& echo 'Dir::Cache::pkgcache ""; Dir::Cache::srcpkgcache "";' >> /etc/apt/apt.conf.d/docker-clean \
	\
	&& echo 'Acquire::Languages "none";' > /etc/apt/apt.conf.d/docker-no-languages \
	\
	&& echo 'Acquire::GzipIndexes "true"; Acquire::CompressionTypes::Order:: "gz";' > /etc/apt/apt.conf.d/docker-gzip-indexes

# enable the universe
RUN sed -i 's/^#\s*\(deb.*universe\)$/\1/g' /etc/apt/sources.list

# add my own stuff
RUN apt-get update \
	&& apt-get install -y nano \
	&& rm -rf /var/lib/apt/lists/*

# overwrite this with 'CMD []' in a dependent Dockerfile
CMD ["/bin/bash"]
  • in the same directory, download the current daily Ubuntu core image:
wget http://cdimage.ubuntu.com/ubuntu-core/trusty/daily/current/trusty-core-amd64.tar.gz
  • check the downloaded file with md5sum (a sketch of this check follows the list).
  • update the repository:
git add Dockerfile
git add trusty-core-amd64.tar.gz
git commit -m "Added Dockerfile and base Ubuntu tar.gz"
git push -u origin master
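
For the md5sum check mentioned above, here is a minimal sketch of what I do. It assumes that the same cdimage.ubuntu.com directory also publishes an MD5SUMS file, and ubuntu-nano-test is just a throwaway tag I made up for an optional local test build:

# compute the checksum of the downloaded archive and compare it by eye
# with the trusty-core-amd64.tar.gz entry in the published MD5SUMS file
md5sum trusty-core-amd64.tar.gz
wget http://cdimage.ubuntu.com/ubuntu-core/trusty/daily/current/MD5SUMS
grep trusty-core-amd64.tar.gz MD5SUMS

# optional sanity check: build and run the image locally before pushing,
# to catch Dockerfile mistakes early
docker build -t ubuntu-nano-test .
docker run --rm -it ubuntu-nano-test nano --version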

Pushing such a large file to GitHub is not good practice. In a future version, perhaps I could perform the wget of the Ubuntu base image from the Dockerfile itself. But this would mean that my Ubuntu Docker image would change every time a new build is performed. I have to think more about this...
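
For the record, here is a rough sketch of what that variant could look like. There is a catch, though: according to the Dockerfile reference, an archive fetched by ADD from a remote URL is copied into the image as-is, not unpacked (unlike a local tar.gz), so this would not populate the root filesystem the way the current Dockerfile does:

FROM scratch
# fetch the daily image at build time instead of storing it in the Git repository
# caveat: a remote ADD is not auto-extracted, so / would end up containing the
# tar.gz file itself, not the extracted root filesystem
ADD http://cdimage.ubuntu.com/ubuntu-core/trusty/daily/current/trusty-core-amd64.tar.gz /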

Linking Docker Hub and GitHub

As explained in the Docker documentation, I add a repository for automated build. I select Limited for the access type: before giving more rights to Docker Hub, I prefer to understand how things are done.

For the repository name, I use ubuntu. On GitHub, I add the Docker service. When I test the service, or whenever the GitHub repository is modified, Docker Hub builds my image. Wonderful!
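
Once a build has completed, the image can be pulled and run like any other Docker Hub image (my-account below is a placeholder for my Docker Hub account name):

docker pull my-account/ubuntu
docker run --rm -it my-account/ubuntu nano --version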