Hello again 🙂
After this little break of just three years, I’m coming back to share with you some good ideas.
Almost from the beginning, I’ve been watching the progress of containerization from a distance, and never before had I considered it a fundamental part of my next-five-years toolbox. Maybe the implementations weren’t mature enough; maybe the daily firefighting never gave me the chance to evaluate it properly. Nowadays my opinion on this topic has dramatically evolved.
Over the next few posts, I intend to share some quick, hands-on containerization tips. The idea is to show narrowly focused techniques instead of opening a big architectural debate. Let’s start from the beginning.
During these last years, my work has focused on helping teams deliver value, preferably as software, whether as a developer, consultant or software craftsman. This kind of work has given me the opportunity to collaborate with different types of business, with different maturity levels and problems to solve. When this diversity becomes a constant in your daily work, you tend to settle on a series of effective procedures that, over the years, become routine.
I always like to start from the beginning: analyzing the code. With a little effort, you can obtain a lot of data from the code about the team, the business, the processes… The question is whether or not we are capable of transforming all that data into information we can use for a proper analysis. But that’s another story.
For this first analysis, I like to use SonarQube. It provides an excellent and fully configurable way to perform static code analysis. It’s powerful because it gives you good information immediately, even if you’re familiar with neither the tool nor the code-quality idiom. In the long term, using it puts you on a path that, with a bit of perseverance, will lead you to improve your code quality. Playing with this application, you’ll gradually acquire valuable knowledge in the code-quality field.
Infrastructure as a tax
After many years of trying different methods to obtain as much information as possible from the code in the shortest time, SonarQube has emerged as one of my favorites, but… it comes with a tiny drawback: you’ll need to maintain some infrastructure, even if you only want to execute the analysis once. You need a server, maybe a virtual machine, with SonarQube running and, if your code base is large, you’ll also want a proper database instead of the embedded H2. On top of that, you have to keep all this infrastructure updated and working.
I think SonarQube should be a key piece in every continuous delivery pipeline, but in our case, we’ll use this tool only to obtain a first quick code quality draft.
The lazy way
First on a physical server and then using virtual machines, this solution worked for me for a long time. Even with the extra infrastructure cost, it was worth it.
But, as a good lazy and dumb developer, I’m always looking for easier and cheaper ways to do things, so I wondered whether there was a way to run SonarQube that would let me reduce its infrastructure as much as possible.
Suddenly the opportunity of using Docker appeared.
Fire the Sonar!
As of this writing, running a containerized SonarQube is a matter of minutes. Official images are available for both the 5.x and 6.x versions. For this example, I’m also using a containerized PostgreSQL database, to leave the door open to persisting the results of the analysis across iterations.
First of all, we need a database. Having a container running a PostgreSQL instance is as easy as executing:

docker run --name sonar-postgres \
-e POSTGRES_USER=sonar \
-e POSTGRES_PASSWORD=secret \
-d postgres
It creates and runs a container based on the latest official PostgreSQL image. The first time we run this command Docker will download the image for us.
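If you want to confirm that the database came up before moving on, a quick look at its logs is enough. A minimal sketch, using the container name from the command above:

```shell
# Container name used in the docker run command above.
DB_CONTAINER="sonar-postgres"

# Print the last log lines; PostgreSQL logs
# "database system is ready to accept connections" once it is up.
docker logs --tail 20 "$DB_CONTAINER"
```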
Now we are ready to run the SonarQube container executing the following command:
docker run -d --name sonarqube_6 \
--link sonar-postgres:pgsonar \
-p 9000:9000 \
-p 9002:9002 \
-e SONARQUBE_JDBC_USERNAME=sonar \
-e SONARQUBE_JDBC_PASSWORD=secret \
-e SONARQUBE_JDBC_URL=jdbc:postgresql://pgsonar:5432/sonar \
sonarqube
One of the most interesting parts of this command is the legacy --link flag, now deprecated. It defines a host alias for the database container and lets the two containers talk to each other without the database exposing any ports externally.
It’s worth remembering that the containers share the default bridge network but don’t map or publish any ports by default. In this case, the database container is totally isolated from the outside.
An alternative is to create a user-defined network, where containers gain, among other things, the ability to resolve container names to IP addresses.
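As a sketch of that alternative, assuming a reasonably recent Docker (user-defined networks plus the `--network` flag) and the same official `postgres` and `sonarqube` images, the network name `sonarnet` is my own choice:

```shell
# On a user-defined network, the SonarQube container can reach the
# database by its container name, so no --link alias is needed.
JDBC_URL="jdbc:postgresql://sonar-postgres:5432/sonar"

# Create a user-defined bridge network with automatic name resolution.
docker network create sonarnet

# Same two containers as before, attached to the network instead of linked.
docker run -d --name sonar-postgres --network sonarnet \
  -e POSTGRES_USER=sonar \
  -e POSTGRES_PASSWORD=secret \
  postgres

docker run -d --name sonarqube_6 --network sonarnet \
  -p 9000:9000 \
  -e SONARQUBE_JDBC_USERNAME=sonar \
  -e SONARQUBE_JDBC_PASSWORD=secret \
  -e SONARQUBE_JDBC_URL="$JDBC_URL" \
  sonarqube
```

Note that the JDBC URL now uses the real container name, `sonar-postgres`, instead of a link alias.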
SonarQube provides different ways to analyze code, depending on our needs. In my case, most of the time I work with Java, so I’ve opted for Maven because it’s widely used and allows me to keep this example simple.
To start the analysis, we need to go to the code folder and execute the following command:

mvn sonar:sonar
If your operating system does not support native Docker, you’ll be forced to use boot2docker. It installs VirtualBox and creates a minimal virtual machine that hosts all the managed containers, which means that all ports are mapped to that VM. Running the Docker Quickstart Terminal, we can easily find out which IP was assigned to the host, so we can adapt our analysis command like this:
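As a small sketch of how to grab that IP, assuming the Docker Toolbox `docker-machine` CLI with the default VM name (`default`), and falling back to the classic boot2docker address when the CLI is not around:

```shell
# Ask docker-machine for the VM's IP; fall back to the usual
# boot2docker address if the CLI is not available.
DOCKER_HOST_IP=$(docker-machine ip default 2>/dev/null || echo "192.168.99.100")

# URL where the containerized SonarQube is reachable from the host.
SONAR_URL="http://${DOCKER_HOST_IP}:9000"
echo "$SONAR_URL"
```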
mvn sonar:sonar \
-Dsonar.host.url=http://<docker-host-ip>:9000
In my case, boot2docker is using the private IP 192.168.99.100 so my command looks like this:
mvn sonar:sonar \
-Dsonar.host.url=http://192.168.99.100:9000
Now we can use our favorite browser to access our local, brand-new, ephemeral SonarQube. Remember that, depending on our Docker installation, the website could be at a different location. If we’re using native Docker on our Mac or Windows machine, we should type into the browser:

http://localhost:9000
If we had installed boot2docker, we should use the IP assigned to the default VM on VirtualBox. In my case:

http://192.168.99.100:9000
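When we’re done reviewing the results, the ephemeral nature of this setup pays off: cleaning up is a single command, using the container names from above:

```shell
# Container names used earlier in this post.
CONTAINERS="sonarqube_6 sonar-postgres"

# Force-remove both containers (stops them first if still running).
docker rm -f $CONTAINERS
```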
It is inspiring that, by using well-known tools in ways totally different from how they were designed to be used, we can obtain amazing outcomes. Even with the ongoing transition from boot2docker to native Docker distributions, this solution remains a quick and painless way to get a broad view of what is happening under the hood.
Let’s continue exploring together how to face everyday situations with these little good ideas.