New Pisi Package Compile System as a Continuous Integration Example

What is the Motivation for the Change?

When the ‘pisi bi’ command, which builds packages from source code, is executed on Pisilinux 1.2 or older, all of the header files included in the ‘system-devel’ group are installed whether or not they are required. Installing every header file without any filtering makes it very hard to determine a package's dependencies correctly.

With Pisilinux 2.0, this habit of installing all dependencies is abandoned: packages are now compiled only against the set of files and headers they actually require. In addition, all packages must be compiled in a sterilized environment. Developers often have many different versions of the same library installed on their machines, but a package built for the pisi repository must be compiled against the libraries in the repository. In the old workflow, developers used their own systems to compile packages, which invited mistakes: a package could be built against library versions that exist on the developer's machine but not in the pisi repository. Such a package passes testing on the developer's system, yet breaks the repository because of version mismatches. To avoid these mistakes, a sterilized environment is a must.



GitHub and Webhooks

GitHub provides many utilities that we can use during the development process, and webhooks are one of them. Anyone with the right access rights can easily set up a webhook. A webhook creates a task on the GitHub side that sends the information we select to a URL we provide; this information includes all the details of the commits. Our system relies on the webhook to operate properly: every event is triggered by the content that the GitHub webhook sends to our web site.

All of the package details for the Pisilinux distribution are on GitHub, and all developers use GitHub to update the package descriptions. Every pull request triggers the webhook, and the details of the merge and commit operations are transferred to our web server. The application running on the server uses this information to find out which packages were modified, and the modified packages are added to the compile queue.
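The package-detection step can be sketched as a small helper that walks the commits in a GitHub push payload and collects the directories whose build files changed. The file names below (pspec.xml, actions.py) follow the usual pisi package layout; the helper name and the exact filter are our own illustration, not the production code.

```python
def modified_packages(payload):
    """Return the package directories touched by a GitHub push payload.

    A package is assumed to be the directory that holds its pspec.xml
    (and actions.py) build files.
    """
    packages = set()
    for commit in payload.get("commits", []):
        changed = commit.get("added", []) + commit.get("modified", [])
        for path in changed:
            if path.endswith(("pspec.xml", "actions.py")):
                # Drop the file name, keep the package directory.
                packages.add(path.rsplit("/", 1)[0])
    return sorted(packages)


# A payload shaped like GitHub's push event:
payload = {
    "commits": [
        {"added": [], "modified": ["desktop/util/foo/pspec.xml"]},
        {"added": ["system/base/bar/actions.py"], "modified": []},
    ]
}
print(modified_packages(payload))  # ['desktop/util/foo', 'system/base/bar']
```

Each entry returned here would then be appended to the compile queue.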

Flask, Python Microframework

The application that receives the information sent by the GitHub webhook is written in Flask. Flask is a web microframework written in Python; it handles all of the HTTP GET and POST requests.
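A minimal Flask receiver for the webhook could look like the sketch below. The route path and the hand-off to the queue are assumptions for illustration; the real application does more validation and bookkeeping.

```python
from flask import Flask, request

app = Flask(__name__)


@app.route("/webhook", methods=["POST"])
def webhook():
    # GitHub delivers the event details as a JSON body.
    payload = request.get_json(force=True)
    # Here the application would inspect payload["commits"] and
    # append the modified packages to the compile queue.
    return "OK", 200
```

Running `app.run()` starts the server that GitHub's webhook is pointed at.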

Docker for the Sterilized Compilation Environment

What is Docker?

The description on the Docker web site reads: “Docker containers wrap up a piece of software in a complete filesystem that contains everything it needs to run: code, runtime, system tools, system libraries – anything you can install on a server. This guarantees that it will always run the same, regardless of the environment it is running in.”

This approach is lighter than classical virtualization techniques. Most virtualization systems provide full hardware virtualization; Docker provides application-level virtualization in the form of containers.

How are We Using Docker?

Using Docker, we prepared a minimal container that can run the pisi package management application with only 80 pisi packages. With the right parameters, any application can be compiled inside this Docker container without being affected by the libraries installed on the host system.
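Invoking such a compile from Python boils down to assembling a `docker run` command and executing it. The image name below is hypothetical, and the helper is only a sketch of the idea: `--rm` throws the container away afterwards, so every build starts from the same clean filesystem.

```python
import subprocess

PISI_IMAGE = "pisilinux/build"  # hypothetical image name


def build_command(package):
    """Assemble the `docker run` invocation that compiles one package
    inside a throwaway container, isolated from the host's libraries."""
    return ["docker", "run", "--rm", PISI_IMAGE, "pisi", "bi", package]


def compile_package(package):
    """Run the build and capture its stdout/stderr for the logs."""
    return subprocess.run(build_command(package), capture_output=True, text=True)


print(build_command("desktop/util/foo"))
```

Because the container always starts from the same 80-package base, the dependency list discovered during the build reflects only what the package really needs.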

We also provide a volunteer application, which people can install on their systems to compile pisi packages inside Docker containers. The volunteer application connects to the web site and checks the compile queue for a package waiting to be compiled. If there is one, the volunteer application starts a Docker container and passes it the name of the package. All stderr and stdout logs created by the compile process are recorded and sent back to the web site; if the compilation is successful, the binary pisi package is sent as well. Because the actual compile logs are published, every developer has full information about the compile process, and when the logs need to be shared with upstream, a single URL is all that is needed.
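The heart of the volunteer application is a simple poll-and-build loop. The endpoint URL and the JSON shape of the queue reply below are assumptions for illustration; the real client also uploads logs and the resulting binary package.

```python
import json
import time
import urllib.request

QUEUE_URL = "https://example.org/api/queue/next"  # hypothetical endpoint


def next_package(raw_body):
    """Parse the queue reply; None means the queue is currently empty."""
    data = json.loads(raw_body)
    return data.get("package")


def poll(compile_fn, interval=60):
    """Ask the site for work, compile it, then wait and repeat."""
    while True:
        with urllib.request.urlopen(QUEUE_URL) as resp:
            package = next_package(resp.read().decode())
        if package is not None:
            compile_fn(package)  # e.g. start the Docker container
        time.sleep(interval)


print(next_package('{"package": "system/base/bar"}'))  # system/base/bar
```

Any number of volunteers can run this loop at once, since each one only takes the next item off the queue.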

The binary packages sent by the volunteers are put into the test repository. After testing the new packages, developers move them to the stable repositories. The old compile system required all packages to be installed on the build system; the new approach simplifies the server requirements for the compile process, and with the help of more volunteers, the CPU power available for compiling packages can easily be increased.

All the source code for this process, including the backend and the volunteer application, is available at the links below:

The web site for the system:
