- There are 25,268 commits in total merged into master thanks to the work of 1,873 different developers.
- Getting that code into master took the effort of 2,239 people who submitted at least one patchset to Gerrit. That means that 83% of them are actual contributors to the Liberty release.
- In terms of community, Launchpad activity shows that 9,919 people participated in the bug-tracking process, opening, commenting on, and closing tickets.
- The mailing lists are a busy communication channel with 1,742 participants, but IRC seems to be the preferred channel, with more than 6,000 detected nicknames.
- Ask.openstack.org is also an interesting communication channel, where 1,386 people have participated, asking around 2,200 different questions.
As a disclaimer, this post will not focus on the organizations participating in OpenStack development, but on the software development process itself. Information about organizations can be easily retrieved from the Activity Board. We believe that process matters in the OpenStack community, even more so when we are talking about a team of more than 2,000 different contributors.
Efficiency of the community closing changesets
As mentioned, more than 2,200 people aimed to contribute at least one patchset to the OpenStack community during the Liberty release, and only a subset of those pieces of code were eligible to be part of the project and merged into master. However, not all of the submitted changesets were even reviewed: each quarter, the community leaves around 20% of the changeset population open. This can be observed in the following chart, where the y-axis represents the percentage of changesets that were closed (abandoned or merged) per quarter (x-axis).
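To make the metric concrete, here is a minimal sketch of how such a per-quarter closed ratio could be computed. The record layout and field names are illustrative assumptions, not the actual Gerrit schema or the tooling used for the charts; in practice the data would come from Gerrit's review history.

```python
# Hypothetical changeset records: the quarter the review was opened in,
# plus its current status. Field names are illustrative only.
changesets = [
    {"quarter": "2015-Q2", "status": "MERGED"},
    {"quarter": "2015-Q2", "status": "ABANDONED"},
    {"quarter": "2015-Q2", "status": "NEW"},
    {"quarter": "2015-Q3", "status": "MERGED"},
    {"quarter": "2015-Q3", "status": "NEW"},
]

def closed_ratio(changesets):
    """Fraction of changesets closed (merged or abandoned) per quarter."""
    totals, closed = {}, {}
    for cs in changesets:
        q = cs["quarter"]
        totals[q] = totals.get(q, 0) + 1
        if cs["status"] in ("MERGED", "ABANDONED"):
            closed[q] = closed.get(q, 0) + 1
    return {q: closed.get(q, 0) / totals[q] for q in totals}

print(closed_ratio(changesets))
```

With the sample data above, 2015-Q2 has 2 of 3 changesets closed and 2015-Q3 has 1 of 2, matching the roughly-80%-closed pattern described in the text.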
As can be observed, the community has been quite stable during the last 8 quarters, even though activity has increased a lot in a community as lively as OpenStack. This has forced the community to keep growing its reviewing capacity, as we can see in the evolution of the active core reviewers (those that can vote +2 or -2). This chart shows a sustained increase during the last 8 quarters.
Indeed, this is also observed in the community's time to merge. During the last quarter, 50% of the changesets that landed into master were merged in less than 3 days after the review process was opened. These numbers are quite important for companies doing continuous deployment for their customers: a stable number of days to merge and fix issues helps when defining internal policies around deploying new features coming directly from the project.
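The "50% merged in less than 3 days" figure is a median over the open-to-merge intervals. A minimal sketch of that computation, assuming hypothetical (opened, merged) timestamp pairs rather than the actual Gerrit data behind the charts:

```python
from datetime import datetime
from statistics import median

# Hypothetical (opened, merged) timestamp pairs for merged changesets;
# real data would come from Gerrit's review history.
reviews = [
    ("2015-09-01 10:00", "2015-09-02 09:00"),
    ("2015-09-01 12:00", "2015-09-05 12:00"),
    ("2015-09-03 08:00", "2015-09-04 20:00"),
]

def median_days_to_merge(reviews):
    """Median number of days between opening a review and merging it."""
    fmt = "%Y-%m-%d %H:%M"
    deltas = [
        (datetime.strptime(merged, fmt) - datetime.strptime(opened, fmt)).total_seconds() / 86400
        for opened, merged in reviews
    ]
    return median(deltas)

print(median_days_to_merge(reviews))  # 1.5 for the sample data
```

Using the median rather than the mean keeps a few very slow reviews from distorting the figure, which is exactly why the mean reported later in this post can be far higher than the median.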
Of course, developers play both roles in these numbers: reviewers and submitters. In the following two charts, we depict the time to action from the submitter side (those sending patchsets and reworking them) and from the reviewer side (those actively reviewing with a +1, -1, +2 or -2). We observe that the waiting time for a reviewer action has been stable during the last three quarters, with a median time to react of less than 2 days. This means that for the fastest 50% of the reviewers, an action takes place in less than 2 days. Similar or even better numbers are found on the submitter side for the fastest 50% of them: their reaction time is close to one day, although there seem to be slower submitters than reviewers, as the mean is higher, going up to 17 days.
As a conclusion, the community is managing pretty well the huge increase of activity that still takes place in the project. Although the activity in terms of commits merged into master over the last year is similar to the total activity measured for the previous year, the review activity keeps increasing, which has forced the community to bring more and more developers into the core reviewer group. There are no visible bottlenecks, although this is a basic analysis that only scratches the OpenStack surface. A more in-depth analysis per project shows increases in the time to review in some core projects such as Nova or Neutron, while others such as Heat or Cinder remain stable. And of course, there are others improving their time to review, such as Keystone, Glance, Ceilometer or Swift.
In any case, if you’re attending the Tokyo OpenStack Summit, we’d love to hear from you about this post and other metrics of interest to you. We’re holding a BoF session on engineering metrics, in case you’d like to participate, add comments, or even say that all this means nothing. You’re welcome!
See you in Tokyo!