Once again, Bitergia will be at OSCON, held this time in Portland, at the Oregon Convention Center. OSCON, one of the biggest open source conventions in the USA, has been ground zero of the open source movement and nowadays continues to be a catalyst for innovation among companies.
The Xen Project is an open source software project that does pre-commit peer code review. This means that every change to the source code follows a code review process, in which any developer can participate, before being accepted into the code base. During that process, the patch is carefully inspected and improved thanks to the contributions of the reviewers. The process takes place on the xen-devel mailing list.
Code review is a very important process for improving the quality of the final product, but it can also be very time-consuming, imposing significant effort on both reviewers and developers. Therefore, when a few people in the Xen Project detected an apparent increase in the number of messages devoted to code review, they became concerned. They decided to use the services of Bitergia to determine what was happening, and how it could impact the code review process of the project. Thus the report “Code review process analysis in the Xen Project” was born.
The main objective of the analysis (composed of two stages) was to verify if this apparent increase was really related to the code review process, and to study other parameters to determine if the code review process was deteriorating in some way. The first stage has already been completed, with three key findings:
- Time-to-merge, probably the most important parameter to express the toll that a project is paying for reviewing code, is under control. Time-to-merge is counted from the moment a change is proposed, to the moment that change is merged, after running its corresponding code review.
- Time-to-merge increased from 2012 to the first semester of 2014, rising from about 15 days (for 75% of the changes) to close to 30 days. Since then, however, time-to-merge has decreased: 28 days in the second semester of 2014, and 20 in 2015 (again, for 75% of changes).
- The trend in time-to-merge is similar regardless of the size of the change. The same trend is observed for changes composed of one, two, three, four, or more than four patches (a patch being an individual component of a change).
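The time-to-merge metric described above can be sketched in a few lines. The records below are hypothetical (real data would come from the xen-devel mailing list archive), and the percentile helper uses a simple nearest-rank rule, which may differ slightly from the method used in the report:

```python
from datetime import datetime

# Hypothetical review records: (proposed, merged) timestamps per change.
reviews = [
    (datetime(2015, 1, 1), datetime(2015, 1, 12)),
    (datetime(2015, 1, 3), datetime(2015, 1, 25)),
    (datetime(2015, 1, 5), datetime(2015, 1, 9)),
    (datetime(2015, 1, 8), datetime(2015, 2, 2)),
]

def time_to_merge_days(records):
    """Time-to-merge, in days, for each merged change (sorted)."""
    return sorted((merged - proposed).days for proposed, merged in records)

def percentile(values, q):
    """Nearest-rank percentile (q in 0..100) of a sorted list."""
    idx = max(0, int(len(values) * q / 100.0 + 0.5) - 1)
    return values[idx]

ttm = time_to_merge_days(reviews)
# The 75th percentile is the figure quoted in the report:
# 75% of changes merged within this many days.
print(percentile(ttm, 75))  # → 22 for this toy data
```

The same computation, grouped by the number of patches per change, would reproduce the per-size trend analysis mentioned above.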
Currently, the second stage of the analysis is being performed. This stage is expected to produce actionable dashboards with detailed information that will make it possible to track the main parameters of the Xen code review process in detail.
[This post is based on the executive summary of the 2015-Q4 OpenStack Community Activity Report, sponsored by the OpenStack Foundation]
The October-December 2015 OpenStack Community Activity Report shows stable growth of the OpenStack Community. As new repositories and teams keep being added, the number of projects keeps growing. On the other hand, it is worth mentioning the decrease in activity during the latest quarters in project teams such as Nova, and the stabilization of others, such as Horizon or Cinder. This is a clear signal of the maturity reached by some of the project teams in the OpenStack Foundation.
Active Core Reviewers reach a new peak
Although Git activity (changesets merged into the code base) does not show a large increase, Gerrit data shows that the development effort in the project keeps increasing. In this last quarter of 2015, the number of Active Core Reviewers reached a new record: 449 different developers.
Time to merge keeps decreasing
During the third quarter of 2015, a small increase in time to merge seemed to signal a change in trend. However, this last quarter of 2015 resumed the previous decreasing trend. During this period, the median time to merge a changeset into master decreased from 2.91 days to 2.38 days.
Efficiency closing tickets decreases
The relative number of tickets closed in OpenStack projects (also known as the efficiency of the ticket-closing process) has noticeably decreased. Previous quarters topped at about 60% of tickets closed (with respect to tickets opened during the same period), while this quarter shows a much lower 44%. This could be read as a sign of poor performance by the project teams.
However, the efficiency closing changesets in the review system (the number of changesets merged or abandoned with respect to the number of new changesets proposed for review) remains stable at around 80%.
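The closing-efficiency metric used in both paragraphs above is simply the ratio of items closed to items opened in the same period. A minimal sketch, using hypothetical quarterly counts (not the actual OpenStack figures behind the report):

```python
def closing_efficiency(closed, opened):
    """Percentage of items closed relative to items opened in the same period."""
    return 100.0 * closed / opened if opened else 0.0

# Hypothetical ticket counts for one quarter.
tickets_opened, tickets_closed = 2500, 1100
print(round(closing_efficiency(tickets_closed, tickets_opened)))  # → 44

# The same metric for changesets: merged or abandoned vs. newly proposed.
changesets_new, changesets_done = 10000, 8000
print(round(closing_efficiency(changesets_done, changesets_new)))  # → 80
```

Note that the ratio can exceed 100% in a quarter where a backlog of older tickets is closed, which is why it is tracked as a trend rather than read in isolation.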
In a large free, open source software development community, not all developers are equal. Some are more experienced, some are better known, some know better how to adhere to the uses and customs of the project, some write code that is more easily accepted by others. One of the areas where these differences are more noticeable is the code review process. It is not equally easy for all developers to push their patches through code review. Even when the process itself tries to be fair and unbiased, well, “all developers are equal, but some are more equal than others”.
Fortunately, we have plenty of data in the code review system. We can use analytics on that data to learn how difficult it is for developers to get their patches accepted. We will use an OpenStack code review dashboard, built with data from the OpenStack Gerrit instance, to illustrate the process.
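One simple per-developer measure of this difficulty is the fraction of a developer's closed changesets that ended up merged rather than abandoned. The sketch below uses made-up records; a real analysis would pull change statuses from the OpenStack Gerrit instance:

```python
from collections import defaultdict

# Hypothetical Gerrit records: (developer, final_status) for closed changesets.
changes = [
    ("alice", "MERGED"), ("alice", "MERGED"), ("alice", "ABANDONED"),
    ("bob", "MERGED"), ("bob", "ABANDONED"), ("bob", "ABANDONED"),
]

def acceptance_rate(records):
    """Fraction of each developer's closed changesets that were merged."""
    merged, total = defaultdict(int), defaultdict(int)
    for dev, status in records:
        total[dev] += 1
        if status == "MERGED":
            merged[dev] += 1
    return {dev: merged[dev] / total[dev] for dev in total}

for dev, rate in sorted(acceptance_rate(changes).items()):
    print(f"{dev}: {rate:.0%}")
```

Comparing this rate (and the number of review iterations per change) across newcomers and long-time contributors is one way to quantify the "some are more equal than others" effect described above.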
So you decided to use metrics to track your free, open source software (FOSS) community. Now comes the big question: Which metrics should I be tracking? The post “Top five open source community metrics to track” that I wrote for OpenSource.com deals exactly with answering that question.
Based on our experience at Bitergia, the post proposes five families of metrics: activity, size, performance, demographics, and diversity. I’m sure that some important aspects may be missing, but still, this could be your first list of metrics to track, should you be interested in knowing about how your pet project is behaving. Go and read the full post if you are interested in more details.
- The user community keeps growing. With an increase of more than 300% in the total number of questions posted on ask.openstack.org, this is the communication channel with the largest measured increase. Other channels, such as IRC or the mailing lists, also grow, but more slowly. Although this is the activity measured over the last year, the last quarter shows a significant activity drop.
- Active Core Reviewers are increasing. Although mid-release-cycle analyses usually show drops in activity in most areas, the total number of active core reviewers has reached a new peak: up to 339 core reviewers participated in the review process, an increase of around 9% compared to the previous quarterly analysis.
- The process remains stable. The time to merge patches for the main projects shows numbers similar to the previous quarter. However, Nova stands out from the common time to review, with up to 10 days. On the other hand, Glance is more in line with the rest of the projects than in the previous quarter, although it still shows the second-highest median time to review, with up to 7 days.
- IRC activity recovers. Due to unknown issues, the total logs analyzed in the two previous quarters indicated a huge drop in activity. However, the latest analysis of this data source shows activity in line with earlier quarters. Although the cause remains unknown, IRC activity has returned to expected levels.
Project teams are also covered in this quarterly report, which follows the Governance file for projects but ignores specs files. Previous editions, such as 2015-Q1 or 2014-Q4, divided projects into the integrated project, specs, clients and others, with specific sections for each.
[Updated results based on methodological changes]
Kilo, the new OpenStack release, shows a continuous increase in activity compared to Juno. From Icehouse to Juno, there was an increase of 6.22% in the number of commits and 17.07% in the number of unique authors. From Juno to Kilo, there is a higher jump in terms of commits (11.23%) and a lower increase in terms of authors (11.16%). With this increase, there is a new peak in the number of unique authors contributing to the OpenStack Foundation projects, with close to 1,600 different people participating in its development.
After the continuous increase in activity from release to release that we observed in the past, Kilo, the latest release of OpenStack, is showing some stabilization. The differences between Juno (the previous release) and Kilo are the lowest in the history of the analyses we have performed for the OpenStack Foundation. Although this release reached a new peak in contributors, close to 1,500 different people, the increase from Juno to Kilo was around 900 commits and 200 authors, while from Icehouse to Juno it was 700 commits and 70 developers.
The list of organizations participating in the development of OpenStack keeps growing as well: close to 170 different organizations have contributed with at least one commit to the development of Kilo.
The top ten contributing organizations are the following:
Regarding the community itself, the timezone analysis shows widespread activity around the world. OpenStack is a truly 24-hours-a-day, continuous-development community. There are three main groups of activity: America, on the left side of the chart; Europe/Africa, in the center; and Asia, on the right.
Ignoring the UTC 0 activity, which may be biased by developers using UTC 0 as their timezone regardless of their place of residence, the rest of the activity shows the North American East and West coasts as the main contributors in number of commits. Europe/Africa is quite close to this activity (most of it due to Europe), although biased by the UTC 0 peak. India could be represented by the small peak at UTC+5, followed by the rest of Asia, with China and Japan in first place, which is consistent with the location of some contributing companies.
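A chart like the one discussed above can be produced by bucketing commits by the UTC offset recorded in each commit's author date. The offsets below are invented for illustration; real values would come from parsing `git log` output:

```python
from collections import Counter

# Hypothetical commit UTC offsets (hours), one entry per commit.
offsets = [-8, -5, -5, 0, 0, 0, 1, 1, 2, 5, 8, 9, 9]

activity = Counter(offsets)

# Print a crude text histogram of commits per timezone, west to east.
for tz in sorted(activity):
    print(f"UTC{tz:+d}: {'#' * activity[tz]}")
```

The UTC 0 bucket is the one the analysis above suggests discounting, since it mixes genuine UK/Portugal activity with developers whose clocks are simply set to UTC.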
- Some of the repositories under the OpenStack project have been removed from the analysis. For example, specification projects are not counted. The full list of repositories is available in the last quarterly report sponsored by the OpenStack Foundation.
- Developers are counted as the actual authors of the piece of code merged into upstream.
- The time of commit takes into account the time when that piece of code is merged into upstream.
- Each release, new repositories are added to the list of analyzed projects. This partially explains the continuous increasing activity in the OpenStack Foundation projects.
[This post is part of the lightning talk presented at FOSDEM 2015. The talk, titled “Data, data and data about your favourite community”, has its slides available on Bitergia’s Speakerdeck page. The IPython notebook used for visualization purposes is accessible through nbviewer and can be downloaded from GitHub. This is a basic introduction to GrimoireLib.]
GrimoireLib aims to provide a transparency layer between the database and the user. It avoids direct access to the databases while providing a list of available metrics.
GrimoireLib is a Python-based library and expects an already generated database produced by one of the Metrics Grimoire tools. CVSAnalY, MailingListStats, Bicho and most of the other tools are already supported.
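To give an idea of the kind of query such a transparency layer hides from the user, here is a sketch against a heavily simplified, CVSAnalY-style `scmlog` table. The schema here is a toy stand-in (the real CVSAnalY schema is much richer), built in an in-memory SQLite database rather than the MySQL backend the tools normally use:

```python
import sqlite3

# Toy stand-in for a CVSAnalY-style commits table: one row per commit.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE scmlog (id INTEGER PRIMARY KEY, author_id INTEGER, date TEXT)"
)
conn.executemany(
    "INSERT INTO scmlog (author_id, date) VALUES (?, ?)",
    [(1, "2015-01-10"), (2, "2015-01-20"), (1, "2015-02-05")],
)

# Commits per month: the kind of metric a library like GrimoireLib
# exposes so users never have to write this SQL themselves.
rows = conn.execute(
    "SELECT substr(date, 1, 7) AS month, COUNT(*) "
    "FROM scmlog GROUP BY month ORDER BY month"
).fetchall()
print(rows)  # → [('2015-01', 2), ('2015-02', 1)]
```

Every metric in the library boils down to queries of this shape, which is why shipping them behind a named-metric API is more maintainable than having each dashboard write its own SQL.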
Within a few hours, the OpenStack Juno release will be delivered. At the time of writing this analysis, the OpenStack Activity Board shows 91,317 commits spread across 108 repositories. All of this activity was performed by close to 2,600 developers, affiliated with about 230 different organizations. In addition, around 75,000 changesets have gone through code review, submitted by 3,082 developers.
With respect to community communication channels, there were more than 3 million messages exchanged in the IRC channels, close to 10,000 questions asked on the Askbot instance of the OpenStack Foundation, and about 3,600 people posting to the mailing lists.
Focusing on the Juno six-month release cycle, activity was intense:
- 18,704 commits, an 8.65% increase compared to the previous release cycle, Icehouse.
- More than 130 organizations contributing.
- Close to 1,420 developers, an increment of about 16%.
- IRC channels grew from 926K to 1,024K messages, 10% more than for Icehouse.
- Mailing lists saw reduced activity (8.8% fewer messages). However, there was an increase of 18% in the number of new questions on Askbot, with a total of about 3,000 new questions during the Juno release cycle.