OpenStack gender/diversity technical contributions analysis

Since the last OpenStack Summit in Tokyo, we at Bitergia had been wondering whether we could combine our knowledge of software development metrics with some research we did on gender/diversity in Debian contributions a long time ago. Daniel started working on it, and he submitted a talk for the OpenStack Summit in Austin that got accepted!

Update: video and slides are already available.

Continue reading

On the Importance of Quarterly Reports: OPNFV and OpenStack as use cases

Public quarterly reports are used to understand the performance of companies. In the same spirit, the quarterly reports produced by Bitergia fill that gap for open source communities. This type of analysis is aimed at those who are interested in metrics but do not have the time to play with the dashboards. It provides a full overview of the current quarter, plus a comparison with previous quarters, which adds some extra context about where the community is heading.

Continue reading

Talks for OpenStack Summit Austin, VOTE FOR THEM!

Another OpenStack Summit is coming, and as usual, we have submitted some talks based on the analytics work done on its development data. Now it’s time for the community to vote for them:

Both of them will be a good chance to see the possibilities that our Open Development Analytics platform offers to open source projects and foundations in terms of transparency and performance metrics.

Meanwhile, you can play with the early preview dashboard published for FOSDEM’16 as a proof of concept of our future dashboards:

OpenStack PoC Dashboard for FOSDEM 2016

The Mitaka OpenStack mid-cycle quarterly report

[This post is based on the executive summary of the 2015-Q4 OpenStack Community Activity Report, sponsored by the OpenStack Foundation]

The October-December 2015 OpenStack Community Activity Report shows stable growth of the OpenStack Community. As new repositories and teams keep being added, the number of projects keeps growing. On the other hand, it is worth mentioning the decrease in activity during the latest quarters in project teams such as Nova, and the stabilization of some others, such as Horizon or Cinder. This is a clear signal of the maturity reached by some of the project teams in the OpenStack Foundation.

Active Core Reviewers reach a new peak

Although Git activity (changesets merged into the code base) does not show a large increase, the Gerrit data show that the development effort in the project keeps increasing. In the last quarter of 2015, the number of Active Core Reviewers reached a new record: 449 different developers.
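As a rough illustration of how such a count can be produced, here is a minimal sketch in Python. It assumes review votes have already been extracted from Gerrit into simple (reviewer, vote, timestamp) records; the sample data and the +2/-2 heuristic for identifying core reviewers are assumptions for the example, not the exact method used in the report.

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical review-vote records (reviewer, vote, timestamp); in a
    # real setup these would come from a Gerrit/MetricsGrimoire dump.
    votes = [
        ("alice", 2, datetime(2015, 10, 21)),
        ("bob", -2, datetime(2015, 12, 14)),
        ("alice", 1, datetime(2015, 11, 3)),
    ]

    def quarter(ts):
        # Map a timestamp to a (year, quarter) key.
        return (ts.year, (ts.month - 1) // 3 + 1)

    # Core reviewers are the accounts allowed to cast +2/-2 votes, so we
    # count the distinct accounts casting such votes in each quarter.
    core = defaultdict(set)
    for reviewer, vote, ts in votes:
        if abs(vote) == 2:
            core[quarter(ts)].add(reviewer)

    for (year, q), reviewers in sorted(core.items()):
        print("%d-Q%d: %d active core reviewers" % (year, q, len(reviewers)))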

Time to merge keeps decreasing

During the third quarter of 2015, a small increase in time to merge seemed to signal a change of trend. However, the last quarter of 2015 resumed the previous decreasing trend: the median time to merge a changeset into master went down from 2.91 days to 2.38 days.
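For reference, this is how such a figure can be computed from (upload, merge) timestamp pairs; the data below is made up for illustration. The median is used rather than the mean because a few long-lived changesets would otherwise dominate the result.

    import statistics
    from datetime import datetime

    # Hypothetical changesets as (time uploaded, time merged) pairs; real
    # data would hold thousands of entries per quarter.
    changesets = [
        (datetime(2015, 10, 2), datetime(2015, 10, 4)),
        (datetime(2015, 11, 10), datetime(2015, 11, 16)),
        (datetime(2015, 12, 1), datetime(2015, 12, 3)),
    ]

    # Time to merge for each changeset, in (fractional) days.
    days_to_merge = [(merged - created).total_seconds() / 86400
                     for created, merged in changesets]

    print("median time to merge: %.2f days" % statistics.median(days_to_merge))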

Ticket closing efficiency decreases

The decrease in the relative number of tickets closed in OpenStack projects (also known as the efficiency of the ticket closing process) is noticeable. Previous quarters topped out at about 60% of tickets closed (with respect to tickets opened during the same period), while this quarter shows a much lower 44%. This could be read as a sign of poor performance by the project teams.

However, the efficiency closing changesets in the review system (the number of changesets merged or abandoned with respect to the number of new changesets proposed for review) remains stable at around 80%.
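Both metrics reduce to the same ratio of closed to opened items over a period. A minimal sketch, with illustrative volumes chosen only to reproduce the percentages quoted above (they are not actual report data):

    def closing_efficiency(closed, opened):
        # Ratio of items closed to items opened in the same period.
        return float(closed) / opened if opened else float("nan")

    # Illustrative volumes, not actual report figures.
    tickets_opened, tickets_closed = 1000, 440
    changesets_proposed, changesets_closed = 1000, 800  # merged or abandoned

    print("ticket efficiency:    %.0f%%"
          % (100 * closing_efficiency(tickets_closed, tickets_opened)))
    print("changeset efficiency: %.0f%%"
          % (100 * closing_efficiency(changesets_closed, changesets_proposed)))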

Some developers are more equal than others

In a large free, open source software development community, not all developers are equal. Some are more experienced, some are better known, some know better how to adhere to the uses and customs of the project, some write code that is more easily accepted by others. One of the areas where these differences are more noticeable is the code review process. It is not equally easy for all developers to push their patches through code review. Even when the process itself tries to be fair and unbiased, well, “all developers are equal, but some are more equal than others”.

[Screenshot: OpenStack Gerrit code review dashboard]

Fortunately, we have plenty of data in the code review system. We can use analytics on that data to learn how difficult it is for developers to get their patches accepted. We will use an OpenStack code review dashboard, built with data from the OpenStack Gerrit instance, to illustrate the process.
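The underlying data is also reachable directly. As a hedged sketch (not the dashboard’s actual pipeline), the public Gerrit REST API can be queried for merged changes and the number of patchset iterations per owner, a rough proxy for how hard it was to get each patch accepted. The endpoint below assumes the OpenStack Gerrit instance as it was at the time of writing, and the project and page size are arbitrary choices:

    import json
    from collections import defaultdict

    import requests

    # The OpenStack Gerrit instance (review.openstack.org at the time of
    # writing); querying merged changes in one example project.
    url = "https://review.openstack.org/changes/"
    params = {"q": "project:openstack/nova status:merged",
              "n": "100", "o": "ALL_REVISIONS"}

    resp = requests.get(url, params=params)
    # Gerrit prefixes JSON responses with ")]}'" to prevent XSSI; strip it.
    changes = json.loads(resp.text.split("\n", 1)[1])

    # Patchset iterations per change owner: more iterations usually means
    # more review rounds before the patch was accepted.
    iterations = defaultdict(list)
    for change in changes:
        owner = change["owner"]["_account_id"]
        iterations[owner].append(len(change["revisions"]))

    for owner, counts in sorted(iterations.items()):
        print(owner, float(sum(counts)) / len(counts))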

Continue reading

The numbers of the Open Cloud (Tokyo edition)

During the last OpenStack Summit, in Tokyo, we presented the talk “The numbers of the Open Cloud”. Now it is online, including video and slides. The talk presented some aspects of the development process of OpenStack, Apache CloudStack, OpenNebula and Eucalyptus. It included some details about OpenStack, and some new Kibana-based dashboards for them (you can also have a look at our more recent OpenStack Gerrit activity and OpenStack Git activity dashboards).


This is the latest presentation of our series on quantitative analysis of the main free, open source software projects producing cloud infrastructure systems. See for example the first in the series, which was presented at OSCON 2014, and compare with the current situation.

Continue reading

Understanding the code review process in OpenStack

As a part of our tests with Kibana and Elasticsearch as frontends for our MetricsGrimoire databases, we’ve set up a dashboard for understanding the code review process in OpenStack (be sure to visit it with a large screen and a reasonable CPU; otherwise your experience may be a bit frustrating).

This dashboard includes information about all review processes (changesets) in OpenStack, using information obtained from their Gerrit instance. For each review, we have information such as the submitter (owner), the time it was first uploaded and accepted or abandoned, the number of patchsets (iterations) needed until it was accepted, and the time until it was merged or abandoned. With all of this we have prepared an interactive visualization that allows you both to understand the big picture and to drill down into the details. Read on to learn about some of these details.
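Kibana dashboards sit on top of Elasticsearch, so the same data can be queried directly. Below is a minimal sketch of that kind of query; the endpoint, index name and field names (“owner”, “status”, “days_to_merge”) are hypothetical and would need to match the actual schema:

    import requests

    # Hypothetical Elasticsearch endpoint and index; adjust to the real ones.
    url = "http://localhost:9200/gerrit_changesets/_search"

    # Top changeset owners, with the median time to merge of their changesets.
    query = {
        "size": 0,
        "query": {"term": {"status": "MERGED"}},
        "aggs": {
            "by_owner": {
                "terms": {"field": "owner", "size": 10},
                "aggs": {
                    "median_days": {
                        "percentiles": {"field": "days_to_merge",
                                        "percents": [50]}
                    }
                }
            }
        }
    }

    resp = requests.post(url, json=query).json()
    for bucket in resp["aggregations"]["by_owner"]["buckets"]:
        median = bucket["median_days"]["values"]["50.0"]
        print(bucket["key"], bucket["doc_count"], round(median, 2))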

[Note: this is our second post about our dashboards based on Kibana. If you’re interested, have a look at the first one, about OpenStack code contributions.]

Continue reading