Dashboard celebrating 25 years of Linux development

To celebrate 25 years of Linux kernel development, we at Bitergia have produced the Linux development history dashboard. This dashboard visualizes the current Linux git repository from two points of view: the history of all commits (changes to the source code) up to now, and the history of all lines in the current version. It shows the main parameters of the development (the who, when and what), and allows drilling down into the data, for example to find the specific commits that led to a specific part of the code.
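
For those curious about how these two views map to the underlying data, here is a minimal sketch (not the actual dashboard pipeline) showing how both can be obtained with plain git commands from Python. The repository path and the blamed file are illustrative assumptions.

```python
#!/usr/bin/env python3
"""Minimal sketch: the two views of the dashboard, extracted with plain git.
Assumes a local clone of the kernel at REPO (illustrative path)."""

import subprocess
from collections import Counter

REPO = "/tmp/linux"  # hypothetical path to a local clone

def commit_history():
    """First view: one record (author, author date) per commit."""
    out = subprocess.run(
        ["git", "-C", REPO, "log", "--pretty=format:%an|%ad", "--date=short"],
        capture_output=True, text=True, check=True).stdout
    return [line.split("|", 1) for line in out.splitlines()]

def line_ages(path):
    """Second view: for each line in the current version of a file,
    the timestamp when it was last authored (from git blame)."""
    out = subprocess.run(
        ["git", "-C", REPO, "blame", "--line-porcelain", path],
        capture_output=True, text=True, check=True).stdout
    return [int(line.split()[1]) for line in out.splitlines()
            if line.startswith("author-time ")]

if __name__ == "__main__":
    commits = commit_history()
    print("commits per year:",
          Counter(date[:4] for _, date in commits).most_common(5))
    # Example for the second view (file name is just an example):
    # print(line_ages("MAINTAINERS")[:10])
```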

Linux development history dashboard: blame view

Do you want to learn when the lines in the current kernel were authored? Who has participated in specific areas of the kernel? How many files have remained untouched for more than 10 years? Play with the dashboard and find your own interesting details!

The dashboard was produced using only free, open source software tools (among them, GrimoireLab, our toolkit for software development analytics). If you want to learn more details, check the slides I intended to use for my presentation at LinuxCon, which unfortunately I couldn’t attend. They provide some more insight into how the dashboard was produced, some examples of how it can be used, and some curiosities found by exploring it.
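
As a small illustration of the kind of tooling involved, the sketch below retrieves git data with Perceval, the GrimoireLab data collection component. The repository URL and local path are just examples, and the field names assume Perceval's raw git output; this is not necessarily the exact pipeline used to build the dashboard.

```python
"""Minimal sketch: collect commit metadata from a git repository with
Perceval (GrimoireLab). URL, path and field names are illustrative."""

from perceval.backends.core.git import Git

# Clone (or update) the repository into gitpath and iterate over its commits
repo = Git(uri="https://github.com/torvalds/linux.git", gitpath="/tmp/linux.git")

for item in repo.fetch():
    commit = item["data"]
    # Each item carries the raw commit fields, e.g. author and author date
    print(commit["Author"], commit["AuthorDate"])
```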

Continue reading

Dashboards for the Eclipse community

We’ve been maintaining a software development dashboard for the Eclipse community for a while. Now that EclipseCon is running, it is a good moment to visit it, to explain some of its peculiarities, and to comment on future directions.

Eclipse software development dashboard

The dashboard shows activity in the four main types of repositories holding information about software development (git, Gerrit, Bugzilla and mailing lists) for all the projects in Eclipse. You can browse the specifics of any of them (click on the button to the right of “Eclipse Foundation” on the top bar), and select between a view of the whole history of the community, or restrict it to the last five years (unfold the option by clicking on “All history”, again in the top bar).

But before commenting on some more details, let’s visit the future: a simple proof of concept of the upcoming GrimoireLab-based dashboards, showing Eclipse data as of two days ago: a dashboard for git data and a dashboard for Gerrit data.

GrimoireLab-based dashboard for Eclipse git data

The information in these new dashboards will be much more actionable, with visitors able to filter the data just by clicking on charts and tables. These dashboards are still early demos which, although they show real data, still need a lot of user interface polishing. For a more complete (but still proof-of-concept) demo, have a look at the one we presented during FOSDEM.

Continue reading

Analyzing code review in Xen

The Xen Project is an open source software project that does pre-commit peer code review. This means that every change to the source code follows a code review process, in which any developer can participate, before being accepted into the code base. During that process, the patch is carefully inspected and improved thanks to the contributions of the reviewers. The process takes place on the xen-devel mailing list.

Code review is a very important process to improve the quality of the final product, but it can also be very time-consuming, and can impose too much effort on reviewers and developers alike. Therefore, when a few people in the Xen Project detected an apparent increase in the number of messages devoted to code review, they became concerned. They decided to use the services of Bitergia to determine what was happening, and how it could impact the project's code review process. And the report "Code review process analysis in the Xen Project" was born.

Time-to-merge in Xen, per semester

The main objective of the analysis (composed of two stages) was to verify whether this apparent increase was really related to the code review process, and to study other parameters to determine if the process was deteriorating in some way. The first stage has already been completed, with three key findings:

  • Time-to-merge, probably the most important parameter to express the toll a project pays for reviewing code, is under control. Time-to-merge is counted from the moment a change is proposed to the moment that change is merged, after going through its corresponding code review (see the sketch after this list for how it can be computed).
  • Time-to-merge increased from 2012 to the first semester of 2014, rising from about 15 days (for 75% of the changes) to close to 30 days. But since then, time-to-merge has decreased: 28 days in the second semester of 2014, and 20 days in 2015 (again, for 75% of the changes).
  • The trend in time-to-merge is similar regardless of the size of the change. The same trend is observed for changes composed of one, two, three, four or more than four patches (the individual components of a change).
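
To make the metric concrete, here is a minimal sketch of how a time-to-merge figure like the ones above can be computed, assuming that for each merged change we know when it was first proposed and when it was merged; the sample records are made up, and the 75th percentile is aggregated per semester of the merge date.

```python
"""Minimal sketch of the time-to-merge metric: days from first proposal to
merge, aggregated per semester at the 75th percentile. Sample data is made up."""

from datetime import datetime
from statistics import quantiles

# Hypothetical records: (date first proposed, date merged) for merged changes
changes = [
    (datetime(2014, 2, 3), datetime(2014, 3, 1)),
    (datetime(2014, 8, 10), datetime(2014, 9, 5)),
    (datetime(2014, 8, 20), datetime(2014, 9, 25)),
    (datetime(2015, 4, 20), datetime(2015, 5, 8)),
    (datetime(2015, 5, 2), datetime(2015, 5, 30)),
]

def semester(d):
    return (d.year, 1 if d.month <= 6 else 2)

by_semester = {}
for proposed, merged in changes:
    by_semester.setdefault(semester(merged), []).append((merged - proposed).days)

for sem, days in sorted(by_semester.items()):
    # 75th percentile: the time within which 75% of the changes got merged
    p75 = quantiles(days, n=4)[-1] if len(days) > 1 else days[0]
    print(f"{sem[0]} S{sem[1]}: 75% of changes merged within {p75:.0f} days")
```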

Currently, the second stage of the analysis is being performed. It is expected to produce actionable dashboards with detailed information that will make it possible to track the main parameters of the Xen code review process in detail.

These findings and more will be shown in our talk at OSCON. Remember that we will be exhibiting there, and you can get a discount using the BITERGIA25 code… Don’t miss the chance to visit us there!

OSCON 2016

Check our Kibana-based development dashboards!

While we’re developing and testing our new toolchain for producing Kibana-based software development dashboards, we’re building a good collection of them, with real data from real projects.

Bitergia Kibana-based dashboards with data from some FOSS development communities

Just in case you are interested in having a look at them and providing some feedback, here is a partial list:
Continue reading

Some developers are more equal than others

In a large free, open source software development community, not all developers are equal. Some are more experienced, some are better known, some know better how to adhere to the uses and customs of the project, some write code that is more easily accepted by others. One of the areas where these differences are most noticeable is the code review process. It is not equally easy for all developers to push their patches through code review. Even when the process itself tries to be fair and unbiased, well, “all developers are equal, but some are more equal than others”.

OpenStack Gerrit code review dashboard

Fortunately, we have plenty of data in the code review system. We can use analytics on that data to learn how difficult it is for developers to get their patches accepted. We will use an OpenStack code review dashboard, built with data from the OpenStack Gerrit instance, to illustrate the process.
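
As a rough illustration (not the dashboard itself), the sketch below queries a Gerrit REST endpoint for recently merged changes and counts how many patch sets each owner needed per merged change, one possible proxy for how hard it is to get a patch accepted. The host, the query size and the absence of authentication are assumptions.

```python
"""Minimal sketch: patch set iterations per owner for merged changes,
retrieved through the Gerrit REST API. Host and query size are illustrative."""

import json
import urllib.request
from collections import defaultdict

GERRIT = "https://review.openstack.org"  # OpenStack Gerrit (illustrative)

def gerrit_get(path):
    # Gerrit prefixes its JSON responses with a )]}' line that must be stripped
    with urllib.request.urlopen(GERRIT + path) as resp:
        body = resp.read().decode("utf-8")
    return json.loads(body.split("\n", 1)[1])

# The last 200 merged changes, including all their revisions (patch sets)
changes = gerrit_get("/changes/?q=status:merged&n=200&o=ALL_REVISIONS")

iterations = defaultdict(list)
for change in changes:
    owner = change["owner"]["_account_id"]
    iterations[owner].append(len(change["revisions"]))

# Developers needing fewer patch sets per merged change come first
for owner, counts in sorted(iterations.items(),
                            key=lambda kv: sum(kv[1]) / len(kv[1])):
    print(f"owner {owner}: {sum(counts) / len(counts):.1f} patch sets per merged change")
```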

Continue reading

Metrics to track a FOSS project

So you decided to use metrics to track your free, open source software (FOSS) community. Now comes the big question: Which metrics should I be tracking? The post “Top five open source community metrics to track” that I wrote for OpenSource.com deals exactly with answering that question.

Activity metrics

Based on our experience at Bitergia, the post proposes five families of metrics: activity, size, performance, demographics, and diversity. I’m sure some important aspects may be missing, but still, this could be your first list of metrics to track if you are interested in knowing how your pet project is behaving. Go and read the full post if you want more details.
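
To make a couple of these families concrete, here is a minimal sketch computing an activity metric (commits per month) and a demographics metric (newcomers per month) from a plain list of commit records; the sample data is made up, and a real analysis would of course read it from the repositories themselves.

```python
"""Minimal sketch: an activity metric and a demographics metric computed
from (author, date) commit records. The sample records are made up."""

from collections import Counter
from datetime import date

commits = [  # hypothetical sample data: (author, commit date)
    ("alice", date(2016, 1, 12)),
    ("bob", date(2016, 1, 30)),
    ("alice", date(2016, 2, 2)),
    ("carol", date(2016, 2, 15)),
]

def month(d):
    return f"{d.year}-{d.month:02d}"

# Activity: number of commits per month
activity = Counter(month(d) for _, d in commits)

# Demographics: newcomers per month (authors seen for the first time)
first_seen = {}
for author, d in sorted(commits, key=lambda c: c[1]):
    first_seen.setdefault(author, month(d))
newcomers = Counter(first_seen.values())

print("activity:", dict(activity))
print("newcomers:", dict(newcomers))
```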

The numbers of the Open Cloud (Tokyo edition)

During the last OpenStack Summit, in Tokyo, we presented the talk “The numbers of the Open Cloud”. Now it is online, including video and slides. The talk presented some aspects of the development process of OpenStack, Apache CloudStack, OpenNebula and Eucalyptus. It included some details about OpenStack, and some new Kibana-based dashboards for these projects (you can also have a look at our more recent OpenStack Gerrit activity and OpenStack Git activity dashboards).


This is the latest presentation in our series on quantitative analysis of the main free, open source software projects producing cloud infrastructure systems. See for example the first in the series, which was presented at OSCON 2014, and compare with the current situation.

Continue reading