Project Quality Dashboard (Grafana, Sonar, and GitLab)
“For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through”. ~ Steve Jobs
Who doesn't love sleeping well? ;) I recently had a few sleepless nights after looking at the increasing technical debt in our products.
After a few rounds of discussion with team members, I understood that quality always comes second in sprint deliveries and that the quality parameters are scattered across different tools. That's when I decided to build a unified dashboard that reflects the current state of the product's internal quality.
To build this dashboard, I started working on a “quality bot” which collects and stores data from Sonar and GitLab. This data is then visualized in Grafana.
The quality data scraping bot is a Spring Boot application that uses the Sonar web service client and a GitLab client to collect data from the respective interfaces. The bot has four main components: UI, Connectors, Scheduler, and data REST API.
Scheduler
The scheduler takes care of scheduling data requests to the connected data sources, polling every 10 minutes, and ensures that the sources are not overloaded unnecessarily. Overall responsibilities of the scheduler (a minimal sketch follows the list) -
- Schedule job for each interface
- Fetch historical data
- In subsequent requests, pull only delta data
- Retries in case of failure
- Fail-safe against misconfigured scheduling
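A minimal sketch of such a job, assuming Spring's @Scheduled support; Connector and SyncStateService are illustrative names, not the bot's actual classes -
import java.time.Instant;
import java.util.List;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import lombok.extern.slf4j.Slf4j;

// Illustrative abstractions: one Connector per data source, plus a watermark store
interface Connector {
    String name();
    void fetchHistoricalData() throws Exception;
    void fetchDeltaSince(Instant since) throws Exception;
}

interface SyncStateService {
    Instant lastSuccessfulPull(String source);
    void markSuccess(String source, Instant at);
}

@Slf4j
@Component
public class QualityScrapeJob {

    private final List<Connector> connectors;
    private final SyncStateService syncState;

    public QualityScrapeJob(List<Connector> connectors, SyncStateService syncState) {
        this.connectors = connectors;
        this.syncState = syncState;
    }

    // fixedDelay (default 10 minutes) keeps runs from overlapping and overloading the sources
    @Scheduled(fixedDelayString = "${scraper.poll-interval:600000}")
    public void poll() {
        for (Connector connector : connectors) {
            try {
                Instant lastSync = syncState.lastSuccessfulPull(connector.name());
                if (lastSync == null) {
                    connector.fetchHistoricalData();     // first run: full history
                } else {
                    connector.fetchDeltaSince(lastSync); // later runs: only the delta
                }
                syncState.markSuccess(connector.name(), Instant.now());
            } catch (Exception e) {
                // watermark stays untouched, so the next cycle retries the same window
                log.error("Pull failed for {}, will retry next cycle", connector.name(), e);
            }
        }
    }
}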
REST API
This module provides a REST interface for manual changes as well as for configuring projects (a sketch of the project endpoint follows the list) -
- CRUD project operations
- REST API documentation
- CRUD operations for custom data (not available via a connector)
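A minimal sketch of the project endpoint, assuming Spring MVC; the route and the ProjectService methods other than create() are my assumptions -
import java.util.List;
import org.springframework.web.bind.annotation.*;

// Illustrative project CRUD endpoint; paths and service method names are assumed
@RestController
@RequestMapping("/api/projects")
public class ProjectController {

    private final ProjectService projectService;

    public ProjectController(ProjectService projectService) {
        this.projectService = projectService;
    }

    @GetMapping
    public List<ProjectEntity> list() {
        return projectService.findAll();
    }

    @PostMapping
    public ProjectEntity create(@RequestBody ProjectEntity project) {
        return projectService.create(project);
    }

    @PutMapping("/{key}")
    public ProjectEntity update(@PathVariable String key, @RequestBody ProjectEntity project) {
        return projectService.update(key, project);
    }

    @DeleteMapping("/{key}")
    public void delete(@PathVariable String key) {
        projectService.delete(key);
    }
}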
Sonar Connector
This connector uses the Sonar web service client (sonar-ws) to make calls to the Sonar server. Right now the connector collects project, code smell, and coverage data.
Sample code for collecting the project details from Sonar -
// Imports assume sonar-ws 7.x; the request/response packages differ in other versions
import java.util.Date;
import org.sonarqube.ws.Projects;
import org.sonarqube.ws.client.HttpConnector;
import org.sonarqube.ws.client.WsClient;
import org.sonarqube.ws.client.WsClientFactories;
import org.sonarqube.ws.client.projects.SearchMyProjectsRequest;

HttpConnector connector = getHttpConnector(); // helper that builds the connector with server URL and credentials
WsClient wsClient = WsClientFactories.getDefault().newClient(connector);

// Request the first page of projects (page size 10)
SearchMyProjectsRequest projectSearchRequest = new SearchMyProjectsRequest();
projectSearchRequest = projectSearchRequest.setPs("10");
Projects.SearchMyProjectsWsResponse response = wsClient.projects().searchMyProjects(projectSearchRequest);
log.info("Projects found.. {}", response.getProjectsCount());
log.info("Configured response page size.. {}", response.getPaging().getPageSize());

// Store each project; newly discovered projects start out inactive
response.getProjectsList().forEach(project -> {
    try {
        projectService
                .create(ProjectEntity
                        .builder()
                        .description(project.getDescription())
                        .createdDate(new Date())
                        .key(project.getKey())
                        .name(project.getName())
                        .isActive(false)
                        .build());
    } catch (RuntimeException e) {
        log.error("unable to parse the incoming response..", e);
    }
});
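The code smell and coverage numbers come from Sonar's measures endpoint. A sketch of that call, again assuming a sonar-ws 7.x style API; projectKey would be a key stored above, and the metric keys are the standard SonarQube ones -
import java.util.Arrays;
import org.sonarqube.ws.Measures;
import org.sonarqube.ws.client.measures.ComponentRequest;

// Fetch coverage and code-smell measures for one project
ComponentRequest measuresRequest = new ComponentRequest()
        .setComponent(projectKey) // key stored by the project sync above
        .setMetricKeys(Arrays.asList("coverage", "code_smells"));
Measures.ComponentWsResponse measures = wsClient.measures().component(measuresRequest);
measures.getComponent().getMeasuresList()
        .forEach(m -> log.info("{} = {}", m.getMetric(), m.getValue()));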
GitLab Connector
I started with collecting the pipeline data (historical and latest) to provide the health and trend of a module's CI/CD pipeline, which indicates check-in quality, build times, current status, build quality, etc. This helps in understanding the maintainability of the module.
Sample code for collecting the pipeline metrics from GitLab -
// Imports assume the gitlab4j-api client library
import org.gitlab4j.api.GitLabApi;
import org.gitlab4j.api.GitLabApiException;

GitLabApi gitLabApi = new GitLabApi(url, token);
try {
    // p is the project entity stored earlier; its key has the form "group:name"
    String name = p.getKey().substring(p.getKey().indexOf(":") + 1);
    log.info("Making GitLab call for project {}", name);
    gitLabApi.getProjectApi().getProjects(name).forEach(project -> {
        try {
            // Contributor data is fetched as well (not stored yet)
            gitLabApi.getRepositoryApi().getContributors(project.getId()).forEach(contributor -> {
            });
            // Fetch the first page (10 entries) of the project's pipelines and store them
            gitLabApi.getPipelineApi().getPipelines(project.getId(), 10).first().forEach(pipeline -> {
                pipelineService
                        .create(PipelineEntity
                                .builder()
                                .id(pipeline.getSha())
                                .key(p.getKey())
                                .status(pipeline.getStatus().name())
                                // running pipelines have no finish time yet; fall back to the update time
                                .finishedDate(
                                        pipeline.getFinishedAt() == null ? pipeline.getUpdatedAt() : pipeline.getFinishedAt())
                                .build());
            });
        } catch (GitLabApiException e) {
            log.error("Error while fetching pipeline data.. ", e);
        }
    });
} catch (GitLabApiException e) {
    log.error("Error while fetching group data.. ", e);
}
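Everything the connectors collect lands in PostgreSQL, which Grafana then queries. Inferred from the builder calls above, the pipeline entity could look roughly like this; the JPA/Lombok mapping is my assumption -
import java.util.Date;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;

// Hypothetical JPA mapping; fields mirror the builder calls in the connector snippet
@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
@Entity
@Table(name = "pipeline")
public class PipelineEntity {
    @Id
    private String id;        // pipeline SHA
    private String key;       // project key in "group:name" form
    private String status;    // success, failed, running, ...
    private Date finishedDate;
}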
Dashboard
The quality dashboard is created using Grafana. To build this dashboard, the PostgreSQL database is configured as the data source. All of the panels are created using built-in Grafana panels like gauge, bar chart, and table.
This dashboard covers -
- Code coverage
- Average code coverage across modules
- Time-series data for critical and blocker code smells
- Total issues
- CI/CD Pipeline metrics
- Sprint velocity
This is just the start; going forward, I will add more reports to this dashboard to capture product quality, and hopefully publish them here. :)