Digging deeper into analysis results

If you need more background information about the measures and indicators used in the charts and tables of the dashboard, the Indicators, Measures and Findings tabs can provide more details about the statistics recorded for the current artefact. Note that these tabs are not displayed by default. To see them in Squore, go to the Explorer Settings and open the Tabs Display Settings section, then enable the desired tabs, as follows:

SUM explorerSettings ManageTabs
Figure 1. Access the Explorer Settings
SUM tabManager
Figure 2. Show or hide tabs from the Tabs Display Settings section.

Note that depending on your configuration, it might not be possible to hide certain tabs.

Understanding indicators

If you want to understand the scale used for a particular indicator, for example to see how close you are to moving up the rating scale, you can check it in the Indicators tab of the dashboard.

Log in and search for the artefact DB5_backup.c in the Neptune project, where the indicator Maintainability Non-Conformity is rated E. While this tells you the current rating for this artefact, it does not tell you how to improve it. To learn how to improve this score, let’s first take a look at the scale used for this indicator. Click the Indicators tab of the Explorer. The table of indicators opens, as shown below:

SUM indicatorsTable
Figure 3. The indicators table for DB5_backup.c

The table lists all the indicators available for the artefact over several pages. The scale and levels available for an indicator can be viewed in a tooltip by hovering over the information icon. Using the "NAME" filter, look for the entry named Maintainability Non-Conformity, then click its value or the chart icon next to it. The scale shows that the artefact is rated E because the indicator’s value is 472.09. To improve the score, the value would need to drop below 250 to be rated D, as shown below:

SUM indicatorScale
Figure 4. The scale used for the Maintainability Non-Conformity indicator

To understand how to improve the rating, you need to know how the indicator’s value is computed. Clicking the indicator name in the Indicator Tree shows the following explanation in the indicator popup:

SUM Neptuneindicator
Figure 5. The indicator popup for the Maintainability Non-Conformity indicator

The computation, i.e. the formula used to calculate the rating, is 1000*(WEIGHTED_NC_MAI/ELOC), meaning that the indicator computes a weighted ratio of broken Maintainability rules per line of code. To find out what these rules are, click the Findings tab.
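
Putting the formula and the scale together gives a concrete target. Reading the names used in the formula, WEIGHTED_NC_MAI is the weighted count of Maintainability non-conformities and ELOC the number of effective lines of code (the exact definitions belong to your analysis model):

    1000 * (WEIGHTED_NC_MAI / ELOC) = 472.09   ->  level E (current)
    1000 * (WEIGHTED_NC_MAI / ELOC) < 250      ->  level D (target)

In other words, the weighted non-conformity density WEIGHTED_NC_MAI/ELOC must fall from about 0.472 to below 0.25, roughly half its current value.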

Understanding findings

Squore displays all the findings for a particular artefact in a table in the Findings tab. Next to each finding’s label is its number of occurrences, followed by a colour-coded delta value (red for more occurrences, green for fewer) compared to a previous analysis.

If you want to find out which rules are taken into account by the Maintainability Non-Conformity indicator, select Maintainability in the ISO CHARACTERISTIC filter to see the corresponding rules, as shown in the picture below:

SUM NeptuneFindings
Figure 6. The findings table for DB5_backup.c

You can filter violations according to many criteria, including relaxation status, origin, artefact type and other characteristics from the analysis model.

BWGOTO, STDIO, NOGOTO, RETURN and COMPOUNDIF are the rules that should be fixed in order to improve the Maintainability rating of DB5_backup.c.

You can expand the BWGOTO rule to list each occurrence of the violation and review the location in the source code that breaks the rule, as shown below:

SUM NeptuneNoGoto
Figure 7. The location of the broken occurrences of the BWGOTO rule
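
If you are not familiar with the rule itself, the following minimal C snippet illustrates the kind of construct it flags, assuming, as the mnemonic suggests, that BWGOTO targets backward goto statements (the rule’s exact definition is documented in your analysis model):

    #include <stdio.h>

    static int absolute(int n) {
    retry:                      /* label declared here...               */
        if (n < 0) {
            n = -n;
            goto retry;         /* ...and jumped to from further down:
                                   a backward goto                      */
        }
        return n;
    }

    int main(void) {
        printf("%d\n", absolute(-5)); /* prints 5 */
        return 0;
    }

Rewriting the label and goto as a loop would typically remove such a violation.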

The list of findings indicates whether a finding is New, Closed or Modified since the reference version. Findings are traceable through time, so even if your code is modified, you can go back to the version in which a finding was first detected.

Finally, clicking on the line number for each rule-breaking occurrence opens the source code viewer in full screen, so you can carry out your code review:

SUM NeptuneSourceCode
Figure 8. The source code viewer highlighting the first occurrence of BWGOTO

The source code viewer lets you compare the code against another version of the code. Select a version name in the Compare with: list to switch to diff mode, as shown below:

SUM NeptuneSourceCodeCompared
Figure 9. The source code viewer in diff mode

In diff mode, use the top arrows to switch the left and right panes, and the bottom arrows to turn synchronised scrolling on or off. Characters that were removed are underlined in green, while characters that were added are underlined in red.

Analysing findings helps to improve the quality of the code in your project. There is much more you can do with the Findings tab by using the built-in filters to detect regressions and improvements:

  • Rules with findings: displays all the rules containing violations in this version

  • Lost Practices: displays all the rules containing new violations in this version compared to the reference version

  • Acquired Practices: displays all the rules that no longer contain any violations in this version compared to the reference version

  • Deteriorated Practices: displays all the rules containing more violations in this version than in the reference version

  • Improved Practices: displays all the rules containing fewer violations in this version than in the reference version

  • New Findings: displays all the new violations since the reference version

  • Fixed Findings: displays all the violations fixed since the reference version

  • All changed findings: displays all the rules where a change in the number of violations was detected, essentially providing the combination of New Findings and Fixed Findings in one list

  • Project ruleset: displays all the rules checked by the project, i.e. both the rules that are violated and those that are not

By default, the Findings tab displays violations compared to the previous analysis, but you can refine the search by adjusting the Reference drop-down list (under the Explorer Settings menu) that contains all the versions analysed in your project.

You can learn about more automated ways to review and fix code in Reviewing Action Items.

You can click the Export button at the top of the list of findings to generate a CSV file of the findings displayed in the user interface. The contents of the file reflect your current filter selections on the page. The following is a CSV export of the findings of the Earth project.

SUM findings to csv
Figure 10. A CSV export of the findings of the Earth project

If the Export button is grayed out, your license does not include the option to export data to CSV files.
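
If you want to post-process such an export, a small program can tally the findings per rule. Below is a minimal C sketch; the file name findings_export.csv, the rule name sitting in the first column and the absence of embedded commas in fields are all assumptions, so adjust them to match your actual export:

    /* tally_findings.c: count findings per rule in a CSV export.
       The file name and column layout below are assumptions. */
    #include <stdio.h>
    #include <string.h>

    #define MAX_RULES 256

    int main(void) {
        FILE *f = fopen("findings_export.csv", "r"); /* hypothetical name */
        if (!f) { perror("findings_export.csv"); return 1; }

        char line[4096];
        char rules[MAX_RULES][128];
        int  counts[MAX_RULES] = {0};
        int  nrules = 0;

        fgets(line, sizeof line, f);             /* skip the header row */
        while (fgets(line, sizeof line, f)) {
            char *rule = strtok(line, ",\r\n");  /* first field: rule name */
            if (!rule || !*rule) continue;
            int i;
            for (i = 0; i < nrules; i++)
                if (strcmp(rules[i], rule) == 0) break;
            if (i == nrules) {
                if (nrules == MAX_RULES) continue;  /* table full: skip */
                snprintf(rules[nrules++], sizeof rules[0], "%s", rule);
            }
            counts[i]++;                         /* one more occurrence */
        }
        fclose(f);

        for (int i = 0; i < nrules; i++)
            printf("%-24s %d\n", rules[i], counts[i]);
        return 0;
    }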

Understanding measures

If you have questions about the measures computed by Squore and their meaning, you can usually answer them by looking at the Measures tab of the Explorer.

The content of the Measures tab always reflects the data for the current artefact. It is organised as a table displaying the measure’s name, value and other useful information for the current selection, as shown below:

SUM measures
Figure 11. The table of measures for DB5_backup.c

The Provided by column also tells you which Data Provider reported the metric.

If the measure’s mnemonic differs from its name, it is displayed below the name, followed by the measure’s ID in parentheses if the ID in turn differs from the mnemonic.

Possible values for the measure’s status are:

  • Default Value: This measure has the default value defined in the analysis model

  • Ok: A value was computed successfully for this measure

  • User-defined: The value was set by the user (either via a tag on the command line or in the Forms tab of the web UI)

  • Definition error: The value could not be computed because of an error in the analysis model. Check the Model Validator to learn more.

  • Incomplete: The value could not be computed because of an error (maybe a division by zero?). The analysis model should probably be updated to avoid this in the future. This error is also available in the project’s build.log.

  • Warning: The value could not be computed, but there is nothing wrong with the measure definition in the analysis model. Maybe you are trying to do a COUNT on descendants but there are no descendants? In such cases, the error is not serious, but you can improve your analysis model to handle the warning if needed.

  • -: This measure was not found in the project. It did not exist at the time of the analysis.

  • Unknown: An unexpected error happened while computing the measure’s status

For all error statuses above, the metric is assigned the default value defined in the analysis model.