Quo vadis, Dolphin? More results from the user study.

We conducted a large study of the strengths and weaknesses of file managers in May 2013. In this article we present the first part of the statistical analysis.

Introduction

In May 2013 we asked users about their preferences for file managers. The online survey did not contain questions regarding Dolphin specifically, but addressed file managers in general, so we are able to compare the usability and user experience of different tools. Following an article on the descriptive results, this article features the first part of the statistical evaluation.

Task

The study started with a question about the task or situation in which a particular file manager is used. The idea behind it was to compare, for instance, the use of a command-line interface during development with the use of a graphical tool for leisure activities. It turns out that this question is rather difficult to answer, probably because users seldom work in well-defined ‘situations’ and do not explicitly choose a file manager for them. The results show, if anything, only very small effects between task and any other variable.

File manager

As reported in the first post, we did not receive enough responses for all file managers. To ensure reliable results, only those with at least twenty answers were analyzed. This reduces the original set of twenty-three file managers to only seven.

Reasons

In our evaluation the selection of the preferred file manager was combined with a question about the reason for using it. Respondents had one choice out of seven options. Of course all options are reasonable, and several people remarked that they chose their tool because of all aspects, but we forced users to pick the most relevant reason. The results suggest three major reasons:

    • unspecific, i.e. users do not deliberately choose the file manager and take what they get,
    • efficiency, which is mostly relevant for the CLI but heavily correlated with functionality.

Figure 1: Heatmap of reasons to choose a file manager; red parts denote a higher percentage.


Dolphin is the default file manager of KDE but is often chosen for its functionality too. Many of the free-text responses emphasize this point. For example:

While Dolphin is part of KDE (my DE) I do not use it because of it being the default. I use it because it has all the features and I can configure it how I choose.

Rating of Functionality

Users were also asked to rate the features of their file manager. The results are illustrated in figure 2; the higher a value, the better the feature is supported by the program. The statistical analysis shows strong effects for two factors, filtering data and customizing the interface, but almost all other variables yield minor effects as well.

Figure 2: Average rating of functionality by file managers with 95% confidence interval.

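The error bars in figure 2 are 95% confidence intervals of the mean rating. For readers who want to reproduce them from the raw data, here is a minimal sketch; the function name and the normal approximation are illustrative assumptions, not taken from the study's R scripts:

```python
import math
import statistics

def mean_ci95(ratings):
    """Mean of a list of ratings with a 95% confidence interval.

    Uses the normal approximation (z ~ 1.96), which is adequate for the
    sample sizes in this study; small samples would call for a
    t-distribution instead.
    """
    n = len(ratings)
    mean = statistics.fmean(ratings)
    sem = statistics.stdev(ratings) / math.sqrt(n)  # standard error of the mean
    z = statistics.NormalDist().inv_cdf(0.975)      # two-sided 95% -> ~1.96
    return mean, mean - z * sem, mean + z * sem
```

The half-width of the interval shrinks with the square root of the number of responses, which is why we restricted the analysis to file managers with at least twenty answers.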

Table 1: Statistical results; effect size (eta) is used for cell shading: small effects >= 0.01 (light yellow), medium >= 0.06 (dark yellow), and large >= 0.14 (ocher).

Variable                         F       df    eta     p
Displaying data                  21.10   642   0.029   <0.001
Filtering data                   96.61   639   0.129   <0.001
Sorting data                     25.02   640   0.035   <0.001
Setting file properties          25.62   643   0.035   <0.001
Viewing file properties           9.85   646   0.012   <0.01
Previewing media                  6.35   642   0.007   <0.05
Providing network connectivity   20.54   630   0.028   <0.001
Searching for files              15.51   643   0.021   <0.001
Browsing through files            7.91   644   0.009   <0.01
Customizing interface           156.15   643   0.1929  <0.001
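Table 1 reports one-way ANOVA results, with eta-squared as the effect size shaded by Cohen's conventional thresholds. As a sketch of how these figures are obtained (the function names are my own; the study's actual computation lives in the downloadable R scripts):

```python
import numpy as np

def anova_eta_squared(groups):
    """One-way ANOVA over a list of per-file-manager rating arrays:
    returns the F statistic and the eta-squared effect size."""
    values = np.concatenate(groups)
    grand_mean = values.mean()
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(values) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    eta_sq = ss_between / (ss_between + ss_within)
    return f_stat, eta_sq

def effect_label(eta_sq):
    """Cohen's thresholds, as used for the cell shading in table 1."""
    if eta_sq >= 0.14:
        return "large"
    if eta_sq >= 0.06:
        return "medium"
    if eta_sq >= 0.01:
        return "small"
    return "negligible"
```

Under these labels, filtering data (eta = 0.129) is a medium effect and customizing the interface (eta = 0.1929) a large one, matching the shading of the table.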

The worst results were achieved by Microsoft Explorer, followed by Nautilus; in particular, the option to configure them for personal preferences was rated below medium. The command line interface receives very good results in general (except for the preview of media), but this result would most likely not withstand a broader, less tech-affine audience. Dolphin's ratings all score above average; the best value is assigned to browsing, the lowest to searching. Compared to its predecessor Konqueror, filtering was improved but network connectivity diminished.

Overall satisfaction

These results are confirmed by the question about overall satisfaction (F=60.595; p<0.001; eta=0.0826). The lowest values were given to Microsoft Explorer and Nautilus, Thunar scores a little better, and the rest are rated almost perfectly.

Figure 3: Average rating of overall satisfaction.

Next steps

In the next article we will report on the usability ratings according to ISO 9241-110. A third article will follow that deals with personal motives in relation to the other variables. So stay tuned.

Addendum

If you want to follow our analysis step by step, you can download the raw data (QuoVadis Results.tar.gz) as well as the R scripts (QuoVadis R-Scripts1.tar.gz).