Re: [Tails-ux] Report on user testing of the Additional Software beta

Author: Bernard Tyers
Date:
To: Tails user experience & user interface design, sajolida, Alan, intrigeri, segfault, bertagaz, emma.peel
Subject: Re: [Tails-ux] Report on user testing of the Additional Software beta
Hi Sajolida,

Well done on a really nice piece of work! It's really great to see this
work being done and also shared publicly. Will it be put on the Tails UX
wiki?

About the "clear boundary between what is part of the Additional
Software project and what is not", it's probably better *not* to make
incorrect assumptions which might lead you to false solutions.

Would it be possible to run this usability testing again? You could
then probe where exactly that breakdown happens.

The breakdown could be in the language used in the documentation (if I
understand correctly, the user may need to consult some external
documentation?).

If you suspect there could be some difficulties in the content, here's a
really simple method for testing content:

(you can do this as part of the mission testing, or separately)

- provide the user with the required content, preferably printed out
- give them a coloured highlighter
- ask them to highlight words they don't understand

If the same words keep coming up as problems, you know there is an issue.

Here's a good blogpost on the method:
https://userresearch.blog.gov.uk/2014/09/02/a-simple-technique-for-evaluating-content/


Here are a few (possibly) useful blogposts on testing content:

https://gds.blog.gov.uk/2016/04/06/guest-post-looking-at-the-different-ways-to-test-content/
https://insidegovuk.blog.gov.uk/2015/11/16/guerrilla-testing-content-it-makes-it-better/

Hope that's useful. Again well done!

best wishes,
Bernard


On 06/05/2018 11:48, sajolida wrote:
> Here is a small report on the findings from the user testing of the
> Additional Software beta that I did this week. For lack of an official
> beta, I used f25dce19c6.
>
>
> Protocol
> --------
>
> I did in-person moderated tests with [5] participants. For lack of time I
> recruited amongst friends who I already knew would match the profile I
> wanted (even if it's not a best practice to do so).
>
> I reused the 4 missions that we used for the paper prototyping in
> January (attached).
>
> [Why 5?]:
> https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
>
>
> Demographics
> ------------
>
> I aimed at people who:
>
> - Already knew what Tails was and tried it (but were not regular users).
> - Could understand our interface in English (to varying levels).
>
> As a matter of fact, all were also using Linux as their regular
> operating system but with very different technical levels. I expected us
> to confirm and explore serious problems around the administration
> password and installation techniques, even for people who were already
> familiar with Debian, so I thought that for this round it was better to
> fix issues for this audience first than to do more catastrophic tests
> with people coming from other operating systems. Given the little time I
> had for recruiting, it's also the audience that was easiest for me to find.
>
> You'll find more demographics in the spreadsheet.
>
>
> Summary
> -------
>
> The interface that we designed in January worked really well. Only one
> participant had a very bad time and failed some of the missions.
>
> The others took on average 30 minutes to add Mumble to their additional
> software.
>
> Most issues are not in the interface itself: they are integration issues
> or issues in other parts of Tails, in GNOME, or in the navigation of our
> website and documentation.
>
> The average [SUS] score is 68, which is considered average compared to
> other industry products.
>
> The questions which scored below average are:
>
> - I would imagine that most people would learn to use this system very
> quickly.
> - I think that I would like to use this system frequently.
> - I needed to learn a lot of things before I could get going with this
> system.
> - I found the system very cumbersome to use.
>
> So Tails seemed complex to understand, but in the end not that hard
> for these people to use either.
>
> [SUS]:
> https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html
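>
> For those less familiar with SUS scoring: each of the 10 answers on the
> 1-5 scale contributes 0-4 points and the sum is multiplied by 2.5, giving
> a score out of 100; 68 is the commonly cited industry average. Here is a
> minimal sketch of the standard scoring in Python, just to illustrate (the
> responses in the example are made up, not actual data from these tests):
>
>     def sus_score(responses):
>         # responses: the ten 1-5 answers, in questionnaire order
>         assert len(responses) == 10
>         total = 0
>         for i, r in enumerate(responses, start=1):
>             # odd items are positively worded, even items negatively
>             total += (r - 1) if i % 2 == 1 else (5 - r)
>         return total * 2.5
>
>     # For example, sus_score([4, 2, 4, 3, 4, 2, 3, 3, 4, 3]) == 65.0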
>
>
> Spreadsheet
> -----------
>
> You'll find a rainbow table with the list of issues in the attachment.
>
> - The "M" column correspond to the mission in which the issues occurred.
> - The "B", "C", and "B/C" correspond to a rough cost/benefit ratio based
> on a punctuation of severity of the issues (1, 2, or 3) and cost.
> It's the first time I try this, so let's see if it's useful :)
> - Issues in gray are affecting parts of Tails that we are not working on
> as part of this project.
> - Issues in bold are the top issues.
> - I marked '"' when several issues could share a single solution.
> - I marked '?' when I didn't know how to solve an issue.
>
> Top 10 issues
> -------------
>
> It's hard for me to find a clear boundary between what is part of the
> Additional Software project and what is not, since most of the issues
> are related to integration and other bits of Tails.
> To report on the tests, I decided not to make a distinction for now, as
> an issue in our offline doc or GNOME can hurt someone trying to use
> Additional Software as much as an issue in the Additional Software
> configuration itself.
>
> But I understand that we can't fix all these issues ourselves and will
> also need to triage them further with the rest of the Additional
> Software team.
>
> In the spreadsheet I'm also listing bugs that should be fixed even
> though they didn't hurt usability during the tests.
>
> In rough order of importance:
>
> 0. Looks for Synaptic but can’t find it (without admin password)
> 1. Clicks on a notification to get more info
> 2. Doesn’t know where the notification tray is
> 3. Doesn’t understand why `apt install` fails (without lists)
> 4. Doesn’t know which admin password to enter when prompted
> 5. Doesn’t find the user documentation in the offline doc
> 6. Doesn’t read well the documentation on admin password
> 7. Doesn’t notice persistence in Greeter at first
> 8. Expects data from session to be saved to persistence when created
> 9. Doubts what the [X] button does
>
> Next steps
> ----------
>
> - Alan and intrigeri should investigate the cost of fixing these issues.
> - The team should agree on which issues to tackle and in which order.
> - Do a bit more Redmine on the tickets I created to encode all this.
>
> All this might be done faster in a voice meeting.
>



--
----------
Bernard Tyers
Independent User Researcher & Interaction Designer
PGP Key: https://keybase.io/ei8fdb
Twitter: @bernardtyers

User-Centred Design, Open Source and Privacy