Usability Testing for SS3

Posted in UX/Design

by Kerstin Schuman

Posted 9 May 2012

One month ago, Senior Project Manager Julian Meadow (aka J), along with colleagues Paul Clarke and Ryan O'Hara, usability tested SilverStripe 3.0 beta 2 with a number of clients around Wellington, to find out whether the user interface is intuitive and where it could be improved. I sat down with J to hear more about the long and winding road that is usability testing.

Why did you decide on a usability test for SilverStripe 3.0?

After more staff within SilverStripe became involved with the development of SilverStripe 3.0 and started playing with the early beta releases, we realised there were a number of usability issues that needed to be addressed.

Who do you think should do usability testing? Does every tool and website need to be tested?

I’d encourage anyone building something online that has some form of complexity or workflow to do some usability testing, even if it’s just showing it to a couple of colleagues and getting their feedback.

What exactly did you test?

We tested the main sections of the back-end interface, the main parts a web administrator would use on a daily basis. We did this by preparing a list of typical tasks: adding a new user; adding and organising files; creating, editing and publishing a new page; and rearranging and deleting pages.

How did you do the testing?

We decided to only test with users who had previous experience using SilverStripe, and to test at their desk where possible. Two of us from SilverStripe attended each test session: one to run through the tasks with the participant, the other to observe and take notes.

After the testing sessions were completed we (Paul, Ryan and myself, i.e. two designers and a PM with usability experience) reviewed all the results, agreed on and grouped our findings, and presented these to key staff working on the development of SilverStripe 3.0. During this presentation we made a point of not discussing solutions and instead focussed on agreeing on the main findings, i.e. the areas that needed to be addressed because participants found them confusing or hard to use.

We (again Paul, Ryan and myself, plus Sean for technical input) then worked on developing a number of recommendations, focussing on how to fix the usability issues raised. We presented these and together agreed on the resolutions and the order in which they should be fixed, i.e. which ones to fix prior to a full stable release (in beta 3) versus which ones to fix later in version 3.1. Prioritisation was based on a number of factors, including technical complexity and user impact.
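To make that prioritisation step concrete, here is a minimal sketch in TypeScript, assuming hypothetical impact/complexity scores and example findings (the names, scales and sort rule are our illustration, not SilverStripe’s actual process):

    // Hypothetical sketch: ranking usability findings by user impact
    // against technical complexity. Scores and findings are illustrative
    // assumptions, not SilverStripe's actual data.
    interface Finding {
      title: string;
      userImpact: number;          // 1 (minor annoyance) .. 5 (blocks the task)
      technicalComplexity: number; // 1 (label tweak) .. 5 (major rework)
    }

    const findings: Finding[] = [
      { title: "No feedback after saving a draft", userImpact: 5, technicalComplexity: 2 },
      { title: "Power icon not recognised as logout", userImpact: 4, technicalComplexity: 1 },
      { title: "Multi-select and actions dropdown unclear", userImpact: 3, technicalComplexity: 4 },
    ];

    // High-impact, low-complexity findings float to the top: candidates
    // for beta 3. The rest can wait for 3.1.
    const ranked = [...findings].sort(
      (a, b) => (b.userImpact - b.technicalComplexity) - (a.userImpact - a.technicalComplexity)
    );
    ranked.forEach((f) =>
      console.log(`${f.userImpact - f.technicalComplexity}\t${f.title}`)
    );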

How many people did the test, and how much time did they have to complete the tasks you gave them?

Initially we tested with four individuals. We were prepared to test with more participants if needed (typically up to six participants is adequate), but found after testing with four we had collected enough feedback to work with.

Were the results consistent across the group? That is, did different people have similar issues, or like similar things?

The results followed the 80/20 rule: 80% of the issues were common and reported (reinforced) by all participants, while the other 20% were scattered and more specific.

What were the main findings from the usability test?

  • Pages section and site tree - participants didn’t understand the site tree’s page tips, found drag’n’drop and multi-selection confusing, struggled to make the connection between multi-selection and the actions dropdown, and expected more bulk actions.
  • Editing and publishing pages - participants found the save/draft/publish button layout confusing and too busy, and felt it didn’t provide feedback (a general comment throughout the site). Participants were also unsure whether they were previewing the draft or the published site.
  • Adding and modifying a user - participants found adding and searching users, and a number of the related icons, confusing. Plus all participants failed to define a group when adding a new user.
  • Adding and organising files - participants were unsure whether the file they had selected was uploaded, and struggled with editing file names and navigating back to the parent folder. Also, all participants expected a folder in the site-tree/file-tree to expand and show nested pages/folders when they clicked its name.

Were there any surprises for you and the SilverStripe team? Anything you thought would work really well but didn’t, or the other way round?

We were surprised how consistently all the test participants tried to use the user search fields to create a new user. A number of factors combined to confuse the participants: bad button labelling and positioning, and confusing search boxes and icons. As a result we’re going to rename the “Add New” button to “Add User” and move it from the far right to the left of the screen (following the original designs), plus we’re going to tidy up the user search interface.

Also, none of the participants could log out. One user spent two minutes looking at the interface and finally had to give up. No one understood that the power icon was the logout button.

Participants were confused by the filters pull-out: most of the time it’s there, but when editing a page it gets replaced by the site-tree.

The biggest finding for us was that all participants felt they weren’t getting enough feedback after completing an action to confirm it had been done. Examples include participants uploading the same file multiple times because they never got confirmation that the upload had completed, and participants being unsure whether their draft page had been saved.
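As a sketch of the kind of fix this points to, the snippet below shows an explicit confirmation after an upload completes; the #status element and the upload endpoint are assumptions for illustration, not SilverStripe 3’s actual markup or API:

    // Hypothetical sketch: confirm every completed action with a short
    // status message so users know the task succeeded.
    function showStatus(message: string): void {
      const el = document.querySelector<HTMLElement>("#status"); // assumed element
      if (!el) return;
      el.textContent = message;
      el.hidden = false;
      window.setTimeout(() => { el.hidden = true; }, 3000); // auto-dismiss
    }

    async function handleUpload(file: File): Promise<void> {
      const body = new FormData();
      body.append("file", file);
      await fetch("/admin/assets/upload", { method: "POST", body }); // assumed endpoint
      // Explicit confirmation stops users re-uploading "just in case".
      showStatus(`Uploaded "${file.name}"`);
    }

The same pattern applies to saving drafts and publishing: every action ends with a visible confirmation.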

On the positive side, all participants liked the new and improved look of the SilverStripe 3 back-end interface.

What are the main tasks coming out of the usability test?

We’re looking at a number of ways to address the issues we found. Most changes are small cosmetic things, like changing button labels, using less ambiguous and more familiar icons, and repositioning controls so they fit the task more intuitively. We’re revamping the action buttons at the bottom of the page editing area, improving labels and tips on the site-tree, improving drag’n’drop and multi-select options, and a bunch of other stuff to make the user interface more user-friendly.

By when will they be completed?

We’re presently resolving the high-priority issues (due in the beta 3 release) and will then work on resolving the other outstanding issues, but will most likely hold off introducing these until version 3.1.

What’s next? Will you do more testing?

Yes, we’re planning to usability test beta 3 after it’s been released, to check that the issues we’ve resolved have improved the SilverStripe 3.0 interface and to explore some of the ideas we’re proposing to develop for later releases.