Usability Testing

Resources used for session can be found here:

  • bit.ly/npdevusability1
  • bit.ly/npdevusability2
  • bit.ly/npdevusability3

Introductions

Mixed audience. Some with previous experience at usability study labs; others with fewer resources, using Google Analytics for basic testing or manually collecting individual surveys. (Fancy tools / $0 / DIY.)

Most important thing! DO IT! Any testing of any kind is more valuable than doing none at all.

The ultimate goal of usability testing is adoption: getting more users on your platform.

About the word "users": some people don't like it (negative connotations), but it's the most inclusive word for the different kinds of people and systems that may be using your software.

User stories: not always the best approach for developing a platform. Clients and users may unintentionally describe functionality they neither want nor would actually use.

Test the desire without building the functionality.

  • Example: Recipe website where users said they wanted a "compare recipes" ability. The team put a button on the website labeled "Compare", but clicking it just led to a page that said "Coming soon!" with a form asking how they would like such a feature to work. They then tracked the percentage of users who clicked the button, which indicated real user desire.
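A minimal sketch of how that kind of "fake door" button could be wired up, written here in TypeScript for the browser; the element id, the /api/feature-interest endpoint, and the /compare-coming-soon page are assumptions for illustration, not part of the session notes:

  // Minimal "fake door" sketch: the button exists, the feature does not.
  // The element id, endpoint, and target page below are hypothetical.
  const compareButton = document.querySelector<HTMLButtonElement>("#compare-recipes");

  compareButton?.addEventListener("click", () => {
    // Record the click so the percentage of interested users can be tracked later.
    void fetch("/api/feature-interest", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        feature: "compare-recipes",
        clickedAt: new Date().toISOString(),
      }),
    });

    // Send the user to the "Coming soon!" page with the follow-up survey form.
    window.location.href = "/compare-coming-soon";
  });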

Be clear/transparent with your users. Let them know if your product is still in development.

5 Steps to Usability

Prep

Have something the users can respond to: mockups, wireframes, a proof of concept, beta software, etc. Plan for/recruit 3-5 users to test; the law of diminishing returns applies here, and more tests start producing repetitive results. Plan on using a tool to record the user test, and DO NOT HOVER when doing tests. Use a screenshare tool even if you're in the same physical location.

Screenshare tools: GoToMeeting, Join.me, Google Hangouts, Skype, Zoom. Have the user arrive early and make sure the technology is working before the scheduled time.

Find diverse participants, representative of the types of users you are likely to have using your software. If there are vocal critics within your organization, use them as participants to make them feel a part of the project and reduce the likelihood of them poisoning the project.

Tasks

Create a set of tasks for your users to try to complete, e.g. "Post a comment on a blog post", "Reset your user account password", "Delete a contact from your address book". Start specific to get them onto a page and into the software. The list of tasks should be in a Google doc or presentation that they can access, so they can read and interpret each task rather than only having it read aloud to them.
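Purely as an illustration (the session only called for a shared doc), the same task list could also be kept as structured data so notes are easy to tally across the 3-5 participants; every field name, URL, and success criterion below is an assumption:

  // Illustrative only: one possible structure for the shared task list.
  interface UsabilityTask {
    prompt: string;          // wording shown in the doc and read aloud
    startUrl: string;        // specific page that gets the user into the software
    successCriteria: string; // what "done" looks like, for the note taker
  }

  const tasks: UsabilityTask[] = [
    {
      prompt: "Post a comment on a blog post",
      startUrl: "/blog/latest",
      successCriteria: "Comment appears under the post",
    },
    {
      prompt: "Reset your user account password",
      startUrl: "/login",
      successCriteria: "Password reset email is requested",
    },
    {
      prompt: "Delete a contact from your address book",
      startUrl: "/contacts",
      successCriteria: "Contact no longer appears in the list",
    },
  ];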

Prep

Prepare your note-taking device (which is separate from your observation device). Practice recording. Prepare open-ended questions, e.g. "Do you see any other ways to solve this problem?", "Is there anything else you'd like to look at on this page?"

Test

Ask users to express their thoughts aloud as they are testing. Recording facial expressions (via webcam) is useful but definitely optional. Do not comment with "that's great" or "ugh! that's wrong". There are no right or wrong answers; this isn't a test of the user. The user cannot be wrong, as they are just expressing their intuition when using the software. Test for a total of 30-60 minutes, including setup and finishing questions. That leaves 10-30 minutes of deep, actual testing.

Log issues

Consolidate the largest problems and log them in your issue tracking software (use an issue tracker!). Iterate on the changes and then repeat the process.
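If, say, GitHub happened to be your issue tracker, the consolidated findings could even be filed programmatically; a rough sketch against the GitHub REST API, where OWNER/REPO, the token, and the example finding are all placeholders (requires Node 18+ for built-in fetch):

  // Rough sketch: file consolidated usability findings as GitHub issues.
  // OWNER/REPO, GITHUB_TOKEN, and the example finding are placeholders.
  const findings = [
    {
      title: "Participants could not find the password reset link",
      body: "Observed during the latest round of usability tests; see session recordings.",
    },
  ];

  async function fileFindings(): Promise<void> {
    for (const finding of findings) {
      const response = await fetch("https://api.github.com/repos/OWNER/REPO/issues", {
        method: "POST",
        headers: {
          Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
          Accept: "application/vnd.github+json",
          "Content-Type": "application/json",
        },
        body: JSON.stringify(finding),
      });
      if (!response.ok) {
        throw new Error(`Creating issue failed: ${response.status}`);
      }
    }
  }

  fileFindings().catch(console.error);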

Usability testing is not training!