Archive — Usability testing

Repeat after me: Preference testing is not A/B testing (David Travis, Userfocus)


Reasons why you shouldn’t simply ask users to choose which design they prefer.

It turns out people aren’t good at answering this kind of question. People don’t know why, or they don’t care enough to answer, or they may not want to tell you. When asked for an opinion, most people will form one on the spot. Such opinions aren’t carefully considered or deeply held. It’s not that UX researchers don’t care what people like: it’s just risky making important design decisions based on fickle opinions.

User experience isn’t about discovering what people think they want. It’s about finding out what they need.


Encouraging self-service through improving content at the University of Edinburgh (Lauren Tormey, GatherContent)


My awesome colleague Lauren Tormey wrote this blog post about a brilliant project she’s been involved in. She has been collaborating with our Information Services Helpline to reduce unnecessary support calls by iteratively improving content with a regular cycle of usability testing.

Over two summers, we had done work to improve content related to getting a student ID card. This was another case of turning long pages with giant paragraphs into concise step-by-step pages.

From July to September 2017, the IS Helpline received 433 enquiries related to student cards. For this same period in 2018, they received 224, so the figure nearly halved. I repeat: halved.


Improving student experiences in Learn: usability testing showcase and workshop (Informatics Learning Technology Service)


My colleague Alex Burford from the University of Edinburgh School of Informatics has written this great blog post about some usability testing we have conducted in support of the Learn Foundations project.

I thoroughly enjoyed working with Duncan Stephen on this mini project. The feedback was informative, encouraging, and a call to action. I’m looking forward to embedding similar practice across the School for alternative platforms for content delivery.

You can read my own reflections on this work at the Website and Communications team blog.

Each month we are working with a different school to conduct usability testing in Learn, the virtual learning environment, to inform improvements to the Learn service.

This is just one strand of a huge amount of user research I’ve been carrying out for the Learn Foundations project. It’s been a fascinating and very enjoyable project to work on. I’ve been slow to write about it so far, but I’ll be posting much more about it soon.


The hunt for missing expectations

Jared Spool tells the story of a bookkeeper who became frustrated using Google Sheets because it didn’t have a double underline function.

To keep [usability] testing simple and under control, we often define the outcomes we want. For example, in testing Google Spreadsheet, we might have a profit and loss statement we’d want participants to make. To make it clear what we were expecting, we might show the final report we’d like them to make.

Since we never thought about the importance of double underlines, our sample final report wouldn’t have them. Our participant, wanting to do what we’ve asked of her, would be unlikely to add double underlines. Our bias is reflected in the test results and we won’t uncover the missing expectation.

He suggests interview-based task design as a way of finding these missing expectations. Start a session with an interview to discover these expectations. Then construct a usability test task based on that.

I recently ran hybrid interviews and usability tests. That was for expediency. I didn’t base tasks on what I’d found in the interview. But it’s good to know I wasn’t completely barking up the wrong tree. I plan to use this approach in future.
