Doing some very important A/B testing.
Archive — A/B testing
Reasons why you shouldn’t simply ask users to choose which design they prefer.
It turns out people aren’t good at answering this kind of question. People don’t know why, or they don’t care enough to answer, or they may not want to tell you. When asked for an opinion, most people will form one on the spot. Such opinions aren’t carefully considered or deeply held. It’s not that UX researchers don’t care what people like: it’s just that it’s risky to make important design decisions based on fickle opinions.
User experience isn’t about discovering what people think they want. It’s about finding out what they need.
Hooked and booked
Following on from an article I linked to a few weeks ago about the dark patterns used by Booking.com to pressurise its users into making decisions, Jeremy Keith follows up with this reflection on why A/B testing, used badly, makes things worse.
A/B testing is a great way of finding out what happens when you introduce a change. But it can’t tell you why.
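To make that concrete, here is a minimal sketch (with made-up conversion numbers) of what an A/B test actually gives you: a statistical answer to "did the change move the metric?", and nothing more. The helper name and the figures are hypothetical, not from any real experiment.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    Returns the z statistic and p-value: evidence of *what* happened,
    but no insight into *why* users behaved differently.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical figures: 120/2400 conversions on A, 150/2400 on B
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
```

Even a significant result here only tells you that B outperformed A on this one metric, in this one period; the "why" still needs qualitative research.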
Part of this is also about a narrow focus on the wrong metrics. If a business decides it simply wants to increase the percentage of people hitting a particular call to action on a webpage, this is the path they will end up on.
If, however, they can find a more sophisticated way to measure long-term customer satisfaction, surely users will feel less stressed, and the business will improve more in the long run.
A/B testing ain’t for settling your disagreements
We’re running experiments based on ideas we’ve had and ignoring the very real possibility that the thing we’re testing doesn’t actually matter to our customers.