UX in universities (UX Soup)


I have been interviewed for the podcast UX Soup. The host [Chris Schreiner](https://www.linkedin.com/in/chrisschreiner/) was interested in the [User Experience Service’s work at the University of Edinburgh](https://www.ed.ac.uk/information-services/user-experience). He spoke with me about:

* how the consultancy model works in a higher education context
* the history of our service
* the projects we get involved with
* the methodologies we follow
* the specific challenges we face working in higher education

It was good fun being interviewed. Please have a listen if you have the time. Thank you to Chris for the opportunity.

Contextual enquiry with members of staff working with course materials digitally (Website and Communications Blog)

Staff foam board

As part of our comprehensive programme of user research in support of the Learn Foundations project, the User Experience Service has conducted contextual enquiry to better understand the contexts and needs of staff members working with Learn. This blog post summarises our findings.

Interviews with students to understand users’ needs and contexts around Learn (Website and Communications Blog)

Foam board summarising insights from interviews with students

This post summarises the key findings from a set of user interviews I conducted with students about their needs around accessing course materials digitally. It covers just one strand of the Learn Foundations project, which I still have much more to write about.

After analysing and synthesising the insights gathered through the interviews, we built up a picture of how and why students’ experience with Learn varies throughout the year as students attempt to complete different tasks. This is presented as a semester in the life of students using Learn.

The hunt for missing expectations

Jared Spool tells the story of a bookkeeper who became frustrated using Google Sheets because it didn’t have a double underline function.

> To keep [usability] testing simple and under control, we often define the outcomes we want. For example, in testing Google Spreadsheet, we might have a profit and loss statement we’d want participants to make. To make it clear what we were expecting, we might show the final report we’d like them to make.
>
> Since we never thought about the importance of double underlines, our sample final report wouldn’t have them. Our participant, wanting to do what we’ve asked of her, would unlikely add double underlines in. Our bias is reflected in the test results and we won’t uncover the missing expectation.

He suggests interview-based task design as a way of finding these missing expectations: start a session with an interview to discover what the participant expects, then construct a usability test task based on that.

I recently ran hybrid interviews and usability tests, but that was for expediency; I didn’t base the tasks on what I’d found in the interviews. Still, it’s good to know I wasn’t completely barking up the wrong tree. I plan to use this approach in future.

Presenting findings of our user research for the API Service

User experience research for the University of Edinburgh’s API Service

I have been leading some user research for a project at the University of Edinburgh to develop an API Service. [This post on the University Website Programme blog outlines the steps we went through in the first phase of the research](https://website-programme-blog.is.ed.ac.uk/user-experience-research-for-the-api-service/). This included interviewing developers, running workshops, and developing personas and journey maps.

This has been a successful and rewarding project. It has been particularly interesting for me to do some UX work that wasn’t centred on a website. There are a couple more blog posts about it to come.