Thoughts on Recent Usability Testing

One of the things I find most rewarding about working in user experience is the inherent opportunity to talk to customers on a regular basis. Quite simply, it’s built into the process. It feels invigorating to be able to engage in frequent customer conversations as a matter of course, not as a separately planned activity or a “nice-to-have”.

Several weeks ago I had the opportunity to interview five students about using a new form of online content review as part of their study plan. I’d interacted with many similar students in the past, mostly to gather feedback about their study experience for marketing purposes. While it’s always great to hear nice things from a user after their experience has ended, it’s so much more empowering to speak with someone still engaged with the product. As the designer and part-shaper of what they are experiencing, I actually get to hear what’s going well and what isn’t while it still matters, and to take actionable steps to pivot the product in a better direction sooner rather than later.

Our foray into usability testing went well, despite coming together in a short timeframe and taking place remotely. For this first pass, we used GoToMeeting for screen sharing and handled audio/video capture through Adobe Connect. (We’ve since upgraded our GoToMeeting account for native recording capabilities.) Here are five things I learned during the process:

1. There’s a lot of power in thoughtful pauses.

It’s really tempting to fill spans of silence, especially when wanting to be helpful or just to avoid awkwardness. But erring on the side of letting the user elaborate (in particular, knowing just when to wait for them to add to or expand upon something they’ve said) can produce much more meaningful insights than assuming they’ve reached the end of their train of thought and moving on to the next question or prompt.

2. Capturing key takeaways quickly is important.

Although we recorded our sessions and had the complete audio transcribed later on, my co-moderator and I wound up organically texting big “a-ha” moments to each other as they happened during testing. We also held a quick debrief immediately after each session. This really helped us cement emerging themes and develop the framework on which we based our detailed findings.

3. Engagement on both sides matters.

Likely due in part to the nature of remote testing, we had a couple of initially detached participants who warmed up nicely as the session went along. How did we do it? We started out by asking about them—who they were, why they decided to embark on a career in their chosen field, how they handled their day-to-day challenges, what they hoped to achieve in the future. Make the time to relate to your user as a human being and they will usually respond, even if they’re not the most enthusiastic participant at the outset.

4. It’s good to have back-up plans when doing remote testing.

As mentioned, we went with GoToMeeting for our first remote session because of its relative ease of use. GoToMeeting allows the user to share their screen while their likeness stays visible in the lower left-hand corner. As we hadn’t yet set up a paid account, my co-moderator joined the meeting through Adobe Connect, which allowed for video and audio capture.

It was a simple enough setup, but it wasn’t without its hiccups. We ran into browser compatibility issues (particularly with Firefox on PC), along with problems with screen redraw speed and audio/video sync. In cases where it became difficult to hear the user through their computer, we had them dial into a conference line; when we couldn’t see what someone was doing because of a stalled screen, I brought up the interface we were discussing on my end and followed along as the user described what they were seeing.

These workarounds weren’t optimal, but they kept the feedback flowing rather than forcing us to give up entirely.

5. Be prepared for users to ask questions of you.

Our users were already engaged with the product prior to this specific session, so they had formed impressions of the interface, the curriculum, and the overall experience. We had to keep our focus on guiding participants towards the areas where we wanted their feedback, while still being receptive to broader questions they had. I can appreciate how general or unrelated user questions can derail a usability session. For our part, we tried to provide succinct, earnest responses while committing to post-session follow-up on any pressing issues in order to keep things moving.

 