January 10, 2014
This is the first of three posts about my experience building a discount usability testing and user research lab with the team at Fullscreen. I’ll talk about why we’re doing this and touch on the software that we’ll use to capture sessions.
Google Ventures-style product design sprints seem to be popping up all over the place these days. Design sprints have become an important piece of our process toolkit at Fullscreen, not just because I love me a good trendy methodology but really because they’ve helped us immensely with quick parallel concept development and rapid validation.
In a product design sprint, you bring ideas from concept to low-fi prototype in less than a week and then get them in front of people—preferably, real users of your product—to see how they hold up. So you need a place to work with the users during these sessions as well as a way to capture the sessions so the whole team can review and “score” them.
Now our relatively new office, despite its many lovely features including a human-size terrarium, doesn’t have one of those fancy interview chambers with a one-way mirror and HD simulcast capabilities to the adjacent observation room’s three flatscreen displays. What we do have, though, is our first design sprint of 2014 starting next week. And we need a way to conduct and capture those user sessions by Friday.
So we’re building a user testing lab that makes use of our flexible space and can be set up and taken down with minimal effort, that can handle both desktop and mobile prototypes, and that costs less than $1000 including devices. I’m going to document this exercise for posterity, starting with the software we’ve chosen to capture the sessions, then the hardware and the space itself, and finally what will hopefully be a triumphant recounting of the first user sessions we run in it.
Our discount usability lab needs to record two things: the interaction with the prototypes onscreen and the user’s face while they’re interacting with it. We want this captured in a single video file so it’s easy to review and discuss later.
A few years back I used an app called Silverback to record some user testing done while building a group messaging app called Volly. Silverback doesn’t seem to have changed all that much since then, but that’s a good thing. It’s designed for doing exactly what we need: recording two streams of video—a screencast and the user’s face—so we can get the full brunt of their reactions and thought process. It’ll use your laptop’s built-in iSight by default, but getting it to recognize a USB webcam is as easy as plugging it in. It even records hotspots whenever the user clicks so you can tell exactly what’s going on.
I’ve also heard that ScreenFlow gets the job done despite having an old-timey website, and it also has built-in editing features which could be handy if you also wanted to use it for something else, like recording product demos or tutorials. ScreenFlow’s extra features have a $99 price tag, though; since Silverback is only $70, that’s what I went with.
We’re sorted for capturing desktop user sessions. But what about testing mobile apps and prototypes? My first thought was to try awkwardly clamping a USB document camera to the mobile device. But this post by one of the creators of Silverback points out how ridiculously simple it is to mirror any iOS device using the $13 Reflector app.
With Reflector, you can mirror whatever’s happening on any iOS device that supports AirPlay mirroring (devices with just plain old AirPlay need not apply; my old iPhone 4 was a no-go). Open Reflector, and your computer will appear as an AirPlay receiver to your phone. From there, it literally just works.
That is, for iOS. Looming large is the day when we find ourselves wanting to capture user testing on Android devices. I have no doubt that it won’t be anywhere near as painless as Reflector makes it for iOS. But as someone wiser than myself once said, any testing is better than no testing at all.