Archives for February 2014
February 12, 2014
Yeah, I’m jumping on the QC bandwagon while it’s hot. Last week, like so many others, I read the Fast Company article celebrating Facebook’s release of Origami, a plugin for Apple’s Quartz Composer graphics tool that aims to make it easier to use for interaction design and interactive prototyping. I had opened QC once or twice before and found it pretty unapproachable, but this time I committed to spending a couple of hours figuring it out. I’m eventually going to write a lil’ tutorial of my own about how to get started prototyping a simple app, but for now I wanted to share a few useful starting points and tutorials, and a few less-useful personal reactions and observations.
QC is clearly a very powerful tool with generative graphics and animation capabilities that go well beyond what I’d need for even complex UI prototyping. There are some full-on epic examples of what’s possible woven into this discussion thread on Branch.com. For example, check out this prototype of Facebook Home’s navigation complete with “gravity” effects. It’s evident that mastery of this tool gives an interaction designer a ton of knobs to tweak when prototyping and demonstrating rich animations and interactions.
QC, even with Origami, is definitely not a “photoshop for interaction design”. The Fast Company piece calls it that in the title, so it’s not my fault for expecting it. Origami makes it promising, but it’s nowhere near a one-stop shop. You’ll need to use Photoshop (or Illustrator, or Sketch) upstream from QC in your workflow to design, create and cut graphic assets—just like you would when designing an app—to import into QC and bring to life. The size-cut-import workflow actually reminded me of the good old days of building Flash apps, just without any ability to create vector shapes or type.
The mental model also takes getting used to if you’re coming from the web, where a div containing an image has a set of CSS rules like position:absolute;top:10px; applied to it directly. In QC, if I want to apply that same 10-pixel offset to an image, I need to take the image and manipulate it indirectly using what QC calls a Transform patch. That, and there’s the typical beginner’s frustration of knowing that I could build a tab bar in 10 minutes MY WAY, but instead I’m sitting here connecting those fucking tiny little wires to boxes repeatedly and seriously, can’t I just $('.tab').on('click') or something and make it work already?
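For contrast, here’s a rough sketch of the ten-minute web version of a tab bar I was pining for. This is hypothetical illustration, not anything from QC or Origami: makeTabBar and clickTab are made-up names, and the DOM is modeled as plain data so the toggle logic stands on its own.

```javascript
// In a browser this whole thing is roughly:
//   $('.tab').on('click', function () { activate(this); });
// Modeled here without a DOM: a tab bar is just labels plus an active index.
function makeTabBar(labels) {
  return { labels: labels, active: 0 };
}

// "Clicking" a tab simply moves the active index, ignoring out-of-range taps.
function clickTab(bar, index) {
  if (index >= 0 && index < bar.labels.length) {
    bar.active = index;
  }
  return bar;
}

const bar = makeTabBar(['Home', 'Search', 'Profile']);
clickTab(bar, 2);
console.log(bar.labels[bar.active]); // 'Profile'
```

In QC, the equivalent behavior means wiring a hit-test patch to a Transform patch per tab, which is exactly the indirection that trips up a web-trained brain at first.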
There really aren’t that many QC tutorials that focus specifically on interaction design and prototyping. It seems like there’s a community of VFX artists and, specifically, VJs who use the tool to great effect and have made both tutorials and tons of sample files available on quartzcomposer.com and kineme.net. It’s a bit harder to hunt down what applies specifically to an interaction designer’s use case.
QC and Origami Resources
Dave O’Brien created an epic set of video walkthroughs where he recreates Facebook Home using Quartz Composer. Confession: I didn’t really watch past the second one. I bet that if I did, I’d like QC more.
And whoever these Prabrothers are, they appear to be on track to write a super-comprehensive guide to using QC for prototyping. The first chapter and the glossary are the two best introductions I found.
This Hackdesign lesson on QC has links to those two and more.
Finally, the example compositions from Origami’s own documentation were perhaps the most enlightening of all. They have these helpful little hints nestled within the different editors, not to mention very explicit descriptions of what each of the Origami patches does.
February 03, 2014
On January 1, I publicly declared a self-imposed challenge that I’d write something on this blog every day. It took me three days to cheat.
So then on January 5, I changed those self-imposed rules of the game from “writing” to “publishing” every day. And here’s what January 2014 looked like in terms of sharing with the world.
I realize this isn’t really an impressive number and a lot of people do this before breakfast or from the bar before their second drink. But we’re talking about a guy who tweeted only 8 times in October of last year and only 5 times in August. So let me enjoy it.
I fell off in the second half of the month, mainly because I was spending would-be blogging time getting ready for public speaking time (see below). Still, that’s five more posts than the single one I published in October 2013, and infinitely more than the zero posts of November and December.
Well, I gave two talks (one at UCLA and one at General Assembly) and sat on one panel at GA. I’m further changing the rules of the game by adding public speaking as another form of “publishing”. It works since a goal of this challenge is to turn myself into a courageous idea-spewing machine impervious to any fear of criticism or exposure, and public speaking has historically given me a lot of anxiety. Spoiler alert: the more I do it, the easier it’s getting.
Arguably these are just a by-product of giving talks, but I count them anyway.
I have absolutely no aspiration that these are going to take me any further than the second bedroom that I make them in, but DJing used to be a huge source of enjoyment for me and before this month I literally hadn’t touched it in almost two years.
I launched an Instagram x Vine mashup, Vinestatube, that I had been messing with for an hour here and an hour there on-and-off since the middle of 2013. I’m pretty sure I wouldn’t have landed the plane on this without the extra fire that this challenge put under my ass. Maybe next month I’ll actually tell someone about it and get a user or two on there.
I’m not gonna lie for a second: I’m proud of myself. While I don’t think my writing has gotten stronger or more confident yet, I also don’t find myself infinitely postponing that single click on the “Publish” button. On the other hand, I’m definitely starting to feel like I’m finding a speaking voice and gaining more confidence in front of a room. And as a side effect, I dusted off a hobby and made myself launch a side project even if it was a bit unfinished.
February 03, 2014
I wrote a couple of posts about outfitting our discount usability testing and user research lab at Fullscreen with software, hardware and a physical space. In this third and final post, we get to put it through its paces for the first time.
Faking It (since we don’t want to make it yet)
The team used Keynote to prototype two different takes on a feature we’re building into the Creator Platform. Keynote is basically awesome for early-stage interactive prototypes: not only is it super-quick to learn and use (especially if you start with something like Keynotopia’s UI stencils), but its relatively simplistic design features force a lower-fidelity approach than using Photoshop to create screens for a prototype. Bonus: it’s free!
It has limitations, though: for example, you can’t scroll within a screen. Since one of the concepts we wanted to test relied heavily on vertical scrolling, we had to (really inelegantly) fake it. This particular concept tested unanimously worse than the other one, which didn’t happen to require such overt trickery. And while I don’t attribute that result to the scrolling fakery alone, I’m sure the hacks didn’t make the experience any easier to understand.
More faking: we used a static screenshot of browser chrome to frame the content of each prototype and make it seem minimally real. It worked so well that within a couple of minutes someone tried clicking the static browser “back” button. A quick adjustment fixed it: we built a fake back button by placing a transparent hyperlink over the static one, linking to the previous slide. It was a good reminder that anything we present users with during testing is fair game, and the more we can anticipate, the more realistic the interactions we’ll be able to observe.
Since we were testing prototypes of a web app, there was no need for much of the laundry list of equipment that we’ll need to properly capture mobile user testing. Just a MacBook running Silverback to record a screencast and the user’s reactions. It worked pretty well for capturing the 45-minute sessions, although on playback we sometimes had a tough time making out what the subjects were saying. Next time I’ll definitely try an external mic, which should take care of that.
Once the sessions were wrapped up, we got the team together to watch them and talk through the results. This almost didn’t happen because I didn’t realize that you actually need to export recordings from Silverback, which can take a good amount of time for these 45-minute sessions. Watching low-res previews within Silverback worked in a pinch, but next time I’ll make sure to allow for a little bit of time to get those exports done before getting everyone in a room.
All in all, testing web app prototypes was pretty seamless. I’m looking forward to getting some users in to do some mobile testing next time.