
“Live prototyping” - a glimpse into the future of mobile UI design

It’s time to build better design tools for the next generation of mobile user interfaces. Why? Wireframing and static mockup tools were designed for the old-school desktop experience. There is a clear gap between what these tools can effectively express and the far more dynamic, tangible, and intimate mobile experience.

Imagine you are designing a “physics-based UI with panels and bubbles that can be flung across the screen”. How would you talk about its dynamics with just a stack of static images? You can’t. In fact, that’s the UI of Facebook Home, and the team used Quartz Composer, a visual programming tool for animation, to create high-fidelity, interactive prototypes.
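
To make the gap concrete, here is a minimal sketch of what a “flingable” panel looks like as running code, written in Swift with UIKit Dynamics. This is not Facebook Home’s actual implementation (which we don’t have); the impulse scale and elasticity are guesses a designer would tune:

```swift
import UIKit

// A flingable panel, sketched with UIKit Dynamics. The panel tracks the
// finger while dragging; on release it is thrown with the gesture's
// velocity and bounces off the screen edges.
class FlingViewController: UIViewController {

    let panel = UIView(frame: CGRect(x: 100, y: 200, width: 120, height: 120))
    var animator: UIDynamicAnimator!

    override func viewDidLoad() {
        super.viewDidLoad()
        panel.backgroundColor = .blue
        view.addSubview(panel)
        animator = UIDynamicAnimator(referenceView: view)
        panel.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:))))
    }

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        switch pan.state {
        case .began:
            animator.removeAllBehaviors()      // let the finger take over
        case .changed:
            let t = pan.translation(in: view)
            panel.center.x += t.x
            panel.center.y += t.y
            pan.setTranslation(.zero, in: view)
        case .ended:
            // Throw the panel with the release velocity. The 1/200 impulse
            // scale is a guess; tuning it is the point of a live prototype.
            let v = pan.velocity(in: view)
            let push = UIPushBehavior(items: [panel], mode: .instantaneous)
            push.pushDirection = CGVector(dx: v.x / 200, dy: v.y / 200)
            animator.addBehavior(push)
            // Bounce elastically off the screen edges.
            let collision = UICollisionBehavior(items: [panel])
            collision.translatesReferenceBoundsIntoBoundary = true
            animator.addBehavior(collision)
            let bounce = UIDynamicItemBehavior(items: [panel])
            bounce.elasticity = 0.6    // change this and re-run to feel the difference
            animator.addBehavior(bounce)
        default:
            break
        }
    }
}
```

No stack of static images can convey how that elasticity value feels; change 0.6 to 0.2 and the personality of the whole interaction changes.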

But that’s not good enough either. Mobile UIs are much more than a bunch of pictures on a screen, animated or not. Mobile devices offer a richer set of interactions, form factors, and features: multi-touch gestures, haptic feedback, radically different screen sizes, sensors, and more. How would you know whether a multi-touch or tilting gesture works for a user story? How much can you learn by staring at a few faked phone frames on a computer screen from 20 inches away? Not much. That is simply not the same intimate experience as using the actual device. You’d want to hold the device in your hand, touch it, swipe it, pinch it, tilt it, shake it, and feel the vibration. However, with today’s state-of-the-art tools, offering this option is too difficult and expensive.
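
A tilt-driven interaction is a good example: it can only be judged in the hand. Here is a minimal Core Motion sketch of one, where the 40-point parallax range is an assumed value that would have to be tuned on the device:

```swift
import UIKit
import CoreMotion

// A tilt-driven parallax card, sketched with Core Motion. Whether tilting
// "works" for a user story can only be judged while holding the device.
class TiltViewController: UIViewController {

    let motion = CMMotionManager()
    let card = UIView(frame: CGRect(x: 120, y: 300, width: 150, height: 150))

    override func viewDidLoad() {
        super.viewDidLoad()
        card.backgroundColor = .darkGray
        view.addSubview(card)

        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let attitude = data?.attitude else { return }
            // Map roll/pitch (radians) to a small on-screen offset.
            // The 40-point range is an assumed value to be tuned in hand.
            self?.card.transform = CGAffineTransform(
                translationX: CGFloat(attitude.roll) * 40,
                y: CGFloat(attitude.pitch) * 40)
        }
    }
}
```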

This gap can be filled by a new generation of tools that empower designers to rapidly create “live prototypes”: high-fidelity prototypes with realistic interfaces and interactions that “live” (run) on the target devices. They are real, working apps that give designers a genuine, accurate feel for their design and let them quickly try different variations and observe the results.
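
In practice, the “quickly try different variations” loop often comes down to pulling the feel-defining numbers into one visible place. A sketch of that idea, where all names and values are illustrative rather than taken from any particular tool:

```swift
import CoreGraphics

// Every number that defines the interaction's feel, gathered in one place.
// A designer edits a value, re-runs on the device, and feels the result.
// (All names and values here are illustrative, not from any particular tool.)
struct PrototypeTweaks {
    static var flingImpulseScale: CGFloat = 1.0 / 200.0  // how hard a fling throws a panel
    static var edgeElasticity: CGFloat = 0.6             // how bouncy the screen edges are
    static var tiltParallaxRange: CGFloat = 40.0         // max parallax offset, in points
}
```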

Looking further into the future of interaction technologies, we will need the right tools to shape our increasingly digital world. How could still mockups possibly capture the interaction of a brainwave-controlled app? What about an interface made of deformable materials that we can see, feel, and manipulate?

Live prototypes, real code running on the target devices, seem to be one solution. And there may be even better options. What are your thoughts?

NOTE: if you run into issues posting a comment here, would you mind sharing your thoughts in this G+ thread?