Setting the stage: a few things to consider
Prior to the introduction of the iPhone in 2007, many considered the smartphone to be for techies and business folk. But in 2013, you'd be hard-pressed to find someone in the developed world who isn't checking their email or tweeting at random times.
So, it’s understandable to think that a conversation about motion control, 3-D interaction and portable monitors is premature. But if the mobile revolution has taught us anything, it’s that people crave connection without being tethered to a stationary device.
To really understand how user experience (UX) will change, we first have to consider the possibility that social and utilitarian UX will take place in different environments. In the future, people will use the desktop primarily for utilitarian purposes, while "social" UX will happen on a virtual layer overlaying the real world (thanks to devices like Google Glass). Early indicators of this are that Facebook anticipates its mobile growth to outpace its PC growth and that nearly one-seventh of the world's population owns a smartphone.
The only barrier right now is that we lack the technology to truly merge the real and virtual worlds. But I’m getting ahead of myself. Let’s start with something more familiar.
The desktop
Right now, UX on the desktop cannot be truly immersive. Every interaction requires physically dragging a hunk of plastic across a flat surface, which approximates a position on screen. While this is accepted as commonplace, it’s quite unnatural. The desktop is the only environment where you interact with one pixel at a time.
Sure, you could create the illusion of three dimensions with drop shadows and parallax effects, but that doesn’t change the fact that the user may interact with only one portion of the screen at a time.
This is why the Leap Motion controller is revolutionary. It allows you to interact with the virtual environment using all 10 fingers and real-world tools in 3-D space. It is as important to computing as analog sticks were to video games.
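To make that concrete, here is a minimal sketch of reading that data with the LeapJS client library. The loop and the property names (hands, fingers, palmPosition, tipPosition) come from LeapJS; the logging is illustrative only, and the whole thing assumes a controller is plugged in.

```typescript
// A minimal sketch, assuming the LeapJS client library ("leapjs") and a
// connected Leap Motion controller. LeapJS ships no official TypeScript
// typings, hence the `any` on the frame.
import * as Leap from "leapjs";

Leap.loop((frame: any) => {
  // Each frame reports every hand the controller currently tracks.
  for (const hand of frame.hands) {
    const [x, y, z] = hand.palmPosition; // millimetres from the device

    // All five fingers per hand are tracked individually: this is the
    // "all 10 fingers in 3-D space" interaction described above.
    for (const finger of hand.fingers) {
      const [fx, fy, fz] = finger.tipPosition;
      console.log(`fingertip at (${fx | 0}, ${fy | 0}, ${fz | 0})`);
    }
  }
});
```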
The shift in the way we interact with machines
To wrap our heads around just how game-changing this will be, let’s go back to basics. One basic UX and artificial intelligence test for any new platform is a simple game of chess.
(Image: Wikimedia Commons)
In the game of chess below, thanks to motion controllers and webcams, you’ll be able to “reach in” and grab a piece, as you watch your friend stress over which move to make next.
Now you can watch your opponent sweat. (Image: Algernon D'Ammassa)
In a game of The Sims, you'll be able to rearrange furniture by moving it with your hands. CAD designers will use their hands to "physically" manipulate components (and then send their design to the 3-D printer they bought from Staples for prototyping).
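The interaction pattern behind both examples (grab, carry, release) is easy to express in code. Below is a hedged sketch built on LeapJS's grabStrength value, a 0-to-1 measure of how closed the fist is, from Leap's v2 skeletal tracking; pickUpNearest and moveTo are hypothetical stand-ins for your own scene code.

```typescript
// A sketch only: `pickUpNearest` and `moveTo` are hypothetical scene
// helpers, declared here so the file compiles, not part of any real API.
import * as Leap from "leapjs";

declare function pickUpNearest(palm: [number, number, number]): string | null;
declare function moveTo(objectId: string, palm: [number, number, number]): void;

let held: string | null = null; // the virtual object currently being carried

Leap.loop((frame: any) => {
  const hand = frame.hands[0];
  if (!hand) return;

  // grabStrength runs from 0 (open hand) to 1 (closed fist).
  const grabbing = hand.grabStrength > 0.8;

  if (grabbing && held === null) {
    held = pickUpNearest(hand.palmPosition); // close your fist to grab
  } else if (grabbing && held !== null) {
    moveTo(held, hand.palmPosition); // carry the piece (or the sofa)
  } else {
    held = null; // open your hand to drop it
  }
});
```

The 0.8 threshold is a guess; in practice you would tune it and add some hysteresis so that tracking jitter doesn't make the object flicker in and out of your grip.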
While the lack of tactile feedback might deter mainstream adoption early on, research into haptics is already enabling developers to simulate physical feedback that corresponds with the actions of a user's virtual counterpart. Keep this in mind as you continue reading.
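There is no standard desktop haptics API to point to yet, so the sketch below only illustrates the feedback loop that research is working toward: when a virtual fingertip makes contact, the real finger receives a pulse. The HapticGlove interface and everything in it is hypothetical, named purely for illustration.

```typescript
// Entirely hypothetical: a stand-in interface for whatever haptic
// hardware eventually ships, shown only to illustrate the pattern.
interface HapticGlove {
  pulse(finger: number, intensity: number, durationMs: number): void;
}

// Called whenever collision detection reports a fingertip crossing a
// virtual surface; deeper penetration maps to a stronger pulse.
function onFingertipContact(glove: HapticGlove, finger: number, depthMm: number): void {
  const intensity = Math.min(1, depthMm / 10); // clamp to the 0..1 range
  glove.pulse(finger, intensity, 20);
}
```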
Over time, this level of 3-D interactivity will fundamentally change the way we use our desktops and laptops altogether.
Think about it: The desktop is a perfect, quiet, isolated place to do more involved work like writing, photo editing or "hands-on" training to learn something new. However, a 3-D experience like those mentioned above doesn't make sense for social interactions, such as browsing Facebook or reading the news, which are better suited to mobile.
With immersive, interactive experiences available primarily on the desktop, it's hard to imagine users wanting these two kinds of experience to share the same screen.
So, what would a typical desktop experience look like?
