Jared Spool took UX to a new level in 1988 when he launched UIE. And by "to a new level," we mean "validated UX as a vital component of our work, then spent the next 25 years conducting research and writing tirelessly to keep validating it."
Fortunately for users everywhere, his continued efforts are still paying off.
Jared can often be found onstage, where he captivates crowds with stunning data that reveal how UX can affect a company’s bottom line. He's helped thousands of companies worldwide increase their profits, identify interaction failures, and integrate UX research and design into their product development cycles.
Reaching designers is a priority, too. Aside from these virtual seminars, Jared and the UIE team organize annual conferences chock-full of hands-on, practical workshops led by the best minds in the industry.
The world of metrics and analytics has often been at odds with how designers work. Design is a process in which we finely tune our intuition to create great user experiences. Yet sometimes what we think is best contradicts the metrics. So which do we believe: our gut or the data?
In the world of measures, metrics, and Key Performance Indicators, some practices, like the growth hacking approach to increasing Monthly Active Users (MAUs), have hurt the online experience of Instagram and LinkedIn. Meanwhile, alternatives to satisfaction scores and Net Promoter Score give insight into the design process and help designers develop better instincts.
If you’re ready to talk to your teams about what you really need, help management interpret the data, and create analytical experiments that provide design insights, don’t miss this talk.
There’s never been a better time to be a designer. After years of wishing for recognition and appreciation of the value we bring, we’re now highly sought after for our talents and skills. A growing number of organizations have seen success through great design, from Apple to Cirque du Soleil to the White House. Others now want the same results. The demand for great designers has never been greater.
Yet, as the proverb says, "Be careful what you wish for, lest it come true." Now that everyone expects us to deliver great things, are we ready? While we’re presented with more opportunities than ever, we also face increased challenges.
Are you overwhelmed by the incredible amounts of data you're left with after performing a field study or usability test? Can you effectively analyze the data you've collected, and use your findings to successfully guide your product development process?
Field studies and usability tests produce a vast, often overwhelming amount of quality data. Interacting with users provides great insights, but making sense of what you've learned is a challenge many teams struggle to overcome.
Part of the secret to good data analysis is preparing properly *before* the study begins. Preparing a good set of focus questions, putting together data collection worksheets, and carefully planning participant recruitment will simplify the post-observation analysis. This in turn will help you spend your observation time more productively. In the first part of this presentation, Jared will walk through these techniques, providing examples as we prepare for a real study.
Once you've collected your observational data, the next challenge is to turn it into information the team can act on. In the second part of our seminar, Jared will explain some classic techniques for turning qualitative observations into quantitative analysis. He'll demonstrate participant pairing to extract attributes, attribute ratings, Pugh diagrams, and the K-J Analysis technique. All of these are essential tools for any team's analysis toolbox.
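To give a flavor of how one of these techniques turns qualitative judgments into numbers, here is a minimal sketch of a Pugh diagram (decision matrix) in Python. The concepts, criteria, and scores are invented for illustration, not taken from the seminar.

```python
# Pugh diagram sketch: each design concept is scored against a baseline
# on each criterion (+1 = better than baseline, 0 = same, -1 = worse).
# All names and scores below are hypothetical.
criteria = ["learnability", "task speed", "error recovery"]

scores = {
    "Concept A": [+1, 0, -1],
    "Concept B": [+1, +1, 0],
    "Baseline":  [0, 0, 0],
}

# Rank concepts by their net score relative to the baseline.
totals = {name: sum(s) for name, s in scores.items()}
for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total:+d}")
```

The net score is a crude summary; in practice, teams use the matrix mainly to spark discussion about where a concept wins or loses, not as a final verdict.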