Abstract

Introduction

Heuristic Evaluation is a usability inspection method, often used in conjunction with usability testing, in which usability experts review a user interface and offer feedback based on a list of heuristics or guidelines (Nielsen and Molich, 1990). It allows designers to get feedback quickly and early in the design process, before a full usability test is done. Unlike many usability evaluation methods, Heuristic Evaluations are performed by usability experts rather than target users, which is one reason it makes a great challenge activity for the UX Day Challenge session.

There are several sets of guidelines, and they have been used to evaluate a wide range of interfaces, from gaming (Pinelle, Wong & Stach, 2008) and virtual reality (Sutcliffe & Gault, 2004) to online shopping (Chen & Macredie, 2005). Some of the most common sets of heuristics were created by Nielsen (Nielsen and Molich, 1990; Nielsen, 1994), Norman (Norman, 2013), Tognazzini (Tognazzini, 1998), and Shneiderman (Shneiderman, Plaisant, Cohen and Elmqvist, 2016). Choosing the best set of guidelines and the most appropriate number of usability professionals is important. Nielsen and Molich found that individual evaluators uncover only 20-51% of usability problems when evaluating alone; when the feedback of three to five evaluators is aggregated, however, more usability problems can be uncovered (Nielsen and Molich, 1990). This method is advantageous because designers can get quick feedback early for iteration, before a full round of usability testing is performed. The goal of this session is to introduce this method to some participants and give others a refresher on how to apply it in the real world.

The Challenge

For several years, UX Day has offered an alternative session.
The most intriguing sessions have been interactive and offered hands-on training. For this UX Day Challenge session, teams of at most five participants will perform a Heuristic Evaluation of a sponsor's website or product. During the session, participants will be introduced to Heuristic Evaluations, covering topics such as how to perform one, who should perform one, and when it is appropriate to perform one. Additionally, the pros and cons of using this method will be discussed.

Following the introduction to Heuristic Evaluation, teams will use the updated set of Nielsen heuristics (Nielsen, 1994) for the evaluation exercise. Although there are several sets of heuristics, Nielsen's is one of the best known and most widely accepted. The following updated Nielsen heuristics will be used:

• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation

Following the evaluation period, teams will report their findings and recommendations to the judges and audience. The judges will then deliberate and announce the winner.

Conclusion

This alternative session will be an opportunity to expose participants to a methodology they may not use often. It will also offer a hands-on learning experience for students who have not formally used this methodology in the real world. Most importantly, this session continues the goal of bringing new, interesting, and disruptive sessions to the traditional "conference" format and attracting UX practitioners.
