Friday, July 6 - 8am PDT / 10am MDT / 11am EDT / 4pm GMT
Note taker rotation:
Previous meeting notes: 2007-May-11 Meeting Agenda and Notes
(Usability and accessibility Heuristic Evaluations and Cognitive Walkthroughs)
One of the first design activities we are working on in the Fluid project is to identify current user "pain points" by performing heuristic evaluation and cognitive walk-throughs of uPortal, Sakai and Moodle.
"Heuristic evaluation is a discount usability engineering method for quick, cheap, and easy evaluation of a user interface design" (Jakob Nielsen, http://www.useit.com/papers/heuristic/). Design, usability, and accessibility experts will engage in systematic inspections of the user interface with the goal of identifying usability and accessibility problems based on recognized principles ("heuristics"). Our particular technique will combine heuristic evaluations with cognitive walk-throughs of the user interface so that we also look at expected user flows through the system and identify potential work flow problems. Heuristic evaluation isn't, however, meant to be a replacement for watching real users' activity on a system, so we intend to use heuristics in conjunction with user testing.
These evaluations will help us identify areas of the user interface that are most in need of improvement. We can thus prioritize our work on the most important usability and accessibility problems. Based on the findings of these evaluations, we will focus on UI issues that can be solved by designing well-tested, user-centered UI components. These components will encompass common interactions that can be reused within and across community source applications. On the other hand, we don't expect that all problems can be solved by creating UI components. We'll also ensure that findings and identified solutions outside the component realm will be shared with the communities. Heuristic evaluations and walk-throughs will identify areas of focus; we will engage in solid user-centered design practices and user testing to create the right solution.
The working list of heuristics and cognitive walk-through questions for the Fluid project is being compiled at the User Experience Inspection wiki page. We welcome your input, and would very much appreciate additional volunteers to help with the evaluation process.
The checklist is organic and will continue to be refined as we learn from doing the hybrid inspections/evaluations.
Please feel free to add, delete, and/or move your name around this list. This list of names came from volunteers at the May 11th meeting.
Moodle UX Inspection Subgroup
uPortal UX Inspection Subgroup
Sakai UX Inspection Subgroup
Accessibility UX Inspection Subgroup
As a group:
Within the "application" teams:
The process we're putting together has a lot of autonomy in it, and great potential for imaginative discovery. But we also want to achieve as much cross-pollination as we can between the teams, both in setting up the inspection plans and publishing the results. The intention is that each of the teams makes its own plan, but with lots of open discussion so that everyone can see what everyone else is doing and borrow each other's best ideas. This should be easy if we all use the wiki as our common forum for planning and discussing.
Let's consider what is common to all the teams. We can start with a description of what a "protocol" means in this context. Roughly, a protocol is a specification of the tests, processes, scenarios, and goals that determine the procedure for an inspection or walkthrough, along with a description of what information is to be captured.
From a team's point of view, the inspection process rests on three elements:
Equipped with the right three elements, the teams will be able to:
The discovery of common issues with common solutions will be a big win for Fluid and should be kept in mind by everyone at all times as something to strive for. General and reusable solutions have a value that goes far beyond addressing a pain point in a particular product.
Ideally, we can set things up so that each team can work with the three elements (protocol, target, report template) that work best for their product, but with as much sharing as possible and a minimum of re-invention.
It's worth mentioning here that the inspections are bound to produce interesting information about each product that may not fit the report template. It will be up to the teams to find a way to communicate it to the rest of us - probably by recording their observations in the wiki.
There is one inspection team per project/application. Teams are expected to be self-organizing and to form their own plans on how to proceed, but to communicate actively with the other teams on their plans and decisions - primarily through the wiki. Much in our approach is experimental, and it will be valuable to record what works, and what does not. Here is an outline for consideration by the team members and coordinators:
Assumptions: Protocol has been created
With complex and flexible products such as uPortal, Sakai, and Moodle, which are highly configurable, customizable, extendable, and responsive to their local institutional environment, defining a test-bed environment for inspection presents some challenges. Some thoughts and suggestions are expressed in: Defining Inspection Targets.
Jakob Nielsen's description and overview
Applying Heuristics to Perform a Rigorous Accessibility Inspection in a Commercial Context
Heuristic Evaluation Report Example
Usability Heuristic checklists
Accessibility Heuristic checklists