
...

Upcoming Meeting

Friday, June 8 - 8am PDT / 9am MDT / 11am EDT / 3pm GMT

Breeze: http://breeze.yorku.ca/fluidwork/

Agenda:

  • Updates on subproject work
  • Discuss reporting format (please take a look at, comment on, or edit the example template)
  • Other?

Note taker rotation:  

Previous meeting notes: 2007-May-11 Meeting Agenda and Notes

User Experience Inspections of uPortal, Moodle, Sakai

(Usability and Accessibility Heuristic Evaluations and Cognitive Walkthroughs)

One of the first design activities we are working on in the Fluid project is to identify current user "pain points" by performing heuristic evaluations and cognitive walkthroughs of uPortal, Sakai, and Moodle.

"Heuristic evaluation is a discount usability engineering method for quick, cheap, and easy evaluation of a user interface design" (Jacob Nielson, http://www.useit.com/papers/heuristic/). Design, usability, and accessibility experts will engage in systematic inspections of the the user interface with the goal of identifying usability and accessibility problems based on recognized principles ("heuristics"). Our particular technique will combine heuristic evaluations with cognitive walk-throughs of the user interface so that we also look at expected user flows through the system and identify potential work flow problems. Heuristic evaluation isn't, however, meant to be a replacement for watching real users' activity on a system, so we intend to use heuristics in conjunction with user testing.

These evaluations will help us identify areas of the user interface that are most in need of improvement. We can thus prioritize our work on the most important usability and accessibility problems. Based on the findings of these evaluations, we will focus on UI issues that can be solved by designing well-tested, user-centered UI components. These components will encompass common interactions that can be reused within and across community source applications. On the other hand, we don't expect that all problems can be solved by creating UI components. We'll also ensure that findings and identified solutions outside the component realm are shared with the communities. Heuristic evaluations and walkthroughs will identify areas of focus; we will engage in solid user-centered design practices and user testing to create the right solution.

The working list of heuristics and cognitive walkthrough questions for the Fluid project is being compiled at the User Experience Inspection wiki page. We welcome your input, and would very much appreciate additional volunteers to help with the evaluation process.

User Experience Inspection Protocol / Checklist

What are Fluid UX Walkthroughs?

Fluid UX Walkthroughs are a combination of usability and accessibility reviews of Fluid partner applications, with the goal of identifying user "pain points," and then proposing and prioritizing user interface improvements to address them. Read more about UX Walkthroughs

Evaluations are conducted as:

  • Heuristic evaluations - comparing the user interface against an accepted set of "rules" or heuristics.
  • Cognitive walkthroughs - determining how easily a user can accomplish representative tasks in the application.
  • Code convention compliance reviews - evaluating compliance with a set of best practices for achieving accessibility and usability.
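
However a team records its work, each of the three review types ultimately produces findings with a common shape. As one way to picture that shape, here is a small, hypothetical Python sketch; the field names are illustrative rather than an agreed Fluid format, and the 0-4 severity scale follows Nielsen's convention.

from dataclasses import dataclass
from enum import Enum

class Method(Enum):
    """The three kinds of review used in a Fluid UX Walkthrough."""
    HEURISTIC_EVALUATION = "heuristic evaluation"
    COGNITIVE_WALKTHROUGH = "cognitive walkthrough"
    CODE_COMPLIANCE_REVIEW = "code convention compliance review"

@dataclass
class Finding:
    """One issue uncovered during a review."""
    method: Method       # which review technique surfaced the issue
    location: str        # screen, page, or workflow step where it occurs
    description: str     # what the problem is, in the user's terms
    heuristic: str = ""  # the violated heuristic, for heuristic evaluations
    severity: int = 0    # 0 (not a problem) .. 4 (usability catastrophe)

# Example (invented): a finding from a heuristic evaluation of a login page.
example = Finding(
    method=Method.HEURISTIC_EVALUATION,
    location="Login page",
    description="The error message does not say which field is invalid.",
    heuristic="Help users recognize, diagnose, and recover from errors",
    severity=3,
)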

How are Fluid UX Walkthroughs Performed?

Fluid UX Walkthroughs are performed by reviewers with diverse areas of expertise residing at a number of different institutions. To ensure consistency of approach and results, the following material has been created to guide their efforts.

The protocols and guidelines will continue to be refined as we learn from doing the hybrid inspections/evaluations.

Fluid Project Walkthroughs and Working Groups

Please feel free to add, delete, and/or move your name around on this list. This list of names came from volunteers at the May 11th meeting.

Moodle UX Inspection Subgroup

  • Ron (coordinator)
  • Herb
  • Dave

uPortal UX Inspection Subgroup

  • Paul (coordinator)
  • Tara
  • Daphne
  • Colin
  • Kathy?
  • Gary?

Sakai UX Inspection Subgroup

  • Daphne (coordinator)
  • Kathy
  • Seamus

Accessibility UX Inspection Subgroup

  • Mike (coordinator)
  • Rich
  • Colin
  • Julie

"To Do" list - User Experience Inspection Protocol

As a group: 

  1. In Progress - Create a list of usability/accessibility heuristics - use the draft list to get started. In the spirit of agile development, we'll refactor the list as we learn from the experience of combining the usability and accessibility heuristics and the cognitive walkthrough methods
  2. Agree on evaluation reporting format

Within the "application" teams:

  1. Agree on user profiles

Working Group Coordination

There is one inspection team per project/application. Teams are expected to be self-organizing and to form their own plans for how to proceed, but to communicate actively with the other teams on their plans and decisions - primarily through the wiki. Much of our approach is experimental, and it will be valuable to record what works and what does not. Here is an outline for consideration by the team members and coordinators:

  1. The coordinator arranges an initial team meeting, using the Breeze meeting room or other convenient venue.
  2. The team members identify their areas of experience, expertise, and interest in:
    1. Accessibility - cognitive, visual, etc.
    2. Usability
    3. Cognitive walkthroughs
  3. The team discusses the protocol (See Clayton's outline)
    1. What usability heuristics do the members find most suitable?
    2. What accessibility measures/tools are to be used?
    3. What user profiles are to be assumed?
    4. What cognitive walkthrough scenarios are to be attempted?
    5. What refinements are required in the protocol?
  4. The team assesses coverage. What areas are covered, and with how much (desirable) redundancy? What areas aren't covered?
  5. Team member partnerships are arranged where possible to address usability and accessibility synchronously. Team leads are assigned in areas of expertise.
  6. The team discusses the logistics of actual inspection activities:
    1. What are the problems with geographically distributed teams?
    2. Can the Breeze facility help to overcome the problems?
  7. The team discusses reporting:
    1. Does the proposed template meet the team's needs?
    2. Are refinements to the template required?
    3. What additional information will be reported?
    4. How can results be aggregated with those from other teams (consistency, style, references to heuristic principles, etc.)? One possible shared format is sketched after this outline.
  8. The team determines the test target and records a clear definition of it, sufficient to permit repeatability of the assessments. (See Paul's notes)
  9. The team creates a test plan covering:
    1. Activity assignments
    2. Schedule
    3. Selected heuristics and cognitive walkthrough methods
    4. User profiles
    5. Deliverables (what is to be captured from the inspections)
    6. Reporting template
  10. The test plan is published in the wiki.
  11. The team commences the evaluation process.
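
To make item 7's aggregation question concrete: if every team reports against one shared column set, a cross-team rollup reduces to concatenating files. Here is a hedged Python sketch; the column names and the file name are placeholders, not the agreed template.

import csv

# Hypothetical shared column set; the real template is still under discussion.
COLUMNS = ["team", "product", "location", "method", "heuristic",
           "severity", "description", "recommendation"]

def write_report(path, findings):
    """Write one team's findings (a list of dicts keyed by COLUMNS) so that
    reports from different teams can be concatenated and sorted together."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(findings)

# Invented example row for illustration only.
write_report("uportal-findings.csv", [{
    "team": "uPortal", "product": "uPortal (version to be recorded)",
    "location": "Layout manager", "method": "cognitive walkthrough",
    "heuristic": "User control and freedom", "severity": 3,
    "description": "No undo after removing a channel.",
    "recommendation": "Add an undo affordance or a confirmation step.",
}])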

Evaluation Process (draft)

Assumption: the protocol has been created.

  1. Break the application into "chunks" for evaluation (highest-priority areas first)
  2. Create usage scenarios for cognitive walkthroughs
  3. Individual evaluation by 3-5 evaluators
  4. Synthesize and prioritize findings (one simple approach is sketched after this list)
  5. Brainstorm design session (identify conceptual solutions to high-priority issues). Are there good component candidates?
  6. Write and share the report
  7. Incorporate findings into the communities (some will drive component development; others can be used for general product development)
    1. Sakai - Integrate into the requirements group. Do we need to create JIRA tickets? Are these conceptually "design bugs," and thus do they have a different status than requirements?
    2. Moodle - how does this get fed back into the process?
    3. uPortal - how do we integrate into their requirements process?  Deliver findings to the community?
  8. Look for pain points across applications. Are there issues that one or more components could address well?
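
One simple way to carry out step 4 is to pool each evaluator's independent severity ratings and average them per issue, as Nielsen suggests for heuristic evaluation. A minimal Python sketch; the issues and ratings below are invented.

from collections import defaultdict
from statistics import mean

def prioritize(ratings):
    """ratings: (issue, evaluator, severity) triples, severity 0-4.
    Returns issues ordered by mean severity, highest first."""
    by_issue = defaultdict(list)
    for issue, _evaluator, severity in ratings:
        by_issue[issue].append(severity)
    return sorted(((mean(sevs), issue) for issue, sevs in by_issue.items()),
                  reverse=True)

ratings = [
    ("No undo after delete", "evaluator-1", 4),
    ("No undo after delete", "evaluator-2", 3),
    ("No undo after delete", "evaluator-3", 4),
    ("Low-contrast link text", "evaluator-1", 2),
    ("Low-contrast link text", "evaluator-2", 3),
    ("Low-contrast link text", "evaluator-3", 2),
]
for score, issue in prioritize(ratings):
    print(f"{score:.2f}  {issue}")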

Selecting a Target Instance of a Product for Inspection

With complex and flexible products such as uPortal, Sakai, and Moodle, which are highly configurable, customizable, extendable, and responsive to their local institutional environment, defining a test-bed environment for inspection presents some challenges. Some thoughts and suggestions are expressed in: Defining Inspection Targets.
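
Since the goal is a definition precise enough to make assessments repeatable, one lightweight option is to record the chosen instance as structured data. A hypothetical Python example; every field and value below is illustrative, not a prescribed schema.

# Hypothetical record of an inspection target; all values are placeholders.
inspection_target = {
    "product": "Sakai",                            # application under review
    "version": "2.4.0",                            # exact release, for repeatability
    "instance_url": "https://example.edu/portal",  # placeholder URL
    "skin_or_theme": "default",                    # local customization in effect
    "enabled_tools": ["Resources", "Gradebook"],   # illustrative subset
    "browsers": ["Firefox 2", "Internet Explorer 7"],      # assumed test browsers
    "assistive_tech": ["screen reader", "keyboard-only"],  # accessibility passes
    "snapshot_date": "2007-06-01",                 # when the configuration was frozen
}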

Heuristic evaluation & Cognitive walkthrough reference material:

Jakob Nielsen's description and overview

Applying Heuristics to Perform a Rigorous Accessibility Inspection in a Commercial Context 

Usability.gov

Heuristic Evaluation Report Example

Usability Heuristic checklists

...

Within the Fluid Project, walkthroughs have been performed by walkthrough working groups. Most of the groups have had representatives from two or more institutions, with each group focused on a specific product.
Products examined so far include uPortal, Sakai, and Moodle.

Results from Previous Walkthroughs

Results are collected on the UX Walkthrough Results page.

UX Walkthrough Project Plan

The project plan is maintained on the UX Walkthrough Project Plan page.

Learn more
