
Project Plan & Timelines

| Activity | Status / Time Estimate | Due Date |
|----------|------------------------|----------|
| Finalize personas (identify & create new) | Complete for UX Walkthrough purposes | 8/14/07 |
| Flesh out scenario list | X *Should review, but we have enough to move forward at this point | 8/14/07 |
| Identify tools for Overview Heuristic | X (see #1, Heuristic Overview, below) | 8/14/07 |
| Acquire needed logins, etc. to bSpace | X *For security reasons these will be shared directly as needed | 8/15/07 |
| Evaluation (usability, accessibility, walkthroughs, individual reports) | Work started | 9/10/07 |
| Screen Reader Evaluation (including report) |  | 9/10/07 |
| Write Summary Report (synthesize individual reports) |  | 9/24/07 |
| Presentation for Summit? |  | 9/24/07 |
| Total | 6 weeks |  |


UX Walkthrough Work Breakdown

  1. Heuristic Overview of Specific Tools: 
    • Tools Heuristic Overview: https://bspace-qa.berkeley.edu/portal/; we are using the '107 Plant Morphology' site as our target since it is suggested as a "typical" course site.  It is a real Berkeley course from the Winter 2007 term.  Confidential information has been scrambled.  NOTE:  Do not send email or notifications -- they will go out to students in the class.
    • Daphne will add evaluators as participants to the site in the role they are evaluating from: instructor, student, teaching assistant, researcher
    • Announcements, Home, Assignment, Drop box, Email Archive, Forums, Mail tool, News, Quiz & Survey, Resources, Syllabus, Web content, Wiki (Quiz & Survey and Wiki were de-scoped for this round; they are large, complex, and very important tools in the content-management realm that we'll look at in the next rounds)
    • done individually by each team member (someone with Sakai knowledge can act as a "helper" for those who are not familiar with it)
    • "traditional" heuristic style
    • Use Heuristic Evaluation
  2. Cognitive walkthroughs
    • Scenario Walkthroughs: http://bspace.berkeley.edu;
      Each evaluator uses a unique instructor login created for the evaluation.  Logins can be sent via email.
    • done in groups of two or more
    • walk through specific "likely" (identified) activities as the user -- uses the personas
    • walk through each scenario again, based on a description of how a disabled person would complete the task differently:
      • Keyboard only
      • Screen reader
  3. Technical evaluation
    • individually or in a group -- does this need a specific skill set?
    • review the code 'under the covers' of the Content Management tools against the guidelines
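As a concrete illustration of the kind of check the technical evaluation might automate (a hypothetical sketch, not the team's actual method), here is a minimal scan for `<img>` elements missing an `alt` attribute, one of the most common accessibility-guideline violations (text alternatives for non-text content).  The class name and sample markup are invented for the example:

```python
# Hypothetical sketch of one automated check for a technical evaluation:
# flag <img> elements that lack an alt attribute (text alternatives).
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects the src of every <img> tag that has no alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of offending images

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs parsed from the tag
        if tag == "img":
            names = {name for name, _ in attrs}
            if "alt" not in names:
                self.missing.append(dict(attrs).get("src", "(no src)"))

# Example markup such as a tool page might render (hypothetical):
sample = '<p><img src="logo.gif" alt="bSpace logo"><img src="icon.gif"></p>'
checker = MissingAltChecker()
checker.feed(sample)
print(checker.missing)  # images with no text alternative
```

A real pass would fetch each tool's rendered pages and report the offending elements alongside the guideline they violate, so findings can be pulled into the issue tables.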

Reporting

After the first pilot evaluation we've decided to continue working with the template posted on the wiki.  There were some concerns expressed about the readability of the report, and we have since made some adjustments:

  • break the issue tables up by tool for the heuristic overview and by scenario for the cognitive walkthroughs (step-by-step findings are reported at the end of the template; issues should then be pulled out and included in the issue table)
  • add columns for principle and suggested solution to the cognitive walkthrough tables.  We'll want to link between these tables and the main issue table somehow to keep duplicated information in sync.
  • individuals will use the template on the wiki or in a Word document -- whatever works best for them.  The finalized, synthesized version (from all evaluators) will live in the Confluence space to allow for easy linking to screenshots, etc.
    • [Daphne]  I'm currently using a Word document, adding screenshots in an appendix and cross-referencing.  I can see where this will cause some duplicate effort, but I'm trying to at least organize the screenshots and name them in a way that will make sense when I move everything to Confluence.  Because of past difficulties with Confluence, I've decided to stick with the Word doc while I analyze and digitize handwritten notes (I'm concerned that Confluence would have me focusing on it rather than on the evaluation results (smile) ).
  • continuously reflect on the usefulness of the form.  The last thing we want is the template to get in the way of the work.

Potential difficulties to keep in mind for process review

  • bSpace idiosyncrasies vs. Sakai generic
    • are we finding issues related only to bSpace, or missing issues that exist in Sakai but not in bSpace (particularly around skins)?
    • does the site's tool configuration make sense?
  • The list of guidelines is quite long -- can we keep them all in mind while working through the evaluation?
  • By focusing on intermediate users (see personas) are we missing anything beginners might have difficulties with?