Fluid is always looking for volunteers and we're always refining our roadmap. If you are interested in participating in the community, please get in touch with us and we will work with you to get started on some exciting solutions.
Roadmapping is largely shaped by the conversations we have about how our work fits together and how we can organize ourselves logically. This is a living document.
The Fluid Engage project had an all-hands meeting in late June and we are off and running on a good number of foundational activities. That foundation includes building fleshed-out interfaces for some of the solutions we've been working on: services and iPhone. We will aim to "ship" an early version of both services and an iPhone-ready web solution by the end of Q3.
"Tagging": browse, search, view, tag, and collect objects
(projected for end of September)
Our goals in Q3 are intended to help us build the foundation to support "micro-engagements" with our museum grant partners. Micro-engagements are small subprojects within Fluid Engage that we will work on with our grant partner museums. These micro-engagements will be organized around a local need and will give us an opportunity to work on integration while grounding our development and design in real use cases.
This quarter we will be working closely with the museum partners to articulate these micro-engagements in preparation for integration activities in Q4.
The following are some early thoughts for micro-engagements with each of our core museum partners. These ideas are preliminary and will become clearer as we work more closely with our museum partners on the project.
McCord has expressed an interest in working with the mobile FSS system as well as extending their in-house solutions toward an RFID-based system.
- get to a productionized mobile solution
- keep an eye on RFID
Jason shared an email with us that outlined some very, very early thoughts for a micro-engagement.
- digital souvenir
From the email:
"#1: Primarily, we started thinking about how video as artifact support could
be really useful. You'll see when you visit, but we do have some monitors
that loop a clip or two to show an artifact in its original context, but we
don't want to (or have the money or the space to) dedicate a monitor to
every single object on the floor of the gallery. Instead, visitors could
access the video on their own devices.
#2: the Magic Window. What would be cool and useful is the ability to frame
an object with your mobile device's camera, have the device recognize the
object, and then have the ability to "click" it and have the artifact info
(including video) pop up. The metaphor is the Magic Window, as if it's a
window you are peering through into an alternate reality."
Through initial conversations with the DIA staff, it seems as though there is an opportunity to extend some of their "interpretives" with technology. One such area might be to enable the staff to extend the "Family Fitting Room" digitally.
- museum staff tour-creation tools, with a gentle first step toward putting them in the hands of the user
- start on achieving this by first working on a tagged map
We have already visited the McCord Museum, and a few teams will visit the MMI and DIA this quarter to gather more in-depth information about context, capabilities, and goals.
Teams for visits:
- working team: Tona
- working team: Vicki
view, browse, tag, and collect objects in an exhibit
In this order, the above will be our focus for tagging work in anticipation of Engage 0.1 (released end of September?)
Though vaguely defined here, our first order of business is to clarify our approach to these ways of interacting with objects and then work on producing solutions in this area.
- development will continue work on the exhibits and objects components in the following areas:
- browsing, searching, collecting, tagging, and viewing
- related artifacts
- design will work on creating early wireframes inspired by the mobile wireframes and the tag interactions they contain.
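To make the browse/search/tag/collect interactions above concrete, here is a minimal sketch of the underlying client-side model. The names (`makeCollection`, `searchByTag`, etc.) are illustrative assumptions, not part of Infusion or the Engage components:

```javascript
// Minimal sketch of an artifact store supporting the tagging
// interactions described above: add, tag, search, and collect.
// All names here are hypothetical, for illustration only.
function makeCollection() {
    var artifacts = {};    // id -> { id, title, tags }
    var myCollection = []; // ids the visitor has collected
    return {
        add: function (id, title) {
            artifacts[id] = { id: id, title: title, tags: [] };
        },
        // Attach a visitor-supplied tag to an artifact (no duplicates).
        tag: function (id, tag) {
            var a = artifacts[id];
            if (a && a.tags.indexOf(tag) === -1) { a.tags.push(tag); }
        },
        // Return all artifacts carrying the given tag.
        searchByTag: function (tag) {
            return Object.keys(artifacts)
                .map(function (id) { return artifacts[id]; })
                .filter(function (a) { return a.tags.indexOf(tag) !== -1; });
        },
        // Add an artifact to the visitor's personal collection.
        collect: function (id) {
            if (artifacts[id] && myCollection.indexOf(id) === -1) {
                myCollection.push(id);
            }
        },
        collected: function () {
            return myCollection.map(function (id) { return artifacts[id]; });
        }
    };
}
```

In a real Engage component the artifact data would come from the museum's collection service rather than being held in memory, but the interaction shapes would be similar.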
The Q3 goals for mobile work are to develop a fully fleshed-out iPhone solution with some basic functionality, plus a working demo on another platform.
- develop and release an iPhone solution in a few weeks
- move onto Android development
- design is working on high-fidelity wireframes for an iPhone interface (not an iPhone app), using the artifacts page from the McCord Museum as both a use case and inspiration for a more generalized solution.
- design is working on some use cases for spatial mapping, diving into the research, and producing early wireframes this quarter. Part of our inspiration on this item will likely come from the DIA maps.
- development is going to implement a simple map plus object-information layers using the HTML5 canvas element and Processing
- come up with an approach
- implement and iterate
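One possible approach to the map-plus-layers work is to keep the object positions as plain data and hit-test taps against them before drawing to a canvas. The sketch below illustrates that idea; the names (`makeMapLayer`, `hitTest`) are assumptions for illustration, not the actual implementation, and drawing is stubbed behind a canvas-like context:

```javascript
// Sketch: a map object layer as plain data, with tap hit-testing.
// A real version would render onto an HTML5 <canvas> 2d context;
// here draw() just takes any object with the same arc/fill API.
function makeMapLayer(objects) {
    // objects: [{ id, x, y, info }] in map coordinates
    return {
        // Return the object whose circular marker (radius r)
        // contains the tap point, or null if none does.
        hitTest: function (tapX, tapY, r) {
            for (var i = 0; i < objects.length; i++) {
                var dx = objects[i].x - tapX;
                var dy = objects[i].y - tapY;
                if (dx * dx + dy * dy <= r * r) { return objects[i]; }
            }
            return null;
        },
        // Draw one marker per object on a canvas 2d context.
        draw: function (ctx, r) {
            objects.forEach(function (o) {
                ctx.beginPath();
                ctx.arc(o.x, o.y, r, 0, 2 * Math.PI);
                ctx.fill();
            });
        }
    };
}
```

Keeping the layer as data should make it straightforward to swap the renderer (canvas vs. Processing) while iterating on the interaction.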
- design will complete some early wireframes with maps.
- first step on Renderer 2.0
- bug fixes and issues
- create a web-based builder for packaging Infusion à la carte
- bug fixing and a11y testing
- a revamp of our nightly build server
- how do we select data to transform to mobile?
- Decide on server-side framework (make a proposed solution to the community)
Research accessibility solutions on mobile devices; we may work with Nokia contacts on this. Relates to: AEGIS SP1 deliverable (http://redmine.atrc.utoronto.ca/issues/show/543)
tagging maps from DIA
- Nokia and Android platform work
- Multimedia component
- audio? from DIA?
- Work on wayfinding: a location-aware mobile solution