Context-Aware Computing
(or, why context needs wearables and wearables need context)
Bradley Rhodes
Most desktop computer applications have an explicit user interface that
expects you to specify exactly what you want the computer to do. Wearable
computers will certainly be able to run the standard desktop applications like
word processing, spreadsheets, and databases, but to expect these to be the
primary applications for wearables is to ignore the vast potential for
wearables to be more than simply highly portable computers. In short, it is
making the same mistake people made when they looked at the first PCs from the
perspective of mainframe computers and assumed they would be used to keep
recipe databases in the kitchen.
Unlike desktop computers, wearable computers have the potential to "see"
as the user sees, "hear" as the user hears, and experience the life of the
user in a "first-person" sense. They can sense the user's physical
environment much more completely than previously possible, and in many more
situations. This makes them excellent platforms for applications where the
computer is working even when you aren't giving explicit commands. Health
monitors, communications systems, just-in-time information systems, and
applications that control real-world devices for you are all examples of these
contextually aware / agent applications. Wearables also need these new kinds
of applications more than desktop computers do. When sitting at a desktop
computer you can expect your user to be interacting with the screen directly.
The user's primary task is working with the computer. With wearables, most of
the time the user is doing something besides interacting with the computer.
They might be crossing the street, or engaged in conversation, or fixing a
Boeing 777 jet engine. In most cases the wearable is there in a support role
at best, and may even be an active distraction from the user's primary task.
In these situations the computer can't rely on the user to tell it everything
to do, and so it needs information from the wearer's environment. For example,
imagine an interface that is aware of the user's location and activity: on the
subway, the system might alert the wearer with a spoken summary of an e-mail,
while during a face-to-face conversation it might instead present the name of
a potential caller unobtrusively in the user's head-up display, or simply
forward the call to voicemail.
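The notification behavior described above can be sketched as a simple policy
that maps the wearer's current context to an output channel. This is a minimal
illustrative sketch, not code from any of the systems listed below; all names
(Context, deliver, the channel strings) are hypothetical.

```python
# Hypothetical sketch of a context-dependent notification policy:
# the same incoming event is delivered differently depending on what
# the wearer is currently doing. All names here are illustrative.

from enum import Enum, auto

class Context(Enum):
    IDLE = auto()          # wearer is not otherwise occupied
    COMMUTING = auto()     # e.g. riding the subway
    CONVERSATION = auto()  # talking with someone face to face

def deliver(event_kind: str, summary: str, context: Context) -> str:
    """Choose an output channel based on the wearer's current context."""
    if context is Context.COMMUTING:
        # Eyes and hands are busy, but audio is available:
        # speak a short summary of the message.
        return f"SPEAK: {summary}"
    if context is Context.CONVERSATION:
        if event_kind == "call":
            # Don't interrupt the conversation: show the caller's name
            # in the head-up display and route the call to voicemail.
            return f"HUD: incoming call ({summary}) -> voicemail"
        # Defer non-urgent notifications until the conversation ends.
        return "DEFER"
    # No competing task: use the normal visual interface.
    return f"DISPLAY: {summary}"
```

A real system would of course have to infer the context from sensors rather
than receive it as a parameter; that inference problem is exactly what several
of the papers below address.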
MIT Papers on Context Awareness in Wearable Computing
Below is a list of papers and projects here at the MIT Media Lab on
contextually aware applications for wearable computers. Readers are invited to
contact the author of individual papers for more information.
- Nomadic Radio: Scaleable and Contextual Notification for Wearable Audio
Messaging (postscript). Nitin Sawhney and Chris Schmandt. To appear in the
Proceedings of the ACM SIGCHI Conference on Human Factors in Computing
Systems, Pittsburgh, May 15-20, 1999.
- Auditory Context Awareness in Wearable Computing (postscript, pdf).
Brian Clarkson, Nitin Sawhney, and Alex Pentland. Workshop on Perceptual
User Interfaces, San Francisco, November 5-6, 1998.
- Visual Contextual Awareness in Wearable Computing (html), Thad Starner,
Bernt Schiele, and Alex Pentland. Proceedings of the International Symposium
on Wearable Computing, Pittsburgh, Pennsylvania, 19-20 October 1998,
pp. 50-57. (Compressed Postscript)
- Speaking and Listening on the Run: Design for Wearable Audio Computing
(compressed postscript), Nitin Sawhney and Chris Schmandt. Proceedings of
the International Symposium on Wearable Computing, Pittsburgh, Pennsylvania,
19-20 October 1998, pp. 108-115.
- StartleCam: A Cybernetic Wearable Camera (compressed postscript),
J. Healey and R. W. Picard. Proceedings of the International Symposium on
Wearable Computing, Pittsburgh, Pennsylvania, 19-20 October 1998, pp. 42-49.
- Participatory Simulations: Using Computational Objects to Learn about
Dynamic Systems (html). V. Colella, R. Borovoy, and M. Resnick. Proceedings
of the CHI '98 conference, Los Angeles, April 1998.
- DyPERS: Dynamic Personal Enhanced Reality System (html, compressed
postscript), Tony Jebara, Bernt Schiele, Nuria Oliver, and Alex Pentland.
Vision and Modeling Technical Report #463, May 1998.
- The Wearable Remembrance Agent: A system for augmented memory (html,
compressed postscript), Bradley J. Rhodes. Personal Technologies Special
Issue on Wearable Computing, Personal Technologies (1997) 1(4), pp. 218-224.
See also the earlier version from the Proceedings of the First International
Symposium on Wearable Computers (ISWC '97), Cambridge, Mass., October 1997,
pp. 123-128. (compressed postscript)
- Stochasticks: Augmenting the Billiards Experience with Probabilistic
Vision and Wearable Computers (html, compressed postscript). Tony Jebara,
Cyrus Eyster, Josh Weaver, Thad Starner, and Alex Pentland. In Proceedings
of the International Symposium on Wearable Computers, Cambridge,
Massachusetts, October 1997. Also appears as Vision and Modeling Technical
Report #439.
- Real-Time American Sign Language Recognition Using Desk and Wearable
Computer Based Video (html, compressed postscript), Thad Starner, Joshua
Weaver, and Alex Pentland. Perceptual Computing TR #466. To appear in PAMI
in 1998. Originally submitted 4/26/96.