Augmented Reality
Augmented reality refers to the combination of the real and the
virtual to assist the user in his environment. Applications include
telemedicine, architecture, construction, devices for the disabled,
and many others. Several large augmented reality systems already
exist (for example, the Interactive Video Environment system), but a wearable
computer with a small camera and digitizer opens a whole new set of
applications.
Finger Tracking
One of the simplest applications of this camera-based wearable
computer is finger tracking. Many pen computer users appreciate the pen
interface for its drawing capability. However, if the computer can
visually track the user's finger, there is no need for a pen
(see Figure \ref{fig:drawing}). Such an interface allows the user to replace
normal computer pointing devices, such as the mouse, with his finger.
The user can control the operating system in this manner or digitize
an image and virtually annotate it.
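As a rough illustration, the tracking can be done by colour segmentation of
a distinctive marker such as the orange thimble worn in the prototype. The
following Python sketch, which assumes the OpenCV and NumPy libraries purely
for illustration (the actual prototype predates these tools), locates the
marker's centroid in each video frame:
\begin{verbatim}
import numpy as np
import cv2

# Approximate HSV range for an orange marker; these bounds are assumed
# and would be calibrated to the actual thimble and lighting.
MARKER_LOW  = np.array([5, 120, 120])
MARKER_HIGH = np.array([20, 255, 255])

def track_fingertip(frame_bgr):
    """Return the (x, y) pixel position of the colour-marked fingertip,
    or None if the marker is not visible in this frame."""
    hsv  = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, MARKER_LOW, MARKER_HIGH)
    mask = cv2.medianBlur(mask, 5)           # suppress speckle noise
    m = cv2.moments(mask)
    if m["m00"] < 1e-3:                      # marker not found
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # centroid
\end{verbatim}
The centroid would then be mapped to screen coordinates and substituted for
mouse events by the window system.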
\begin{figure} [bth]
\caption{
\label{fig:drawing}
A prototype of a vision-based finger tracking system. As commercial
computing becomes more powerful, the orange thimble will not be
necessary.
}
\end{figure}
Face Recognition
Several years ago, we demonstrated the Photobook
face database system. Photobook has the ability to search a
database of 8000 faces in approximately 1 second on the equivalent of
a high-end 80486 system and return the top 40 closest matches for a
given face. The system is surprisingly tolerant of lighting changes
and facial hair differences. In addition, the user can specify a
combination of faces for the search. This allows a ``mugshot''
application in which a crime victim may be able to search mugbooks much
more quickly than before. An experienced user can find a particular
person in the 8000-face database within a few mouse clicks.
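The underlying search can be sketched as projection of each face into a
low-dimensional face space followed by nearest-neighbour ranking. The
Python/NumPy fragment below is only a schematic of that idea; Photobook's
actual representation and similarity measures are more sophisticated:
\begin{verbatim}
import numpy as np

def build_face_index(gallery, n_components=40):
    """Project a gallery of vectorised, roughly aligned face images
    (one per row) into a low-dimensional ``eigenface'' space."""
    mean    = gallery.mean(axis=0)
    centred = gallery - mean
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    basis   = vt[:n_components]          # principal components (k x pixels)
    coords  = centred @ basis.T          # gallery coordinates (N x k)
    return mean, basis, coords

def top_matches(query, mean, basis, coords, k=40):
    """Return indices of the k gallery faces closest to the query face."""
    q     = (query - mean) @ basis.T
    dists = np.linalg.norm(coords - q, axis=1)
    return np.argsort(dists)[:k]
\end{verbatim}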
With the addition of face-finding software, this system is being
adapted for use in wearable computing. The goal for the system
is to overlay names on faces as the user moves about the world.
Markets include the police, reporters, politicians, the visually
disabled (with an audio interface), and those with bad memories for
faces (the author being included in this last group).
Visual Filter
Steve Mann and I have recently demonstrated a digital visual filter.
The basic concept is to process video images digitally in real time to
assist the user in everyday tasks. For example, users with low vision
find that enhancing the edges in an image helps in face recognition.
Another application is to remap image content around the ``blind spots''
of visually disabled users. Figure \ref{fig:vf} shows yet another
variation, digitally magnifying the image through a virtual fisheye lens
to help with reading. While current wearable computers do not have the
processing power to perform these manipulations in real time, the video
image can be transferred to a base station computer that performs the
transformation and resends the result to the user. Thus
experimentation can be performed until wearable
computers become powerful enough to manipulate video locally.
\begin{figure} [bth]
\centerline{
\epsfysize=1.0in
\epsfbox{vfilter2.ps}
}
\caption{
\label{fig:vf}
A digital video texture-mapped fisheye ``lens'' that can be
used to help those with low vision.
}
\end{figure}
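The fisheye magnifier of Figure \ref{fig:vf} can be approximated with a
radial remapping of the image. The Python sketch below, again assuming the
OpenCV and NumPy libraries for illustration, magnifies the centre of each
frame; the system shown in the figure instead texture-maps the video:
\begin{verbatim}
import numpy as np
import cv2

def fisheye_magnify(image, strength=2.0):
    """Magnify the centre of the image with a radial remap.
    strength > 1 increases the central magnification."""
    h, w   = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    y, x   = np.indices((h, w), dtype=np.float32)
    dx, dy = x - cx, y - cy
    r      = np.sqrt(dx * dx + dy * dy)
    r_max  = np.sqrt(cx * cx + cy * cy)
    # Pixels near the centre sample the source closer to the centre,
    # so the middle of the field of view appears enlarged.
    r_src  = r_max * np.power(r / r_max, strength)
    scale  = r_src / np.maximum(r, 1e-6)
    map_x  = (cx + dx * scale).astype(np.float32)
    map_y  = (cy + dy * scale).astype(np.float32)
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)
\end{verbatim}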
Navigation
The Global Positioning System (GPS) allows private users to find their
position anywhere on the globe to within 100 meters. Naturally,
hooking a GPS system to a wearable computer and mapping software
allows the user to track himself while exploring a city. However,
the resolution is not fine enough in many situations. By using
optical flow (comparing consecutive images to determine the direction
of motion) not only can the movement of a user's head be tracked, but
warnings can be given of approaching objects for the visually
disabled. By implementing local beacons or a dead-reckoning system in
the workplace, much more advanced applications can be developed.
Examples include virtual museum tour guides, automatic wiring and gas
line view overlays in buildings and on streets, and a new computing
environment we like to call the ``reality'' metaphor. The reality
metaphor replaces the typical computer desktop metaphor by overlaying
files onto real world objects. Thus, a filing cabinet may have a
searchable index overlaid on it. Telephones may have virtual phone
directories attached. Virtual 3D ``post-it'' notes and
movies may be applied to objects. Recent electronic
mail messages may be rendered on a co-worker's door (or the co-worker!)
to remind the user of the last communication with that person. Again, such a
system would help provide context-based information in a timely fashion.
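As a sketch of the optical-flow idea, the fragment below (Python with
OpenCV and NumPy, assumed here purely for illustration) estimates the
image-plane motion of the wearer's head from the mean flow between
consecutive frames and flags ``looming,'' a net outward, expanding flow
field, as a crude warning of an approaching object:
\begin{verbatim}
import numpy as np
import cv2

def analyse_ego_motion(prev_gray, curr_gray, loom_threshold=0.5):
    """Return (mean head motion in pixels, looming score, warning flag)
    computed from dense optical flow between two consecutive frames."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    # Mean flow approximates the apparent motion of the wearer's head.
    head_motion = flow.reshape(-1, 2).mean(axis=0)

    # Looming: average projection of the flow onto the outward radial
    # direction; a large positive value means the scene is expanding,
    # i.e. something ahead is getting closer.
    h, w   = prev_gray.shape
    y, x   = np.indices((h, w), dtype=np.float32)
    rx, ry = x - w / 2.0, y - h / 2.0
    norm   = np.sqrt(rx * rx + ry * ry) + 1e-6
    looming = float(((flow[..., 0] * rx + flow[..., 1] * ry) / norm).mean())

    return head_motion, looming, looming > loom_threshold
\end{verbatim}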
Repair Instruction
Since a manufacturer can control the markings on the inside of
his product, why not instrument the product to allow a wearable
camera system to track the object in the user's visual field? By
simply putting three distinctive marks at known distances from each
other, a wearable camera with known focal length can recover the 3D
location of the plane defined by these three marks. Combined with the
object's geometry from an on-line technical manual, the 3D pose of the
rest of the object can be derived. Thus, when a repair technician walks up
to a broken machine, the machine can transmit its diagnostics to the
technician's wearable. The wearable automatically determines the
problem, locates the 3D position of the object, and overlays specific
3D real-time step-by-step guidelines on the object for the
technician to follow. Work on a prototype of such a system is
currently under way.
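The pose-recovery step can be sketched as a perspective-three-point (P3P)
problem. The Python fragment below assumes OpenCV for illustration; the
prototype mentioned above need not use this library, and the function and
parameter names are chosen only for this sketch:
\begin{verbatim}
import numpy as np
import cv2

def locate_marked_plane(image_points, mark_xyz, focal_px, image_size):
    """Recover candidate poses of the plane defined by three marks.

    image_points: (3, 2) pixel coordinates of the detected marks.
    mark_xyz:     (3, 3) mark positions in the object's own frame,
                  known from the manufacturer's specifications.
    focal_px:     camera focal length in pixels.
    """
    w, h = image_size
    camera_matrix = np.array([[focal_px, 0.0, w / 2.0],
                              [0.0, focal_px, h / 2.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)            # assume negligible lens distortion
    n, rvecs, tvecs = cv2.solveP3P(
        mark_xyz.astype(np.float64),
        image_points.astype(np.float64).reshape(-1, 1, 2),
        camera_matrix, dist_coeffs, flags=cv2.SOLVEPNP_AP3P)
    # Three points give up to four geometrically valid poses; a fourth
    # mark or temporal consistency across frames selects among them.
    return list(zip(rvecs, tvecs))
\end{verbatim}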