Research Projects
Chain API
Joseph A. Paradiso, Gershon Dublon, Brian Mayton and Spencer Russell
RESTful services and the Web provide a framework and structure for content delivery that is scalable, not only in size but, more importantly, in use cases. As we in Responsive Environments build systems to collect, process, and deliver sensor data, this project serves as a research platform that can be shared between a variety of projects both inside and outside the group. By leveraging hyperlinks between sensor data resources, clients can browse, explore, and discover their relationships and interactions in ways that can grow over time.
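As an illustration of this hyperlink-driven browsing, the sketch below walks from a site resource to its devices by following link relations; the base URL, the HAL-style `_links` layout, and the `ch:devices` relation are assumptions for the example, not the actual Chain API schema.

```python
import requests

BASE = "http://chain.example.org/sites/"  # placeholder URL, not a real deployment

def follow(url):
    """Fetch a resource and return its parsed JSON body."""
    return requests.get(url, headers={"Accept": "application/json"}).json()

# Start at a site resource and walk its hyperlinks to reach sensor data.
site = follow(BASE)
devices_link = site["_links"]["ch:devices"]["href"]   # assumed HAL-style link relation
devices = follow(devices_link)

for entry in devices["_links"].get("items", []):       # assumed collection layout
    device = follow(entry["href"])
    print(device.get("name"), "->", list(device["_links"].keys()))
```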
Circuit Stickers
Joseph A. Paradiso, Jie Qi, Nan-wei Gong and Leah Buechley
Circuit Stickers is a toolkit for crafting electronics using flexible and sticky electronic pieces. These stickers are created by printing traces on flexible substrates and adding conductive adhesive. These lightweight, flexible, and sticky circuit boards allow us to begin sticking interactivity onto new spaces and interfaces such as clothing, instruments, buildings, and even our bodies.
Circuit Stickers Activity Book
Leah Buechley and Jie Qi
The Circuit Sticker Activity Book is a primer for using circuit stickers to create expressive electronics. Inside are explanations of the stickers, and circuits and templates for building functional electronics directly on the pages of the book. The book covers five topics, from simple LED circuits to crafting switches and sensors. As users complete the circuits, they are also prompted with craft and drawing activities to ensure an expressive and artistic approach to learning and building circuits. Once completed, the book serves as an encyclopedia of techniques to apply to future projects.
Circuit Storybook
Joseph A. Paradiso, Kevin Slavin, Jie Qi and Sonja de Boer
The Circuit Storybook is an interactive picture book that explores storytelling techniques through paper-based circuitry. Sensors, lights, and microcontrollers embedded into the covers, spine, and pages of the book add electronic interactivity to the traditional physical picture book, allowing us to tell new stories in new ways. The current book, "Ellie," tells the adventures of an LED light named Ellie who dreams of becoming a star, and of her journey up to the sky.
DoppelLab: Experiencing Multimodal Sensor Data
Joe Paradiso, Gershon Dublon and Brian Dean Mayton
Doppelmarsh: Cross-Reality Environmental Sensor Data Browser
Joseph A. Paradiso, Gershon Dublon, Donald Derek Haddad, Evan Lynch, Brian Mayton and Spencer Russell
Doppelmarsh is a cross-reality sensor data browser built for experimenting with presence and multimodal sensory experiences. Built on evolving terrain data from a physical wetland landscape, the software integrates real-time data from an environmental sensor network with real-time audio streams and other media from the site. Sensor data is rendered in the scene both as visual representations and as 3D sonification. Users can explore this data by walking on the virtual terrain in a first-person view, or flying high above it. This flexibility allows Doppelmarsh to serve as an interface to other research platforms on the site, such as Quadrasense, an augmented reality UAV system that blends a flying live camera view with a virtual camera from Doppelmarsh. We are currently investigating methods for representing subsurface data, such as soil and water temperatures at depth, as well as automation in scene and terrain painting.
Experiential Lighting: New User-Interfaces for Lighting Control
Joseph A. Paradiso, Matthew Aldrich and Nan Zhao
We are evaluating new methods of interacting and controlling solid-state lighting based on our findings of how participants experience and perceive architectural lighting in our new lighting laboratory (E14-548S). This work, aptly named "Experiential Lighting," reduces the complexity of modern lighting controls (intensity/color/space) into a simple mapping, aided by both human input and sensor measurement. We believe our approach extends beyond general lighting control and is applicable in situations where human-based rankings and preference are critical requirements for control and actuation. We expect our foundational studies to guide future camera-based systems that will inevitably incorporate context in their operation (e.g., Google Glass).
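As a rough sketch of collapsing many lighting parameters into one simple mapping (not the model used in the study), the example below interpolates per-fixture intensity and color-warmth settings between two hypothetical preset scenes using a single control value.

```python
import numpy as np

# Two hypothetical lighting "scenes": per-fixture (intensity, warmth) pairs in [0, 1].
SCENE_FOCUS = np.array([[1.0, 0.3], [0.9, 0.3], [0.4, 0.2]])
SCENE_RELAX = np.array([[0.3, 0.9], [0.2, 0.8], [0.6, 1.0]])

def set_lighting(x):
    """Map a single control value x in [0, 1] to all fixture parameters
    by interpolating between the two preset scenes."""
    x = float(np.clip(x, 0.0, 1.0))
    return (1.0 - x) * SCENE_FOCUS + x * SCENE_RELAX

print(set_lighting(0.25))  # one knob drives intensity and color for every fixture
```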
FingerSynth: Wearable Transducers for Exploring the Environment through Sound
Joseph A. Paradiso and Gershon Dublon
The FingerSynth is a wearable musical instrument made up of a bracelet and set of rings that enables its players to produce sound by touching nearly any surface in their environments. Each ring contains a small, independently controlled audio exciter transducer. The rings sound loudly when they touch a hard object, and are silent otherwise. When a wearer touches their own (or someone else's) head, the contacted person hears sound through bone conduction, inaudible to others. A microcontroller generates a separate audio signal for each ring, and can take user input through an accelerometer in the form of taps, flicks, and other gestures. The player controls the envelope and timbre of the sound by varying the physical pressure and the angle of their finger on the surface, or by touching differently resonant surfaces. The FingerSynth encourages players to experiment with the materials around them and with one another.
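The abstract does not specify how taps and flicks are detected; the sketch below shows one plausible approach, thresholding deviations of the accelerometer magnitude from 1 g, with illustrative parameter values rather than the firmware's actual ones.

```python
import numpy as np

def detect_taps(accel, threshold=1.5, refractory=20):
    """Return sample indices of candidate taps: moments where the acceleration
    magnitude deviates sharply from 1 g. Thresholds are illustrative.

    accel: (N, 3) array of accelerometer readings in units of g.
    """
    mag = np.linalg.norm(accel, axis=1)
    candidates = np.flatnonzero(np.abs(mag - 1.0) > threshold)
    taps, last = [], -refractory
    for i in candidates:
        if i - last >= refractory:   # ignore ringing right after a detected tap
            taps.append(i)
            last = i
    return taps
```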
Hacking the Sketchbook
Joseph A. Paradiso and Jie Qi
In this project we investigate how the process of building a circuit can be made more organic, like sketching in a sketchbook. We integrate a rechargeable power supply into the spine of a traditional sketchbook, so that each page of the sketchbook has power connections. This enables users to begin creating functioning circuits directly onto the pages of the book and to annotate as they would in a regular notebook. The sequential nature of the sketchbook allows creators to document their process for circuit design. The book also serves as a single physical archive of various hardware designs. Finally, the portable and rechargeable nature of the book allows users to take their electronic prototypes off of the lab bench and share their creations with people outside of the lab environment.
Halo: Wearable Lighting
Joseph A. Paradiso and Nan Zhao
Imagine a future where lights are not fixed to the ceiling, but follow us wherever we are. In this colorful world we enjoy lighting that is designed to go along with the moment, the activity, our feelings, and our outfits. Halo is a wearable lighting device created to explore this scenario. Different from architectural lighting, this personal lighting device aims to illuminate and present its user. Halo changes the wearer's appearance with the ease of a button click, similar to adding a filter to a photograph. It can also change the user's view of the world, brightening up a rainy day or coloring a gray landscape. Halo can react to activities and adapt based on context. It is a responsive window between the wearer and his or her surroundings.
HearThere: Ubiquitous Sonic Overlay
Joseph A. Paradiso, Gershon Dublon and Spencer Russell
With our Ubiquitous Sonic Overlay, we are working to place virtual sounds in the user's environment, fixing them in space even as the user moves. We are working toward creating a seamless auditory display, indistinguishable from the user's actual surroundings. Between bone-conduction headphones, small and cheap orientation sensors, and ubiquitous GPS, a confluence of fundamental technologies is in place. However, existing head-tracking systems either limit the motion space to a small area (e.g., Oculus Rift), or sacrifice precision for scale using technologies like GPS. We are seeking to bridge the gap to create large outdoor spaces of sonic objects.
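To illustrate the underlying geometry (not HearThere's actual rendering pipeline), the sketch below turns a listener's GPS fix and compass heading plus a virtual source's coordinates into a relative azimuth and a simple stereo pan; a full system would use HRTFs and precise head tracking.

```python
import math

def source_pan(user_lat, user_lon, head_yaw_deg, src_lat, src_lon):
    """Return a stereo pan value in [-1, 1] for a geolocated virtual sound source,
    given the listener's GPS position and compass heading (yaw).
    Flat-earth approximation; distances are small relative to the Earth."""
    # Approximate east/north offsets in meters.
    dlat = (src_lat - user_lat) * 111_320.0
    dlon = (src_lon - user_lon) * 111_320.0 * math.cos(math.radians(user_lat))
    bearing = math.degrees(math.atan2(dlon, dlat))          # 0 = north, 90 = east
    rel = (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0  # relative azimuth in (-180, 180]
    return math.sin(math.radians(rel))                      # -1 = hard left, +1 = hard right
```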
KickSoul: A Wearable System for Foot Interactions with Digital Devices
Pattie Maes, Joseph A. Paradiso, Xavier Benavides Palos and Chang Long Zhu Jin
KickSoul is a wearable device that maps natural foot movements into inputs for digital devices. It consists of an insole with embedded sensors that track movements and trigger actions in devices that surround us. We present a novel approach to using our feet as input devices in mobile situations when our hands are busy. We analyze the foot's natural movements and their meaning before activating an action.
ListenTree: Audio-Haptic Display in the Natural Environment
V. Michael Bove, Joseph A. Paradiso, Gershon Dublon and Edwina Portocarrero
ListenTree is an audio-haptic display embedded in the natural environment. Visitors to our installation notice a faint sound emerging from a tree. By resting their heads against the tree, they are able to hear sound through bone conduction. To create this effect, an audio exciter transducer is weatherproofed and attached to the tree's roots, transforming it into a living speaker, channeling audio through its branches, and providing vibrotactile feedback. In one deployment, we used ListenTree to display live sound from an outdoor ecological monitoring sensor network, bringing a faraway wetland into the urban landscape. Our intervention is motivated by a need for forms of display that fade into the background, inviting attention rather than requiring it. We consume most digital information through devices that alienate us from our surroundings; ListenTree points to a future where digital information might become enmeshed in material.
Living Observatory: Sensor Networks for Documenting and Experiencing Ecology
Glorianna Davenport, Joe Paradiso, Gershon Dublon, Donald Derek Haddad, Brian Dean Mayton and Spencer Russell
Living Observatory is an initiative for documenting and interpreting ecological change that will allow people, individually and collectively, to better understand relationships between ecological processes, human lifestyle choices, and climate change adaptation. As part of this initiative, we are developing sensor networks that document ecological processes and allow people to experience the data at different spatial and temporal scales. Low-power sensor nodes capture climate and other data at a high spatiotemporal resolution, while others stream audio. Sensors on trees measure transpiration and other cycles, while fiber-optic cables in streams capture high-resolution temperature data. At the same time, we are developing tools that allow people to explore this data, both remotely and onsite. The remote interface allows for immersive 3D exploration of the terrain, while visitors to the site will be able to access data from the network around them directly from wearable devices.
Low-Power Gesture Input with Wrist-Worn Pressure Sensors
Joseph A. Paradiso and Artem Dementyev
We demonstrate an always-available, on-body gestural interface. Using an array of pressure sensors worn around the wrist, it can distinguish subtle finger pinch gestures with high accuracy (>80%). We demonstrate that it is a complete system that works wirelessly in real time. The device is simple and lightweight in terms of power consumption and computational overhead. The prototype's sensor power consumption is 89 µW, allowing it to last more than a week on a small lithium polymer battery. The device is also small and unobtrusive, and can be integrated into a wristwatch or a bracelet. Custom pressure sensors can be printed with off-the-shelf conductive ink-jet technology. We demonstrate that the number of gestures can be greatly extended by adding orientation data from an accelerometer. We also explore various usage scenarios for the device.
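The abstract does not detail the features or classifier used; as a sketch of the recognition step, the example below cross-validates an SVM on synthetic pressure frames that stand in for recorded data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for recorded data: 500 frames from an 8-channel wrist
# pressure array, each labeled with a gesture (0 = rest, 1-4 = pinch gestures).
rng = np.random.default_rng(0)
n, channels = 500, 8
y = rng.integers(0, 5, size=n)
X = rng.normal(scale=0.3, size=(n, channels))
X[np.arange(n), y] += 1.0                           # each gesture loads a different sensor

# Per-frame normalization removes slow drift, e.g. from strap tightness.
X = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + 1e-6)

clf = SVC(kernel="rbf", C=10.0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```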
MedRec
Ariel Ekblaw, Asaf Azaria, Thiago Vieira, Joe Paradiso, Andrew Lippman
We face a well-known need for innovation in electronic medical records (EMRs), but the technology is cumbersome and innovation is impeded by inefficiency and regulation. We demonstrate MedRec as a solution tuned to the needs of patients, the treatment community, and medical researchers. It is a novel, decentralized record management system for EMRs that uses blockchain technology to manage authentication, confidentiality, accountability, and data sharing. A modular design integrates with providers' existing, local data-storage solutions, enabling interoperability and making our system convenient and adaptable. As a key feature of our work, we engage the medical research community with an integral role in the protocol. Medical researchers provide the "mining" necessary to secure and sustain the blockchain authentication log, in return for access to anonymized, medical metadata in the form of "transaction fees."
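As a toy illustration of anchoring record pointers and access permissions in an append-only hash chain, while the medical data itself stays in the provider's existing local storage, consider the sketch below; it is not MedRec's actual protocol, and all names are hypothetical.

```python
import hashlib, json, time

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

class Block:
    """Toy append-only log entry that links a patient-provider permission
    record to the previous block by hash."""
    def __init__(self, prev_hash, record):
        self.prev_hash = prev_hash
        self.timestamp = time.time()
        self.record = record
        self.hash = sha256(json.dumps(
            [prev_hash, self.timestamp, record], sort_keys=True).encode())

# The chain stores only pointers and permissions; the records themselves
# remain in the provider's database.
genesis = Block("0" * 64, {"note": "genesis"})
grant = Block(genesis.hash, {
    "patient": "patient-123", "provider": "clinic-A",
    "pointer": "db://clinic-A/records/123", "perms": ["read"],
})
print(grant.hash[:16], "->", grant.prev_hash[:16])
```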
Mindful Photons: Context-Aware Lighting
Joseph A. Paradiso, Akane Sano and Nan Zhao
Light enables our visual perception. It is the most common medium for displaying digital information. Light regulates our circadian rhythms, affects productivity and social interaction, and makes people feel safe. Yet despite the significance of light in structuring human relationships with their environments on all these levels, we communicate very little with our artificial lighting systems. Occupancy, ambient illuminance, intensity, and color preferences are the only input signals currently provided to these systems. With advanced sensing technology, we can establish better communication with our devices. This effort is often described as context-awareness. Context has typically been divided into properties such as location, identity, affective state, and activity. Using wearable and infrastructure sensors, we are interested in detecting these properties and using them to control lighting. The Mindful Photons Project aims to close the loop and allow our light sources to "see" us.
NailO
Cindy Hsin-Liu Kao, Artem Dementyev, Joe Paradiso, Chris Schmandt
NailO is a nail-mounted gestural input surface inspired by commercial nail stickers. Using capacitive sensing on printed electrodes, the interface can distinguish on-nail finger swipe gestures with high accuracy (>92 percent). NailO works in real time: the system is miniaturized to fit on the fingernail, while wirelessly transmitting the sensor data to a mobile phone or PC. NailO allows for one-handed and always-available input, while being unobtrusive and discreet. The device blends into the user's body, is customizable, fashionable, and even removable.
Prosthetic Sensor Networks: Factoring Attention, Proprioception, and Sensory Coding
Joseph A. Paradiso and Gershon Dublon
Sensor networks permeate our built and natural environments, but our means for interfacing to the resultant data streams have not evolved much beyond HCI and information visualization. Researchers have long experimented with wearable sensors and actuators on the body as assistive devices. A user's neuroplasticity can, under certain conditions, transcend sensory substitution to enable perceptual-level cognition of "extrasensory" stimuli delivered through existing sensory channels. But there remains a huge gap between data and human sensory experience. We are exploring the space between sensor networks and human augmentation, in which distributed sensors become sensory prostheses. Conventional user interfaces, in contrast, remain largely unincorporated by the body; our relationship to them is never fully pre-attentive. Attention and proprioception are key, not only to moderate and direct stimuli, but also to enable users to move through the world naturally, attending to the sensory modalities relevant to their specific contexts.
Quantizer: Sonification Platform for High-Energy Physics Data
Joseph A. Paradiso and Juliana Cherston
Inspired by previous work in the field of sonification, we are building a data-driven composition platform that will enable users to map collision event information from experiments in high-energy physics to audio properties. In its initial stages, the tool will be used for outreach purposes, allowing physicists and composers to interact with collision data through novel interfaces. Our longer-term goal is to develop strategic mappings that facilitate the auditory perception of hidden regularities in high-dimensional datasets and thus evolve into a useful analysis tool for physicists as well, possibly for the purpose of monitoring slow control data in experiment control rooms. The project includes a website with real-time audio streams and basic event data, which is not yet public.
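An illustrative event-to-audio mapping in this spirit, with made-up particle fields and scaling constants rather than the project's actual mapping, might look like:

```python
import math

def event_to_notes(particles, base_midi=36, max_energy_gev=500.0):
    """Map one collision event (a list of particle dicts) to simple note events.
    Illustrative only: energy -> pitch, transverse momentum -> loudness,
    azimuthal angle phi -> stereo position."""
    notes = []
    for p in particles:
        pitch = base_midi + 60 * min(p["energy"], max_energy_gev) / max_energy_gev
        velocity = min(1.0, p["pt"] / 100.0)   # pt in GeV -> loudness 0..1
        pan = p["phi"] / math.pi               # phi in [-pi, pi] -> pan -1..1
        notes.append({"midi": round(pitch), "vel": velocity, "pan": pan})
    return notes

print(event_to_notes([{"energy": 120.0, "pt": 45.0, "phi": 0.8}]))
```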
SensorChimes: Musical Mapping for Sensor Networks
Joseph A. Paradiso and Evan Lynch
SensorChimes aims to create a new canvas for artists leveraging ubiquitous sensing and data collection. Real-time data from environmental sensor networks are realized as musical composition. Physical processes are manifested as musical ideas, with the dual goal of making meaningful music and rendering an ambient display. The Tidmarsh Living Observatory initiative, which aims to document the transformation of a reclaimed cranberry bog, provides an opportunity to explore data-driven musical composition based on a large-scale environmental sensor network. The data collected from Tidmarsh are piped into a mapping framework, which a composer configures to produce music driven by the data.
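A minimal example of the kind of mapping a composer might configure, here quantizing a temperature stream onto a pentatonic scale; the actual SensorChimes framework and its configuration format are not shown.

```python
# One sensor stream drives chime pitches constrained to a pentatonic scale.
PENTATONIC = [0, 2, 4, 7, 9]

def temperature_to_midi(temp_c, low=-5.0, high=35.0, base_midi=60):
    """Quantize a temperature reading onto a two-octave pentatonic scale."""
    t = max(0.0, min(1.0, (temp_c - low) / (high - low)))   # normalize to 0..1
    step = int(t * (2 * len(PENTATONIC) - 1))               # pick a scale degree
    octave, degree = divmod(step, len(PENTATONIC))
    return base_midi + 12 * octave + PENTATONIC[degree]

for reading in [2.5, 14.0, 27.3]:
    print(reading, "->", temperature_to_midi(reading))
```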
SensorTape: Modular and Programmable 3D-Aware Dense Sensor Network on a Tape
Joseph A. Paradiso, Artem Dementyev and Cindy Hsin-Liu Kao
SensorTape is a modular and dense sensor network in the form factor of a tape. SensorTape is composed of interconnected and programmable sensor nodes on a flexible electronics substrate. Each node can sense its orientation with an inertial measurement unit, allowing deformation self-sensing of the whole tape. Nodes also sense proximity using time-of-flight infrared. We developed a network architecture that automatically determines the location of each sensor node as SensorTape is cut and rejoined. We also made an intuitive graphical interface to program the tape. Our user study suggested that SensorTape enables users with different skill sets to intuitively create and program large sensor network arrays. We developed diverse applications, ranging from wearables to home sensing, to show the low deployment effort required of the user. We showed how SensorTape could be produced at scale and made a 2.3-meter-long prototype.
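As a simplified illustration of deformation self-sensing, the sketch below chains fixed-length segments using each node's heading angle to approximate the tape's shape in 2D; the node spacing is assumed for the example, and a real reconstruction would use the full 3D orientation from each IMU.

```python
import math

SEGMENT_LENGTH_CM = 2.5   # assumed spacing between adjacent nodes

def tape_shape_2d(node_headings_deg):
    """Reconstruct an approximate 2D tape shape from each node's heading angle
    (e.g. yaw from its IMU) by chaining fixed-length segments."""
    x, y, points = 0.0, 0.0, [(0.0, 0.0)]
    for heading in node_headings_deg:
        x += SEGMENT_LENGTH_CM * math.cos(math.radians(heading))
        y += SEGMENT_LENGTH_CM * math.sin(math.radians(heading))
        points.append((x, y))
    return points

print(tape_shape_2d([0, 15, 30, 45, 60]))   # tape curling into an arc
```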
Skrin
Pattie Maes, Joseph A. Paradiso, Xin Liu and Katia Vega
Skrin is an exploratory project on digitizing the body's skin surface using embedded electronics and prosthetics. Human skin is a means of protection, a mediator of our senses, and a presentation of ourselves. Through several projects, we expand the expressive capacity of the body's surface and emphasize the dynamic aesthetics of body texture through technological means.
Tid'Zam
Joseph A. Paradiso, Clement Duhart and Donald Derek Haddad
The Tidmarsh project is interested in the documentation of ecological processes to understand their spatial and temporal evolution. Its cross-reality component provides user experiences of digital reconstructions of outdoor environments built from data collected by real-time sensor networks. Tid'Zam analyzes multi-source audio streams in real time to identify events happening at Tidmarsh, such as bird calls, frogs, or car noise. Its deep-learning stack offers a web interface for creating and improving the different classifier units. In addition, its interactive HCI is designed to provide a training feedback mechanism between users/experts and the neural networks, improving knowledge for both the system and the users.
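As a sketch of the front end of such an audio classification pipeline (not Tid'Zam's actual stack), the example below slices an audio file, standing in for a live stream, into log-mel spectrogram patches that a trained network would then label; the file name and window sizes are placeholders.

```python
import numpy as np
import librosa

def frames_from_stream(path, sr=16000, win_s=2.0, hop_s=1.0):
    """Cut an audio file (standing in for a live stream) into overlapping windows
    and return log-mel spectrogram patches, one per window."""
    audio, _ = librosa.load(path, sr=sr, mono=True)
    win, hop = int(win_s * sr), int(hop_s * sr)
    patches = []
    for start in range(0, max(1, len(audio) - win), hop):
        chunk = audio[start:start + win]
        mel = librosa.feature.melspectrogram(y=chunk, sr=sr, n_mels=64)
        patches.append(librosa.power_to_db(mel))
    return np.stack(patches)

# A trained network (not shown) would label each patch, e.g. "bird call",
# "frog", "car", or "background"; here we just report the patch shapes.
patches = frames_from_stream("marsh_stream.wav")   # placeholder file name
print(patches.shape)   # (num_windows, 64 mel bands, time frames)
```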