art
artificial intelligence
health
human-machine interaction
learning + teaching
design
robotics
technology
architecture
consumer electronics
human-computer interaction
music
kids
wearable computing
bioengineering
data
politics
sensors
machine learning
networks
environment
social science
entertainment
cognition
space
economy
wellbeing
history
computer science
storytelling
interfaces
creativity
covid19
ethics
engineering
developing countries
prosthetics
civic technology
alumni
community
biology
privacy
social media
social robotics
computer vision
communications
augmented reality
public health
urban planning
neurobiology
imaging
virtual reality
industry
synthetic biology
biotechnology
food
affective computing
social networks
climate change
energy
biomechanics
transportation
government
data visualization
social change
fabrication
behavioral science
ocean
medicine
data science
materials
cognitive science
zero gravity
startup
agriculture
women
blockchain
prosthetic design
diversity
genetics
manufacturing
racial justice
sustainability
gaming
neural interfacing and control
3d printing
banking and finance
electrical engineering
ecology
cryptocurrency
fashion
human augmentation
civic action
bionics
construction
microfabrication
security
performance
healthcare
sleep
open source
systems
language learning
natural language processing
marginalized communities
interactive
microbiology
visualization
social justice
internet of things
autonomous vehicles
perception
mental health
collective intelligence
mechanical engineering
clinical science
water
code
nanoscience
cities
mapping
physiology
physics
nonverbal behavior
chemistry
textiles
voice
rfid
hacking
long-term interaction
trust
biomedical imaging
sports and fitness
algorithms
orthotic design
gender studies
networking
pharmaceuticals
culture
mechatronics
soft-tissue biomechanics
law
open access
autism research
assistive technology
member company
business
real estate
internet
science
digital currency
women's health
exhibit
wireless
news
cells
decision-making
asl
How to build intelligent music systems out of interacting audio-processing agents.
Often, we neglect to see the city as living, complex, and dynamic. However, shrouded by its masses of concrete and steel lie unique ecosyst…
The Intertidal Experimentation Workshop will take place September 29 and 30 (9am to 2pm) at the MIT Media Lab, open to students ages 8-14. …
The City Symphony project by the Opera of the Future group brings creative musical participation to everyone while encouraging collaboratio…
A conversation between Ed Boyden and Tyler Cowen on optogenetics and expansion microscopy to storytelling and the nature of consciousness.
Mike Bove, head of the Object-Based Media group, on the current state of the technology and what his research team is working on.
Avery Normandin and Devora Najjar are on a mission to build literacy and appreciation for urban ecology.
EEEeb Spring 2019: Urban Oceans. March 24, April 7 and 21, May 19, June 2. To register, please visit this link.
“Happiness makes the world go round.”
The news is probably one of the first things people check in the morning, but how much does what you know and understand about the world de…
Enhancing mobile life through improved user interactions
Ariel Ekblaw speaks at Sónar+D
The Media Lab panel at San Diego Comic-Con covered inclusion, exploration, and the reciprocity between fiction and research at the Lab
The Storytelling project uses machine-based analytics to identify the qualities of engaging and marketable media. By developing models with…
Chowdhury, S. K. (2018). Pintail: A Travel Companion for Guided Storytelling.
Dan Novy, MIT Media Lab engineer, is a strong advocate of bridging this gap, especially in media innovation.
Real-time detection of social cues in children’s voices. In everyday conversation, people use what are known as backchannels to signal to some…
The Shelley Project studies how to produce horror stories as a result of collaboration between humans and artificial intelligence.
An AI algorithm can predict which parts of a film will generate the greatest emotional responses in audiences.
Hae Won Park, Mirko Gelsomini, Jin Joo Lee, and Cynthia Breazeal. 2017. Telling Stories to Robots: The Effect of Backchanneling on a Child's Storytelling. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI '17).
Research scientist Manuel Cebrian discusses Shelley with CNN en Español host Andrés Oppenheimer.
New research can predict how plots, images, and music affect your emotions while watching a movie.
Computers don’t cry during sad stories, but they can tell when we will.
Artificial intelligence (AI) may not be ready to write the next blockbuster movie, but a team of AI researchers from the Massachusetts…
This brief excerpt video shows a glimpse of some of Tod Machover’s innovative, unusual opera realized at—and with the collaboration of—the …
Jin Joo Lee. A Bayesian Theory of Mind Approach to Nonverbal Communication for Human-Robot Interactions. PhD Thesis, Massachusetts Institute of Technology, 2017.
HW Park, M Gelsomini, JJ Lee, T Zhu, and C Breazeal (2017). Backchannel Opportunity Prediction for Social Robot Listeners. In Proceedings of the International Conference on Robotics and Automation (ICRA).
Designing systems that become experiences to transcend utility and usability
Music software that lets anyone compose music. The first music software program designed to teach students and adults how to compose music …
Pintail is a travel companion app for guided storytelling. It will start by capturing your travel plan so that it can nudge you with person…
The Hyperinstruments project creates expanded musical instruments and uses technology to give extra power and finesse to virtuosic performe…