Storytelling with Clouds
This project engages with the familiar activity of childhood bedtime stories. We propose creating an immersive storytelling device that encourages imagination through an ambient atmosphere. The project draws on experiences such as finding images in clouds and telling stories around the constellations in the sky.
How it works:
We propose a multimodal interface that lets you bring a narrative to life using amorphous, cloud-like images. As you start telling your story, you use a tablet as an input device to draw an image that supports your narrative: animals, buildings, hearts, and so on. When you finish drawing, your creation takes form on the projected screen in front of you. You can then blow on the device's microphone; your breath simulates the wind that carries clouds. The harder you blow, the faster and farther your cloud creation moves away from the center of the projection. Over the course of your narrative, your cloud images begin to break down and dissipate, simulating the way real clouds change shape as they pass across the sky. We would also like to incorporate music, but we have not yet fully worked out how to integrate it effectively.
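To make the interaction concrete, here is a minimal sketch of how blow strength might map onto cloud motion and gradual dissipation. The Cloud struct, the drag and fade constants, and the update loop are illustrative assumptions rather than part of our current design.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical cloud state: position/velocity relative to the projection
// center, plus an opacity that fades to simulate dissipation.
struct Cloud {
    float x = 0.0f, y = 0.0f;   // position (projection-centered coordinates)
    float vx = 0.0f, vy = 0.0f; // velocity
    float opacity = 1.0f;       // 1.0 = fully visible, 0.0 = dissipated
};

// Assumed constants: how quickly a cloud slows down and fades per second.
const float DRAG_PER_SECOND = 0.6f;
const float FADE_PER_SECOND = 0.05f;

// A blow event pushes the cloud outward; blowStrength in [0, 1] comes from
// the microphone (harder blow -> larger strength), and dirX/dirY is a unit
// vector pointing away from the projection center.
void applyBlow(Cloud& c, float blowStrength, float dirX, float dirY) {
    const float MAX_PUSH = 200.0f; // assumed pixels-per-second at full blow
    c.vx += blowStrength * MAX_PUSH * dirX;
    c.vy += blowStrength * MAX_PUSH * dirY;
}

// Called once per frame: move the cloud, apply drag, and fade it over time.
void updateCloud(Cloud& c, float dtSeconds) {
    c.x += c.vx * dtSeconds;
    c.y += c.vy * dtSeconds;
    float drag = std::exp(-DRAG_PER_SECOND * dtSeconds);
    c.vx *= drag;
    c.vy *= drag;
    c.opacity = std::fmax(0.0f, c.opacity - FADE_PER_SECOND * dtSeconds);
}

int main() {
    Cloud c;
    applyBlow(c, 0.8f, 1.0f, 0.0f);           // a fairly hard blow to the right
    for (int frame = 0; frame < 120; ++frame) // simulate ~2 seconds at 60 fps
        updateCloud(c, 1.0f / 60.0f);
    std::printf("x=%.1f opacity=%.2f\n", c.x, c.opacity);
    return 0;
}
```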
Implementation Plan:
At the core of the system lies an Android tablet enclosed inside a cube-shaped box. The tablet's touch screen is exposed, and the user can draw clouds on it. One side of the cube holds a microphone connected to the tablet's audio jack. The other sides of the box carry pressure sensors that are connected either directly to the tablet or via an Arduino board. The tablet's HDMI output is connected to a mini-projector that projects the screen onto the ceiling.
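If the pressure sensors are routed through an Arduino, the firmware can be as simple as periodically sampling the analog pins and streaming the readings to the tablet over the serial/USB link. The pin assignments, noise threshold, and comma-separated message format below are assumptions for illustration only, not a fixed design.

```cpp
// Illustrative Arduino firmware: read pressure sensors on the box sides and
// forward the readings to the tablet over serial.
const int SENSOR_PINS[] = {A0, A1, A2, A3};   // one analog pin per box side (assumed)
const int NUM_SENSORS = 4;
const int NOISE_THRESHOLD = 20;               // ignore tiny readings (0-1023 scale)

void setup() {
  Serial.begin(9600);                         // serial link to the tablet
}

void loop() {
  for (int i = 0; i < NUM_SENSORS; i++) {
    int value = analogRead(SENSOR_PINS[i]);   // 0-1023 raw pressure reading
    if (value < NOISE_THRESHOLD) value = 0;   // crude noise filter
    Serial.print(value);
    Serial.print(i < NUM_SENSORS - 1 ? ',' : '\n');
  }
  delay(50);                                  // ~20 samples per second
}
```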
The system software consists of an Android application and Arduino firmware that filters the touch input from the pressure sensors. The Android application comprises a drawing component and an audio signal processor that determines the force of the blowing. It also integrates with a fluid-dynamics engine to determine the trajectory and shape of a cloud when it is blown or touched.
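One way the audio signal processor could estimate blow force is to compute the RMS amplitude of short microphone buffers and normalize it into a 0-1 "blow strength". The buffer size, the reference level for a "full blow", and the use of RMS as a proxy for breath force are working assumptions; on Android the raw PCM samples would come from the platform's audio-capture API.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Estimate how hard the user is blowing from one buffer of 16-bit PCM samples.
// Returns a value in [0, 1]; the level that counts as a "full blow" is an
// assumed calibration constant that would need tuning on the real microphone.
float blowStrength(const std::vector<int16_t>& samples) {
    if (samples.empty()) return 0.0f;
    double sumSquares = 0.0;
    for (int16_t s : samples)
        sumSquares += static_cast<double>(s) * s;
    double rms = std::sqrt(sumSquares / samples.size());
    const double FULL_BLOW_RMS = 8000.0;        // assumed calibration constant
    double strength = rms / FULL_BLOW_RMS;
    return static_cast<float>(strength > 1.0 ? 1.0 : strength);
}

int main() {
    // Synthetic stand-in for a microphone buffer: a constant-amplitude signal.
    std::vector<int16_t> buffer(1024, 4000);
    std::printf("blow strength = %.2f\n", blowStrength(buffer));
    return 0;
}
```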
Group:
Ariel Haney
Iris Cheung
Prayag Narula