Interactive Memories
2017
Human beings have always cherished important moments in various ways. For a long time, we used drawing and painting. With the invention of the camera, we started taking photos. With the birth of electronics, recording video became one of the main ways of cherishing moments. In this project, we explore the next medium for cherishing moments: memories that can be interacted with.
Sonification of Everyday Life: Byo
2017
Our visual perception is not sensitive to what we see every day. In this sonification project, we explore a new way of 'seeing' our body movement. We made Byo, a real-time dance movement sonification program. It generates sound using the Myo wireless motion-capture sensor. The captured body and muscle movements are translated, through machine learning and a sound system, into sounds that dancers have never before heard from their own bodies.
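The C sketch below illustrates the kind of motion-to-sound mapping this pipeline implies. The sensor struct, the choice of features (acceleration magnitude and EMG level), and the linear pitch/amplitude mapping are illustrative assumptions; the actual system uses Myo data and a learned mapping driving a sound engine.

/* Minimal sketch (not the project's actual code) of mapping motion
 * features to sound parameters. All names and mappings are assumed. */
#include <math.h>
#include <stdio.h>

typedef struct {
    float accel[3];   /* accelerometer reading (g) */
    float emg_rms;    /* overall muscle activation level, 0..1 */
} MotionSample;

/* Map overall motion energy to a MIDI-like pitch and muscle
 * activation to loudness. */
static void sonify(const MotionSample *s, float *pitch, float *amp) {
    float energy = sqrtf(s->accel[0] * s->accel[0] +
                         s->accel[1] * s->accel[1] +
                         s->accel[2] * s->accel[2]);
    *pitch = 48.0f + 24.0f * fminf(energy / 2.0f, 1.0f);  /* range 48..72 */
    *amp   = fminf(s->emg_rms, 1.0f);
}

int main(void) {
    MotionSample s = {{0.3f, 1.1f, 0.4f}, 0.6f};
    float pitch, amp;
    sonify(&s, &pitch, &amp);
    printf("pitch %.1f, amplitude %.2f\n", pitch, amp);
    return 0;
}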
Theatre of the Future: Listening Together
2016-2017
People tend to reference other people's feelings and adjust their own accordingly. This project explores the possibility of using a virtual character to help an audience better understand music. The facial expressions of someone who understands the music well are captured with motion capture and projected in the concert hall. During the concert, the audience and the character listen to the music together.
Theatre of the Future: A Dream of a Butterfly_After 4
2016
This project explores the possibility of interactive animation for stage performance. A large butterfly is generated and animated in real time during the performance and interacts with the performer through a wireless motion-capture device. The interactive animation turns the butterfly into another important acting character in the performance.
Future MindCare: CrowdSourced MindCare
2016-2017
This is a collaborative research project with the Dept. of Psychology. We aim to develop new scalable MindCare solutions that can be applied effectively to the many people who suffer from minor depression or anxiety disorders, as an alternative to the traditional one-on-one psychotherapy usually conducted in a counseling office.
Storytelling of the Future: HoloStory
2016
Microsoft's HoloLens, a recently developed head-mounted augmented reality device, opens up many possibilities for augmented reality content. In this study, we focus on storytelling and propose a new immersive, interactive storytelling method using HoloLens. Through this, we seek to transform storytelling from sit-and-watch into walk-and-experience.
Immersive Classroom
2015-2016
This is a collaborative research project with d'Strict and the Jeju Marine Science Museum. We develop new immersive learning content for the museum.
Theatre of the Future: VR Art Performance
2016
This project explores the possibility of VR for the performing arts. In digital space, every audience member can be a VIP sitting in the best seat. We used a 360-degree video rig (6 GoPro cameras) to capture the whole environment of the stage. The audience watches the show with an HMD such as Google Cardboard or Samsung Gear VR and feels as if the whole show is performed just for them; suddenly the audience member becomes the main character of the show rather than a spectator.
Storytelling of the Future: VR Poetry
2013-2015
In this project, we explore the next storytelling medium for poetry. We re-present a text poem in a visual form that reflects the characteristics of the poem. Our vision is to create a pandora of poems, where poems from all over the world are reborn as unique visual entities that anyone can walk around, see, listen to, and touch. This is a collaborative research project with Prof. Wayne de Fremery of Global Korean Studies.
Theatre of the Future: Pan-Push/Pull
2015
As one of the Live Animation Theatre series, this project explores the possibility of wearable motion-capture devices for performance. I used the Myo armband, which wirelessly captures the motion of human arms. The detected motion is used to create virtual wind on a virtual stage filled with bird feathers, a metaphoric visual element for the desire for freedom.
Theatre of the Future: Art & Tech Center Opening Performance
2014
This modern dance performance project explores the possibility of real-time motion capture as a tool for enhancing storytelling. The dancers' movements are captured in real time and used to generate visuals that support the storytelling on stage.
Theatre of the Future: Choreographing Digital Water
2013-2014
This project explores the possibility of using virtual dancers for a live synesthetic experience. Digitally created water is explored as a metaphoric extension of the virtual dancers’ arms and hands. The movements of the non-abstract, physically based digital water are choreographed based on Laban movement analysis and performed/improvised in real time by a human performer using the piano as the interface.
Sonification of Everyday Life: PerSon
2014
Our visual perception is not sensitive to what we see every day. In this project, we explore a new way of 'seeing' our everyday life. We made a real-time video sonification mobile app. It generates music by finding, in a motif database, musical motifs that correspond to the video input and concatenating them.
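The sketch below illustrates the retrieve-and-concatenate idea in C. The tiny motif database, the single brightness feature, and the nearest-neighbor lookup are simplified assumptions standing in for the app's actual feature extraction and motif matching.

/* Minimal sketch (hypothetical names and data): a per-frame feature is
 * matched against a motif database and the nearest motif is appended
 * to the output music. */
#include <math.h>
#include <stdio.h>

typedef struct {
    const char *name;  /* identifier of a short pre-composed motif */
    float brightness;  /* feature value the motif was indexed under */
} Motif;

static const Motif DB[] = {
    {"calm_motif",    0.2f},
    {"walking_motif", 0.5f},
    {"busy_motif",    0.9f},
};

/* Return the motif whose indexed feature is closest to the frame's. */
static const Motif *nearest(float frame_brightness) {
    const Motif *best = &DB[0];
    float best_d = fabsf(frame_brightness - DB[0].brightness);
    for (size_t i = 1; i < sizeof DB / sizeof DB[0]; ++i) {
        float d = fabsf(frame_brightness - DB[i].brightness);
        if (d < best_d) { best_d = d; best = &DB[i]; }
    }
    return best;
}

int main(void) {
    float frames[] = {0.15f, 0.55f, 0.85f};  /* stand-in video features */
    for (size_t i = 0; i < 3; ++i)
        printf("frame %zu -> %s\n", i, nearest(frames[i])->name);
    return 0;
}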
Theatre of the Future: Monet Listens to Debussy
2013
Claude Monet and Claude Debussy pursued impressionism in painting and in music, respectively. This live animation theatre project stages an imaginary encounter between the two pioneers of impressionism. The interactive generative art offers the audience a unique opportunity to see what Monet might have perceived had he listened to Debussy's music while painting his Houses of Parliament series in London.
Crocessing
2013
A library for Processing-style programming in the C language. Processing provides an easy way to program interactive visual experiences, but its performance is slow; C is fast, but harder to program. "Crocessing" aims to fill the gap. It is a C library that allows code written in Processing to run as fast as native C code with minimal changes.
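As a rough illustration, a Crocessing sketch might look like the following. The header name and the assumption that the library mirrors Processing's setup()/draw()/ellipse() functions and mouseX/mouseY variables in C are guesses based on the description above, not the library's documented interface.

#include "crocessing.h"   /* hypothetical header; the real name may differ */

void setup(void) {
    size(640, 480);                   /* open a window, as in Processing */
}

void draw(void) {
    background(30);                   /* clear to dark gray each frame */
    ellipse(mouseX, mouseY, 40, 40);  /* a circle follows the mouse */
}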
Volumetric 3D Display
2012
Widely used projector-based display systems are limited to projecting images onto 2D surfaces. A volumetric display is an alternative that allows more active audience participation, since each viewer can choose a viewing angle independently of the other viewers participating at the same time. We made Medialamp-Echoluminari, a very large outdoor volumetric 3D display built with 10,000 LEDs.
FILM VFX
2008-2012
Digital technology allows us to share visual imagination in a vivid and realistic way and keeps pushing the boundaries of storytelling. While working at Rhythm & Hues Studios, I had the opportunity to push those boundaries further on major Hollywood feature films such as Life of Pi, which won the Academy Award for Best Visual Effects in 2013.