
Theatre of the Future | Listening Together | 2017

  • Writer: CreativeComputingGroup
  • Feb 6, 2018
  • 1 min read

Updated: Nov 16, 2019

Humans tend to reference other people's emotions and adjust their own accordingly. This project explores the possibility of using a virtual character to help audiences appreciate music better. The facial expressions of a person who understands the music well are captured with motion capture and retargeted onto a virtual character projected in the concert hall. During the concert, the audience and the character listen to the music together.
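The publications below frame the system in terms of emotional contagion and real-time facial expression retargeting. As an illustration only, the following Python sketch shows one way the retargeting step could work, assuming the capture system streams per-frame blendshape weights that are smoothed and remapped onto the projected character's face rig; all channel names, the smoothing factor, and the class structure are hypothetical and not taken from the project.

# Hypothetical sketch of a real-time facial expression retargeting step:
# captured blendshape weights (e.g. "smile", "browRaise") are clamped,
# smoothed to reduce capture jitter, and remapped to character rig channels.
from dataclasses import dataclass, field

@dataclass
class RetargetConfig:
    # hypothetical mapping from captured expression channels to character rig channels
    channel_map: dict = field(default_factory=lambda: {
        "smile": "char_mouth_smile",
        "browRaise": "char_brow_up",
        "eyeClose": "char_blink",
    })
    smoothing: float = 0.6  # exponential smoothing factor (0 = no smoothing)

class FacialRetargeter:
    def __init__(self, config: RetargetConfig):
        self.config = config
        self._state = {}  # last smoothed value per character channel

    def retarget(self, captured_weights: dict) -> dict:
        """Map one frame of captured blendshape weights onto the character rig."""
        out = {}
        for src, dst in self.config.channel_map.items():
            raw = max(0.0, min(1.0, captured_weights.get(src, 0.0)))  # clamp to [0, 1]
            prev = self._state.get(dst, raw)
            smoothed = self.config.smoothing * prev + (1.0 - self.config.smoothing) * raw
            self._state[dst] = smoothed
            out[dst] = smoothed
        return out

# Example: one frame of captured weights driving the projected character
retargeter = FacialRetargeter(RetargetConfig())
frame = {"smile": 0.8, "browRaise": 0.3, "eyeClose": 0.0}
print(retargeter.retarget(frame))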




Publications

1. Saebom Kwon, Hyang Sook Kim, and Jusub Kim. "Guided Music Listening: Can a Virtual Character Help Us Appreciate Music Better?" HCI Korea 2017, Feb. 2017.

2. Saebom Kwon and Jusub Kim. "Enhancing Music Listening Experience Based on Emotional Contagion and Real-time Facial Expression Retargeting." Journal of Digital Contents Society, 20(6), 1117-1124, 2019.


    Creative Computing Group | Dept. of Art & Technology | College of Media, Arts, and Science
    Sogang University | 35 Baekbeom-ro, Seoul, South Korea
    Prof. Jusub Kim, Sogang University
