Gestural Interface for Conducting Virtual Concerts

Dr. Rasika Ranaweera, Senior Lecturer/Dean, Faculty of Computing, ranaweera.r@nsbm.ac.lk

Abstract:

We have created a mixed reality concert application using Alice, a 3D rapid prototyping programming environment, in which musical instruments are arranged around a virtual conductor (in this case the user) located at their center. The user-conductor uses a smartphone as a simplified baton, pointing at a preferred instrument and tapping a button to start it playing. The volume and panning of the selected instrument can be adjusted by simply tilting and rolling the smartphone. While selected, an instrument jiggles or its components dilate and contract, and a spotlight illuminates it until it is muted, giving both the conductor and the audience visual cues about the state of the ensemble. Unlike other systems, ours does not require the user or equipment to be placed at specific locations (in contrast to Kinect, Wii sensors, or camera-based tracking systems), is not sensitive to room lighting (a problem for digital camera-based tracking and Kinect), and does not suffer interference from other players or obstacles. The goal of repurposing everyday equipment as a conductor's baton is to allow non-expert users to lead a real-time concert within a cyberworld. Synchronization of gestures with music and animation has been one of the biggest challenges in the systems we surveyed, whereas ours exhibited only minimal delays. We compared the user experience with that of a contemporary commercial game and received acceptable ratings from the participants.

Subjects: Gestural Interface for Conducting Virtual Concerts
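
As a rough illustration of the tilt-and-roll mapping described in the abstract (this is not the paper's actual implementation; the function name, angle ranges, and clamping are assumptions), the selected instrument's volume and stereo panning could be derived from smartphone orientation roughly as follows:

    def orientation_to_mix(pitch_deg, roll_deg):
        # Hypothetical mapping: tilting the phone up/down (pitch) controls
        # volume, while rolling it left/right controls stereo panning of the
        # currently selected instrument. Angles are assumed to lie in
        # [-90, 90] degrees and are clamped to that range.
        volume = min(max((pitch_deg + 90.0) / 180.0, 0.0), 1.0)  # 0.0 = mute, 1.0 = full
        pan = min(max(roll_deg / 90.0, -1.0), 1.0)               # -1.0 = hard left, 1.0 = hard right
        return volume, pan

    # Example: phone tilted slightly upward and rolled to the right.
    print(orientation_to_mix(20.0, 45.0))  # roughly (0.61, 0.5)

In the actual system, such values would presumably be smoothed and forwarded from the phone to the concert application; the sketch above is illustrative only.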