Event script interpreter for synchronized “roller-cam” graphical display and rotary motion platform

Dr. Rasika Ranaweera, Senior Lecturer/Dean, Faculty of Computing, ranaweera.r@nsbm.ac.lk

Abstract:

In the field of virtual and mixed reality, vision, audition, olfaction (smell), gustation (taste), and touch are the sensations through which a human becomes immersed in a synthetic environment. Toward the virtual display of some of these modalities, we have developed a script processor, a client that synchronizes rotational imperatives with the sophisticated graphics and sound generated as deterministic output by the simulation game “Roller Coaster Tycoon.” The game itself provides an egocentric-perspective visual display and soundscape. The yaw of our rotary motion platform [3] can be synchronized with these displays to form an integrated first-person multimodal display. For the synchronization, we use a Collaborative Virtual Environment (CVE) platform, a client–server architecture developed by our group. Our newly built client, the CVE script interpreter, parses choreographed motions and connects to the server, which broadcasts them to the clients in the same session, in particular the rotary motion platform. Positions are stored in XML resource files: each track of the game has a corresponding file containing location and orientation details as position nodes, each with a time stamp relative to the previous node. The Document Object Model (DOM) is used for the XML parser since it navigates the tree structure quickly. This architecture gives us the freedom to add new tracks and to reuse the same client for other games.
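As a minimal sketch of the track-resource scheme described above, the fragment below parses a hypothetical position file with a DOM parser and accumulates the node-relative time stamps into an absolute timeline. The element and attribute names (`track`, `position`, `t`, `yaw`, etc.) are illustrative assumptions, not the actual schema of the CVE script interpreter, and the real system drives a motion platform rather than printing a schedule.

```python
import xml.dom.minidom

# Hypothetical track resource; the schema shown here is an
# assumption for illustration, not the project's real XML format.
track_xml = """<track name="demo-loop">
  <position t="0.0" x="0.0" y="0.0" z="0.0" yaw="0"/>
  <position t="0.5" x="1.0" y="0.0" z="0.2" yaw="15"/>
  <position t="0.5" x="2.0" y="0.1" z="0.5" yaw="45"/>
</track>"""

def parse_track(xml_text):
    """Walk the position nodes via the DOM; each 't' attribute is
    relative to the previous node, so accumulate it into an
    absolute clock before scheduling the yaw commands."""
    doc = xml.dom.minidom.parseString(xml_text)
    clock = 0.0
    schedule = []
    for node in doc.getElementsByTagName("position"):
        clock += float(node.getAttribute("t"))
        schedule.append((clock, float(node.getAttribute("yaw"))))
    return schedule

for when, yaw in parse_track(track_xml):
    # In the real client this would be broadcast to the session's
    # clients (e.g. the rotary motion platform) via the CVE server.
    print(f"t={when:.1f}s -> yaw {yaw:.0f} deg")
```

The DOM loads the whole document into a tree, which suits these small per-track files and makes random access to position nodes straightforward, consistent with the abstract's stated reason for choosing DOM over a streaming parser.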