My abstract for my 21 May talk at the Digital Heritage 3D Representation conference at Moesgaard Museum, Aarhus, Denmark
Title: Motion Control For Remote Archaeological Presentations
Research data is usually shared among archaeologists, or presented to the general public, through linear presentations that are timed or stepped through by a presenter. By having a camera sensor track motion and gestures, presenters can offer their audience a more engaging experience, as they no longer have to rely on prepared static media, fixed timing, or a mouse. Low-cost camera tracking also allows participants to have their gestures, movements, and group behaviour fed into the virtual environment, either directly (the presenter is streamed) or indirectly (a character represents the presenter).
Using an 8 metre wide curved display (Figure 1) that can feature several on-screen panes at once, the audience can view the presenter next to a digital environment, with slides, movies, or other presentation media triggered by the presenter's hand or arm pointing at specific objects (Figure 2). An alternative is to have a character inside the digital environment mirror the body gestures of the presenter; where the virtual character points triggers slides or other media that relate to the highlighted 3D objects in the digital scene.
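The pointing mechanism can be thought of as casting a ray from the presenter's shoulder through their hand and testing which scene object lies closest to that ray. The sketch below is illustrative only: the joint inputs, object dictionary, and angular threshold are my assumptions for exposition, not the actual prototype's Zigfu/Unity implementation.

```python
import math

def pointing_ray(shoulder, hand):
    """Return the origin and unit direction of the shoulder-to-hand ray.

    Joint positions are (x, y, z) triples, e.g. from a skeletal tracker.
    """
    direction = [h - s for s, h in zip(shoulder, hand)]
    length = math.sqrt(sum(d * d for d in direction))
    return hand, [d / length for d in direction]

def pointed_object(shoulder, hand, objects, max_angle_deg=10.0):
    """Pick the object whose centre lies closest to the pointing ray.

    `objects` maps names to (x, y, z) centres. Returns the name of the
    best match within the angular tolerance, or None if nothing is hit.
    The 10-degree tolerance is an assumed, tunable value.
    """
    origin, direction = pointing_ray(shoulder, hand)
    best_name, best_angle = None, max_angle_deg
    for name, centre in objects.items():
        to_obj = [c - o for o, c in zip(origin, centre)]
        dist = math.sqrt(sum(t * t for t in to_obj))
        if dist == 0:
            continue
        # Angle between the pointing direction and the vector to the object.
        cos_a = sum(d * t for d, t in zip(direction, to_obj)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name
```

In a presentation loop, the returned object name would simply be mapped to the slide or movie pane to display; the same test works whether the ray comes from the streamed presenter's tracked joints or from the mirroring virtual character.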
Acknowledgement: I would like to thank iVEC summer intern Samuel Warnock for kicking off the prototype development for me, and Zigfu for allowing us access to their SDK.
Figure 1. Screenshot of stereo curved screen at the HIVE, Curtin University.
Figure 2. Screenshot of prototype and pointing mechanism at the HIVE, Curtin University.