
Kinect & HMD collaborative engagement

Corbin is my summer intern, looking at:
1. Kinect-Minecraft v2: a software framework for non-programmers to create their own gestures for Minecraft interaction (a minimal sketch of the kind of gesture rule involved follows this list): https://www.youtube.com/watch?v=09tc3nLgx9w

See also: https://maker.library.curtin.edu.au/2016/08/02/creating-a-gui-for-kinect-v-2/

2. Kinect-Unity pointer software: Kinect-Unity-3Dpointer

3. Point clouds with a Head Mounted Display (HMD) in Unreal. Status: exploratory.

Reference: http://digitime.nazg.org/index.php/2016/10/09/exploring-massive-point-clouds-in-virtual-reality-with-nvidia-tech-demo/

See also CAA2017 slides from Damien Vurpillot: https://www.academia.edu/30171751/Exploring_massive_point_clouds_how_to_make_the_most_out_of_available_digital_material

4. Corbin will narrow the above down into one main investigation: evaluating shared virtual experiences across different displays (cylindrical screen versus HMD), and uncovering similar papers with a collaborative learning focus. Ideally there will be a comparison of Unity versus Unreal.
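To illustrate project 1 above: this is my own sketch, not Corbin's framework, but it shows the kind of simple gesture rule such a tool generates, written against the official Kinect v2 SDK (Microsoft.Kinect). A real version would inject a Minecraft keystroke (e.g. via SendInput) rather than print to the console.

```csharp
// Sketch: detect a "right hand raised above head" gesture with the Kinect v2 SDK.
using System;
using Microsoft.Kinect;

class GestureSketch
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        Body[] bodies = new Body[sensor.BodyFrameSource.BodyCount];

        reader.FrameArrived += (s, e) =>
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;
                frame.GetAndRefreshBodyData(bodies);
                foreach (Body body in bodies)
                {
                    if (!body.IsTracked) continue;
                    CameraSpacePoint hand = body.Joints[JointType.HandRight].Position;
                    CameraSpacePoint head = body.Joints[JointType.Head].Position;
                    // The "rule" a non-programmer might author: hand above head.
                    if (hand.Y > head.Y)
                        Console.WriteLine("Gesture: right hand raised -> trigger a Minecraft action");
                }
            }
        };

        sensor.Open();
        Console.ReadLine(); // keep the console app alive while frames arrive
    }
}
```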


#GLAMVR16

Well, #GLAMVR16 was the Twitter hashtag for the event held on Friday 26 August at the HIVE, Curtin University, Perth. In the morning two invited speakers (Assistant Professor Elaine Sullivan and Mr Conal Tuohy) gave talks on Digital Karnak and Linked Open Data. They were followed by me and my colleagues at the School of Media, Culture and Creative Arts, and then by two workshops: one on feeding Trove data dynamically into the Unity game engine (Mr Michael Wiebrands) and one on Augmented Reality with Vuforia and Unity (Mr Dominic Manley).

There were three themes/reasons for the morning talks and afternoon workshops.

1. Digital Heritage: workflows and issues in preserving, exporting and linking digital collections (especially heritage collections in the GLAM sector: galleries, libraries, archives and museums).

2. Scholarly Making: encouraging makerspaces and other making activities in tandem with academic research.

3. Experiential Media: developing AR/VR and other new media technologies and projects, especially for the humanities.

The event was funded by a strategic grant from the School of Media, Culture and Creative Arts, so thanks very much to MCCA!

Schedule and links to slides

Session title (slides linked on SlideShare), with the presenter in brackets:

Introductions (Erik Champion)
Digital Karnak (Elaine Sullivan, UCSC, USA)
Linked Open Data Visualisation (Conal Tuohy, Brisbane)

MORNING TEA

Making collections accessible in an online environment (Lise Summers)
Digital scholarship, makerspaces and the library (Karen Miller)
Digital Heritage Interfaces and Experiential Media (Erik Champion)
Simple Biometric Devices for Audience Engagement (Stuart Bender)
Usability of interactive digital multimedia in the GLAM sector (Beata Dawson)
Emotive Media – Visualisation and Analysis of Human Bio-Feedback Data (Artur Lugmayr)
Visualising information with RAM iSquares (Pauline Joseph)

LUNCH

Digital workflows: Trove data into Unity (Michael Wiebrands)
Introduction to Augmented Reality (Dominic Manley)

Final questions, social networking and SUNDOWNER (Centre for Aboriginal Studies foyer)

Digital Heritage, Scholarly Making & Experiential Media

Our internal small grant (School of Media Culture and Creative Arts, Curtin University) was successful!

Here is a synopsis of the application (redacted):

Digital Heritage, Scholarly Making & Experiential Media

We propose:

  • A one-day event [Friday 26 August 2016, HIVE] with workshops on 3D, digital APIs, Unity and Augmented Reality.
  • We will present our projects at that workshop and a month later meet to review progress and each other’s publications and grants.
  • Then, with the Library and other GLAM partners, we will organize a cultural hackathon in Perth, where programmers and other parties spend a day creating software prototypes based on ideas from the workshop. The best project will win a prize, but the IP will be open source, and contestants may be invited into the research projects or related grant applications.
  • Equipment to build prototypes and showcases for future grants. Part of the money will also go toward Virtual Reality headsets and Augmented Reality equipment that can be loaned from the MCCA store to postgraduates and students.

The above would help progress the following research projects:

  • One need is to develop maker-space and digital literacy skills in information studies and in the Library Makerspace, so as to develop 'scholarly making' as a research area.
  • Another project is to integrate archives and records with real-time visualisation, for example in digital humanities scholarship, software training in the digital humanities, and hands-on workshops and crafting projects at the Curtin University Library.
  • A third project is to explore how Scalar can integrate 3D and Augmented Reality, and to create a framework for cloud-based media assets that relate dynamically to an online scholarly publication. Could the printed form of such a journal, read with augmented reality trackers and head-mounted displays, become a multimedia scholarly journal whose media are downloaded dynamically from the Internet and so can be continually updated? Can this work inform future developments of eSPACE and the interest in 'scholarly making' and makerspaces?
  • Finally, there is potential to create an experiential media research cluster with the new staff of SODA, to explore immersive and interactive media that can capture the emotions and affective responses of participants or players. This requires suitable equipment.

Ideas on how to adapt Kinect camera tracking for 3D presentations in archaeology

I did not mention all of these in my 22 May presentation at the Digital Heritage 3D conference in Aarhus (http://conferences.au.dk/digitalheritage/).

But here are some working notes for future development:

How Xbox Kinect camera tracking could change the simulated avatar:

  1. Avatars in the simulated world change their size, clothing or inventories – they scale relative to the sizes and shapes of the world's typical inhabitants, or the scale depends on the scene or the avatar character chosen (a minimal Unity sketch of this idea follows the second list below).
  2. Avatars change to reflect people picking up things.
  3. Avatars role-play – different avatars see different things in the digital world.
  4. Narrator gestures affect the attention or behavior of the avatar.

How Xbox Kinect camera tracking could change the simulated world or digital objects in that world:

  1. Multiple players are needed to lift and examine objects.
  2. Objects move depending on the biofeedback of the audience or the presenter.
  3. Interfaces for Skype and Google hangout – remote audiences can select part of the screen and filter scenes or wire-frame the main model.
  4. Levels of authenticity and time layers can be controlled or are passively / indirectly affected by narrator motion or audience motion / volume / infrared output.
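Here is a minimal Unity sketch of the first avatar idea above (rescaling the visitor's avatar to a typical inhabitant's height). KinectInput and KinectJoint are hypothetical placeholders for whatever Kinect wrapper you use; only the scaling arithmetic is the point.

```csharp
using UnityEngine;

// Hypothetical stand-ins: wire these to the Kinect asset you actually use.
public enum KinectJoint { Head, FootLeft }

public static class KinectInput
{
    public static Vector3 GetJointPosition(KinectJoint joint)
    {
        // Replace with the real joint-position call from your Kinect wrapper.
        return Vector3.zero;
    }
}

public class AvatarScaler : MonoBehaviour
{
    public Transform avatar;                      // avatar root to rescale
    public float typicalInhabitantHeight = 1.6f;  // assumed average height for the scene's period

    void Update()
    {
        Vector3 head = KinectInput.GetJointPosition(KinectJoint.Head);
        Vector3 foot = KinectInput.GetJointPosition(KinectJoint.FootLeft);
        float userHeight = Mathf.Max(0.1f, head.y - foot.y); // guard against bad tracking

        // Scale the avatar so the visitor "becomes" a typical inhabitant:
        // a tall visitor's avatar shrinks, a short visitor's grows.
        avatar.localScale = Vector3.one * (typicalInhabitantHeight / userHeight);
    }
}
```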

Slides: MSKinect-3Dpowerpoint


Kinect SDK 2 FINGER TRACKING (etc) for Desktops & Large Screens (VR)

We are trying to create some applications/extensions that allow people to interact naturally with 3D built environments on a desktop by pointing at or walking up to objects in the digital environment:

[Screenshot: a 3D built environment on a desktop display]

or a large surround screen (figure below is of the Curtin HIVE):

[Photo: the Curtin HIVE cylindrical screen]

using a Kinect (SDK 1 or 2) for tracking. Ideally we will be able to:

  1. Green screen narrator into a 3D environment (background removal).
  2. Control an avatar in the virtual environment using speaker’s gestures.
  3. Trigger slides and movies inside a Unity environment via the speaker's finger-pointing. Ideally the speaker could also change the chronology of the built scene with gestures (or voice), alter components or aspects of buildings, and move or replace parts of the environment. Possibly also use the (improved) Leap SDK.
  4. Better employ the curved screen so that participants can communicate with each other.

We can have a virtual/tracked hand point at objects, creating an interactive slide presentation to the side of the Unity environment. As objects are pointed at, information appears in a camera window/pane next to the 3D digital environment, or these info windows are triggered on approach. A minimal sketch of this point-to-reveal idea follows.
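In this Unity sketch, KinectInput and KinectJoint are again hypothetical stand-ins for your tracking wrapper, and ArtefactInfo is an assumed component holding each object's description; the ray cast from shoulder through hand is the actual technique.

```csharp
using UnityEngine;

// Hypothetical stand-ins: wire these to the Kinect asset you actually use.
public enum KinectJoint { ShoulderRight, HandRight }

public static class KinectInput
{
    public static Vector3 GetJointPosition(KinectJoint joint)
    {
        // Replace with the real joint-position call from your Kinect wrapper.
        return Vector3.zero;
    }
}

// Assumed component: attach to pointable objects, fill in a description.
public class ArtefactInfo : MonoBehaviour
{
    public string description;
}

public class PointToReveal : MonoBehaviour
{
    public GameObject infoPanel;          // UI pane beside the 3D view
    public UnityEngine.UI.Text infoText;  // text field inside that pane

    void Update()
    {
        Vector3 shoulder = KinectInput.GetJointPosition(KinectJoint.ShoulderRight);
        Vector3 hand = KinectInput.GetJointPosition(KinectJoint.HandRight);

        // Cast a ray from the shoulder through the hand into the scene.
        Ray pointRay = new Ray(hand, (hand - shoulder).normalized);

        RaycastHit hit;
        if (Physics.Raycast(pointRay, out hit, 50f))
        {
            ArtefactInfo info = hit.collider.GetComponent<ArtefactInfo>();
            if (info != null)
            {
                infoPanel.SetActive(true);
                infoText.text = info.description;
                return;
            }
        }
        infoPanel.SetActive(false); // nothing pointed at: hide the pane
    }
}
```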

A commercial solution for Kinect tracking inside Unity environments is http://zigfu.com/, but they only appear to support SDK 1, which is a bit of a problem. To rephrase:

Problem: all solutions seem to target Kinect SDK 1, and SDK 2 only appears to work on Windows 8. We use Windows 7 and Mac OS X (10.10.1).

So if anyone can help, please reply, email or comment on this post.

And for those doing similar things, here are some links I found on creating Kinect-tracked environments:

KINECT SDK 1
Kinect with MS-SDK is a set of Kinect examples, utilizing three major scripts and test models. It demonstrates how to use Kinect-controlled avatars or Kinect-detected gestures in your own Unity projects. This asset uses the Kinect SDK/Runtime provided by Microsoft. URL: http://rfilkov.com/2013/12/16/kinect-with-ms-sdk/
And here is "one more thing": a great Unity package for designers and developers using Playmaker, created by my friend Jonathan O'Duffy from HitLab Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect and a lot of example scenes. The package integrates seamlessly with the 'Kinect with MS-SDK' and 'KinectExtras with MsSDK' packages.

NB: for Kinect v2, the KinectExtras functionality is included in the "Kinect v2 with MS-SDK" package; the KinectExtras package here and "Kinect with MS-SDK" are for Kinect v1 only.

BACKGROUND REMOVAL (leaves just player)
rfilkov.wordpress.com/2013/12/17/kinectextras-with-mssdk/
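The KinectExtras link above is an SDK 1 asset. For reference, the Kinect v2 SDK lets you build basic background removal yourself from the body-index frame, which labels each depth pixel with a player index. This is my own minimal sketch of that mechanism (not Filkov's code), and it omits the colour-to-depth mapping (via sensor.CoordinateMapper) that a finished green-screen compositor needs.

```csharp
// Sketch: keep only player pixels using the Kinect v2 body-index frame.
using System;
using Microsoft.Kinect;

class BackgroundRemovalSketch
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        BodyIndexFrameReader reader = sensor.BodyIndexFrameSource.OpenReader();
        FrameDescription desc = sensor.BodyIndexFrameSource.FrameDescription;
        byte[] bodyIndex = new byte[desc.Width * desc.Height];
        bool[] playerMask = new bool[bodyIndex.Length];

        reader.FrameArrived += (s, e) =>
        {
            using (BodyIndexFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;
                frame.CopyFrameDataToArray(bodyIndex);
                // Each byte is 0..5 when the pixel belongs to a tracked body,
                // and 255 when it is background.
                for (int i = 0; i < bodyIndex.Length; i++)
                    playerMask[i] = bodyIndex[i] != 255;
                // A compositor would now copy only the masked colour pixels
                // (mapped to depth space first) over the new background.
            }
        };

        sensor.Open();
        Console.ReadLine(); // keep the app alive while frames arrive
    }
}
```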

FINGER TRACKING (Not good on current Kinect for various reasons)

  1. http://www.ar-tracking.com/products/interaction-devices/fingertracking/
  2. Not sure if SDK 1 but FingerTracker is a Processing library that does real-time finger-tracking from depth images: http://makematics.com/code/FingerTracker/
  3. Finger tracking for interaction in augmented environments, by K. Dorfmüller-Ulhaas: https://www.ims.tuwien.ac.at/publications/tr-1882-00e.pdf – a finger tracker that allows gestural interaction and is "simple, cheap, fast … based on a marked glove, a stereoscopic tracking system and a kinematic 3-d …" (abstract excerpt).
  4. Video of “Finger tracking with Kinect SDK” see https://www.youtube.com/watch?v=rrUW-Z3fHkk
  5. Finger-tracking code for the Kinect SDK (C#, despite the java2s domain): http://www.java2s.com/Open-Source/CSharp_Free_Code/Xbox/Download_Finger_Tracking_with_Kinect_SDK_for_XBOX.htm
  6. Microsoft can do it: http://www.engadget.com/2014/10/08/kinect-for-windows-finger-tracking/ (might need to contact them for details). Meanwhile, the hand open/closed states that the Kinect v2 SDK does expose are sketched after this list.
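The current Kinect SDKs do not expose full per-finger joints, but Kinect v2 does classify each hand as Open, Closed or "Lasso" (two fingers extended), which is enough for simple point/grab interaction. A minimal sketch with the official Microsoft.Kinect API:

```csharp
// Sketch: report hand open/closed/lasso states from the Kinect v2 SDK.
using System;
using Microsoft.Kinect;

class HandStateSketch
{
    static void Main()
    {
        KinectSensor sensor = KinectSensor.GetDefault();
        BodyFrameReader reader = sensor.BodyFrameSource.OpenReader();
        Body[] bodies = new Body[sensor.BodyFrameSource.BodyCount];

        reader.FrameArrived += (s, e) =>
        {
            using (BodyFrame frame = e.FrameReference.AcquireFrame())
            {
                if (frame == null) return;
                frame.GetAndRefreshBodyData(bodies);
                foreach (Body body in bodies)
                {
                    if (!body.IsTracked) continue;
                    // HandState: Open, Closed, Lasso, NotTracked or Unknown.
                    Console.WriteLine("Right hand: " + body.HandRightState);
                }
            }
        };

        sensor.Open();
        Console.ReadLine(); // keep the app alive while frames arrive
    }
}
```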

HAND TRACKING FOR USE WITH AN OCULUS RIFT
http://nimblevr.com/ – hand tracking for use with the Rift.
Download Nimble VR: http://nimblevr.com/download.html (Windows 8 required, but Mac binaries are available).