Landscape Data, Art/Artefacts & Models as Linked Open Data, Perth, Australia

For those interested in the above, please keep Friday 27 July 2018 open for a free all-day event in Perth.

We will be inviting speakers to talk on Australia-specific cultural issues and digital (geo) projects related to the above.

More details will follow shortly and will be announced via

So there is an Australian working group for Pelagios – Linked Open Data. We will run an event on 27 July at Curtin. News to follow.

Australia LAMLOD Group: led by Erik Champion (UNESCO Chair of Cultural Visualisation and Heritage, Curtin University) and Susan Fayad (City of Ballarat), this WG seeks to address the problem of linking materials between academic research and cultural heritage in an Australian context. This is not so much about extending Pelagios linked data practice to an entirely new continent, though that is important; the problem this WG seeks to address is the multi-layered and contentious representation of cultural heritage, namely: the vast scale of Australian landscapes and historic journeys; the local and highly specific Aboriginal ways of describing, navigating and experiencing the landscapes with hundreds of different languages; and the specific problem of integrating UNESCO designated built and natural heritage with its surrounding ecosystems. The LAMLOD WG will create landscape data and visualisation displays, investigate related cultural artefact knowledge (Indigenous and colonial), and build towards the integration of linked open data and 3D models.
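To make the linked-data goal concrete, here is a minimal sketch of how one landscape feature might be expressed as a GeoJSON-LD record, loosely in the spirit of the Linked Places conventions used in the Pelagios community. All URIs, coordinates and names below are hypothetical placeholders, not real project data:

```python
import json

# A minimal sketch of publishing one landscape feature as linked open
# data. Every identifier and name here is a hypothetical placeholder.
def make_place_record(place_id, title, lon, lat, language_names):
    """Build a GeoJSON-LD style feature for one landscape feature."""
    return {
        "@id": place_id,
        "type": "Feature",
        "properties": {"title": title},
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        # Names in different (e.g. Aboriginal) languages, each tagged
        # with a language code so they remain distinct toponyms rather
        # than "translations" of one another.
        "names": [
            {"toponym": name, "lang": lang}
            for lang, name in language_names.items()
        ],
    }

record = make_place_record(
    "https://example.org/places/1",          # hypothetical URI
    "Example scarp near Perth",
    115.86, -31.95,
    {"nys": "Example Noongar name", "en": "Example English name"},
)
print(json.dumps(record, indent=2))
```

The point of the multi-valued "names" list is exactly the WG's problem statement: hundreds of local languages describing the same landscape need to coexist in one record without one label overwriting another.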



Converting Unreal Tournament Levels

I hope to convert an Unreal Tournament (UT2004) game level to UT3. The models were originally mine, but they were ported to UT from Adobe Atmosphere and re-textured (read: sculptures/reliefs removed) by students in 2005.


And the tutorials warn that I have to delete almost everything to convert the level, and even then it may well not work. Great!

Perhaps it would be easier to import from 3DS (3D Studio Max) but I no longer have the models! Oh well, that is virtual heritage for you.

If others have virtual heritage models in the UDK editor (Unreal 3) or directly in the latest Unreal 4 engine, please let me know; a student intern here is modifying Unreal to run on the Curtin HIVE cylindrical screen and (semi-)dome.


Kinect & HMD collaborative engagement

Corbin is my summer intern. He is looking at:
1. Kinect-Minecraft v2: a software framework for non-programmers to create their own gestures for Minecraft interaction:

See also:

2. Kinect-Unity pointer software: Kinect-Unity-3Dpointer

3. Point clouds with a Head-Mounted Display (HMD) / Unreal. Status: exploratory.


See also CAA2017 slides from Damien Vurpillot:

4. Corbin will narrow the above down into one main investigation: evaluating shared virtual experiences across different displays (cylindrical versus HMD), and surveying similar papers with a collaborative learning focus. Ideally there will also be a comparison of Unity versus Unreal.
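As a rough illustration of item 1, a gesture framework for non-programmers can be thought of as a table of named rules over skeleton joint positions, matched against each incoming Kinect frame. This Python sketch is not the actual Kinect-Minecraft v2 code; the joint names, coordinate convention (y-axis up) and the example mapping to a Minecraft 'jump' are all assumptions:

```python
# Sketch of a non-programmer gesture framework: each gesture is a named
# rule (a predicate) over skeleton joint positions; rules are matched
# against incoming frames. Joint names and axes are assumptions.
def define_gesture(name, rule):
    return {"name": name, "rule": rule}

def match_gestures(gestures, joints):
    """Return the names of all gestures whose rule the frame satisfies."""
    return [g["name"] for g in gestures if g["rule"](joints)]

# Example: "raise right hand above head" mapped to a Minecraft 'jump'.
gestures = [
    define_gesture(
        "jump",
        lambda j: j["hand_right"][1] > j["head"][1],  # y-axis is up
    ),
]

# One fake skeleton frame: (x, y, z) positions in metres.
frame = {"head": (0.0, 1.6, 2.0), "hand_right": (0.3, 1.9, 2.0)}
print(match_gestures(gestures, frame))  # → ['jump']
```

A non-programmer would only ever fill in the rule table (ideally via a recording UI), never touch the matching loop.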





Well, #GLAMVR16 was the Twitter hashtag for Friday 26 August's event held at the HIVE, Curtin University, Perth. In the morning two invited speakers (Assistant Professor Elaine Sullivan and Mr Conal Tuohy) gave talks on Digital Karnak and Linked Open Data. They were followed by my colleagues at the School of Media, Culture and Creative Arts and myself, then a workshop on dynamically feeding Trove data into the Unity game engine (Mr Michael Wiebrands) and one on Augmented Reality with Vuforia and Unity (Mr Dominic Manley).

There were three themes/reasons for the morning talks and afternoon workshops.

1. Digital Heritage: Workflows & issues in preserving, exporting & linking digital collections (especially heritage collections) for GLAM.

2. Scholarly Making: Encourage makerspaces & other activities in tandem with academic research.

3. Experiential Media: Develop AR/VR & other new media technologies & projects, especially for the humanities.

The event was part of a strategic grant received from the School of Media Culture and Creative Arts, so thanks very much to MCCA!

Schedule and links to slides

Session title (links to Slideshare) | Presenter
Introductions | Erik Champion
Digital Karnak | Elaine Sullivan, UCSC, USA
Linked Open Data Visualisation | Conal Tuohy, Brisbane
Making collections accessible in an online environment | Lise Summers
Digital scholarship, makerspaces and the library | Karen Miller
Digital Heritage Interfaces and Experiential Media | Erik Champion
Simple Biometric Devices for Audience Engagement | Stuart Bender
Usability of interactive digital multimedia in the GLAM sector | Beata Dawson
Emotive Media – Visualisation and Analysis of Human Bio-Feedback Data | Artur Lugmayr
Visualising information with RAM iSquares | Pauline Joseph
Digital workflows (Unity) | Michael Wiebrands
Introduction to Augmented Reality | Dominic Manley
Final questions / social networking / sundowner | Centre for Aboriginal Studies Foyer

Digital Heritage, Scholarly Making & Experiential Media

Our internal small grant (School of Media Culture and Creative Arts, Curtin University) was successful!

Here is a synopsis of the application (redacted):

Digital Heritage, Scholarly Making & Experiential Media

We propose

  • A one-day workshop [Friday 26 August 2016, HIVE] with 3D, Digital APIs, UNITY and Augmented Reality workshops.
  • We will present our projects at that workshop and a month later meet to review progress and each other’s publications and grants.
  • Then we will organize a cultural hackathon in Perth with the Library and other GLAM partners, where programmers and other parties spend a day creating software prototypes based on our ideas from the workshop. The best project will win a prize, but the IP will be open source, and contestants may be invited into the research projects or related grant applications.
  • Equipment to build prototypes and showcases for future grants. Part of the money will also go into Virtual Reality headsets, and Augmented Reality equipment that can be loaned out from the MCCA store to postgraduates and students.

The above would help progress the following research projects:

  • One need is to develop maker-space and digital literacy skills in information studies and the Library Makerspace, in order to build a research area in scholarly making.
  • Another project is to integrate archives and records with real-time visualisation such as in the area of digital humanities scholarship, software training in digital humanities, and hands on workshops and crafting projects at the Curtin University Library.
  • Another project is to explore how SCALAR can integrate 3D and Augmented Reality, and to create a framework for cloud-based media assets that dynamically relate to an online scholarly publication. Could a printed journal, combined with augmented reality trackers and head-mounted displays, become a multimedia scholarly journal whose multimedia is downloaded from the Internet and so continually updated? Can this work inform future developments of eSPACE and interest in 'scholarly making' and makerspaces?
  • There is potential to create an experiential media research cluster with the new staff of SODA, to explore immersive and interactive media that can capture emotions and affects of participants or players. This requires suitable equipment.

Ideas on how to adapt Kinect camera tracking for 3D presentations in archaeology

I did not mention all of these in my 22 May presentation at the Digital Heritage 3D conference in Aarhus.

But here are some working notes for future development:

How Xbox Kinect camera tracking could change the simulated avatar:

  1. Avatars in the simulated world change their size, clothing or inventories – they scale relative to the typical sizes and shapes of the original inhabitants, or the scale depends on the scene or avatar character chosen.
  2. Avatars change to reflect people picking up things.
  3. Avatars role-play – different avatars see different things in the digital world.
  4. Narrator gestures affect the attention or behavior of the avatar.
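Idea 1 above could be prototyped very simply: compute a scale factor from the narrator's tracked height against the typical stature assumed for the scene. The per-scene heights below are illustrative guesses, not archaeological data:

```python
# Sketch of idea 1: scale the on-screen avatar so the narrator's
# Kinect-tracked height maps onto the typical stature assumed for the
# simulated inhabitants. Scene names and heights are illustrative.
TYPICAL_HEIGHT_M = {
    "neolithic_village": 1.60,   # hypothetical per-scene values
    "colonial_township": 1.68,
}

def avatar_scale(tracked_height_m, scene):
    """Scale factor so the avatar matches the scene's typical stature."""
    return TYPICAL_HEIGHT_M[scene] / tracked_height_m

# A 1.80 m presenter appears at roughly 0.89x scale in the first scene.
print(round(avatar_scale(1.80, "neolithic_village"), 2))  # → 0.89
```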

How Xbox Kinect camera tracking could change the simulated world or digital objects in that world:

  1. Multiple players are needed to lift and examine objects.
  2. Objects move depending on the biofeedback of the audience or the presenter.
  3. Interfaces for Skype and Google Hangouts – remote audiences can select part of the screen and filter scenes or wire-frame the main model.
  4. Levels of authenticity and time layers can be controlled or are passively / indirectly affected by narrator motion or audience motion / volume / infrared output.


Kinect SDK 2 FINGER TRACKING (etc) for Desktops & Large Screens (VR)

We are trying to create some applications/extensions that allow people to interact naturally with 3D built environments on a desktop by pointing at or walking up to objects in the digital environment:


or a large surround screen (figure below is of the Curtin HIVE):


using a Kinect (SDK 1 or 2) for tracking. Ideally we will be able to:

  1. Green-screen the narrator into a 3D environment (background removal).
  2. Control an avatar in the virtual environment using the speaker's gestures.
  3. Trigger slides and movies inside a Unity environment via speaker finger-pointing. Ideally the speaker could also change the chronology of the built scene with gestures (or voice), and could alter components or aspects of buildings, or move or replace parts of the environment. Possibly also use the (improved) Leap SDK.
  4. Better employ the curved screen so that participants can communicate with each other.

We can have a virtual/tracked hand point to objects, creating an interactive slide presentation to the side of the Unity environment. As objects are pointed at, information appears in a camera window/pane next to the 3D digital environment, or these info windows are triggered on approach.
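In Unity this would be a ray-cast from the tracked hand, but the selection logic can be shown in a few lines of Python: cast a ray from the elbow through the hand, and report the named object lying closest to that ray so its info pane can be shown beside the 3D view. The object names and positions are made up for the example:

```python
import math

# Sketch of the pointing interaction: given a ray (origin = elbow,
# direction = elbow-to-hand), return the named object whose position
# lies closest to that ray. Objects and coordinates are hypothetical.
def closest_object_along_ray(origin, direction, objects):
    """objects: {name: (x, y, z)}; returns the name nearest the ray."""
    mag = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / mag for c in direction)              # unit direction

    def dist_to_ray(p):
        v = tuple(p[i] - origin[i] for i in range(3))
        t = max(0.0, sum(v[i] * d[i] for i in range(3)))  # no backwards hits
        proj = tuple(origin[i] + t * d[i] for i in range(3))
        return math.dist(p, proj)

    return min(objects, key=lambda name: dist_to_ray(objects[name]))

objects = {"temple": (0.0, 0.0, 5.0), "obelisk": (3.0, 0.0, 5.0)}
# Elbow at the origin, hand pointing straight ahead along +z:
print(closest_object_along_ray((0, 0, 0), (0, 0, 1), objects))  # → temple
```

In the actual Unity version the same job would be done with a physics ray-cast against colliders rather than this brute-force distance check.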

There is a commercial solution for Kinect tracking inside Unity environments, but it appears to work only with SDK 1. That is a bit of a problem; to rephrase:

Problem: all the solutions seem to use Kinect SDK 1, and SDK 2 appears to run only on Windows 8. We use Windows 7 and Mac OS X (10.10.1).

So if anyone can help, please reply by email or comment on this post.

And for those doing similar things, here are some links I found on creating Kinect-tracked environments:

Kinect with MS-SDK is a set of Kinect examples, utilizing three major scripts and test models. It demonstrates how to use Kinect-controlled avatars or Kinect-detected gestures in your own Unity projects. This asset uses the Kinect SDK/Runtime provided by Microsoft. URL:
And here is "one more thing": a great Unity package for designers and developers using Playmaker, created by my friend Jonathan O'Duffy from HitLab Australia and his team of talented students. It contains many ready-to-use Playmaker actions for Kinect and a lot of example scenes. The package integrates seamlessly with the 'Kinect with MS-SDK' and 'KinectExtras with MsSDK' packages.

KinectExtras for Kinect v2 is now part of "Kinect v2 with MS-SDK". The package here and "Kinect with MS-SDK" are for Kinect v1 only.

BACKGROUND REMOVAL (leaves just player)
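The core of depth-based background removal is simple: keep the colour pixels whose depth falls within the band occupied by the player, and blank everything else. A stand-alone sketch with fake one-scanline data and illustrative thresholds (the real packages above do this per frame on the GPU or via the SDK's player index):

```python
# Sketch of depth-based background removal: keep colour pixels whose
# depth (in mm) falls inside the player band, blank the rest.
# Depth values and thresholds here are fake, illustrative data.
def remove_background(depth, colour, near_mm=800, far_mm=2500,
                      blank=(0, 0, 0)):
    """Keep colour pixels whose depth is within [near_mm, far_mm]."""
    return [
        px if near_mm <= d <= far_mm else blank
        for d, px in zip(depth, colour)
    ]

depth = [600, 1200, 3000, 2000]            # one fake scanline, mm
colour = [(9, 9, 9)] * 4                   # uniform fake colour pixels
print(remove_background(depth, colour))
```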

FINGER TRACKING (Not good on current Kinect for various reasons)

  2. Not sure if it uses SDK 1, but FingerTracker is a Processing library that does real-time finger-tracking from depth images:
  3. "Finger tracking for interaction in augmented environments" by K. Dorfmüller-Ulhaas: a finger tracker that allows gestural interaction and is simple, cheap and fast; it is based on a marked glove, a stereoscopic tracking system and a kinematic 3D model.
  4. Video of "Finger tracking with Kinect SDK"; see
  5. Finger tracking using Java
  6. Microsoft can do it: might need to contact them for more information.

Download Nimble VR: Windows 8 is required, but it has Mac binaries.