Academic Portfolio

The Virtual Environment for medical teamwork training, designed during my PhD thesis

During my academic career, I have ventured into quite a number of scientific areas. I started with a German “Diplom” in Microinformatics at the University of Applied Sciences Gelsenkirchen, Germany, transitioning seamlessly into my first industry job, where I developed electronic components and software for mobile weighing systems used in the waste management industry.

Three years later, I went back to academia, studying towards a Master of Science degree in Computer Science with a specialisation in Human-Computer Interaction. For my thesis, I created an immersive virtual environment for physical simulations, using magnetic tracking technology, VR helmets, and a 3D projection screen.

In 2007, I moved to New Zealand and continued my academic career at The University of Auckland, where I used game technology to create a collaborative virtual environment for medical teamwork training with support for nonverbal communication.

After submitting (and later successfully defending) my thesis, I started working at the Auckland University of Technology as a lecturer, where I can pass on to students and colleagues the knowledge that was invested in me and apply my scientific curiosity to the research I am involved in.

In 2016, I advanced to senior lecturer and founded Sentience Lab, a research space that specialises in the combination of 3D data visualisation and storytelling, using modern 3D technologies such as motion capture and head-mounted displays (e.g., Oculus Rift, HTC Vive).

Research

3D realtime visualisation of global earthquake data
3D realtime visualisation of an artificial 3D neural network
Testing the Virtual Environment created during my Master's Thesis in 2005

My main research interests are:

  • Immersive Technologies and Applications (AR/VR/MR/XR)
  • 3D Data Visualisation
  • 3D Graphics
  • Human-Computer Interaction
  • Computer Science Education

Projects

Sentience Lab - Immersive VR Visualisation

I am the director of Sentience Lab, a research space that specialises in the combination of 3D data visualisation and storytelling, using modern 3D technologies such as motion capture and head-mounted displays (e.g., HTC Vive, Microsoft HoloLens).

We have developed a flexible hardware and software framework for creating 3D virtual worlds that users can explore simply by walking through them and interacting with elements via input controllers such as motion-tracked joysticks or gestures. Examples of interactive visualisations include:

  • NeuCube, an artificial 3D neural network and its connections, developed by KEDRI, the Knowledge Engineering and Discovery Research Institute of AUT
  • New Zealand and global earthquake data from 1900 to the present
  • Anatomical models, e.g., human nasal cavity
  • NASA atmospheric datasets, e.g., CO2, water vapour, precipitation
  • Robotics, e.g., workspace and movement planning
  • 3D Sketching, e.g., rapid prototyping, modelling
  • Entertainment, e.g., pre-visualisation of motion capture scenes, gaming

Navigating a Virtual Environment using the Oculus Rift and Motion Capture technology
Handheld camera for navigating through a Virtual Environment
Realtime 3D visualisation of Stonehenge with celestial simulation
Realtime 3D visualisation of a point cloud of Saint Sulpice, Paris
Realtime interactive 3D visualisation of an artificial 3D neural network with 1.5 Million neurons

Jetblack

This project was an investigation into the design of a cockpit for a New Zealand land speed record vehicle. What is the best design for the controls, instruments, and flow of information in a vehicle that travels at supersonic speeds? How can we organise and present vital vehicle information to the pilot in a non-intrusive but physiologically effective manner?

For the project, a life-sized cockpit was built, running a physical simulation of the vehicle that could be controlled with a force-feedback yoke. Control centre software allowed the monitoring and manipulation of the simulated vehicle during each run, e.g., triggering fires or component failures to test the pilot's reaction time given a specific layout of the warning signals and indicators.
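The fault-injection and reaction-time testing described above can be illustrated with a minimal sketch. The class and method names here are hypothetical; the actual control centre communicated with the Unity-based simulation rather than recording events in memory as this toy version does:

```python
import time


class FaultInjector:
    """Toy sketch of control-centre fault injection with reaction timing.

    Illustrative only: the real Jetblack control centre drove the Unity
    simulation; here, events are simply recorded in memory.
    """

    def __init__(self):
        self.active_faults = {}   # fault name -> injection timestamp
        self.reaction_times = {}  # fault name -> seconds until acknowledged

    def trigger(self, fault):
        # Inject a fault into the simulated vehicle, e.g. "engine_fire"
        # or "brake_failure", and remember when it was triggered.
        self.active_faults[fault] = time.monotonic()

    def acknowledge(self, fault):
        # Called when the pilot responds to the corresponding warning
        # indicator; records the elapsed reaction time.
        start = self.active_faults.pop(fault)
        self.reaction_times[fault] = time.monotonic() - start
```

Logging reaction times per fault and per instrument layout in this way allows different warning-signal arrangements to be compared quantitatively.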

Pilot's view of the Jetblack simulator
The simulator in the Unity engine

Virtual Medical Team Training

This is my PhD project, developed between 2007 and 2011 at The University of Auckland.

It is a Serious Game that uses the Source Engine as a framework for training medical teams to cooperate during a surgical task. The specific focus was on incorporating nonverbal cues such as head gestures into the simulation to investigate their effect on realism and efficiency. In a user study with 30 participants, we found that the gestures did not significantly influence teamwork performance, but they increased the perceived immersion and realism of the simulation.

Start screen of the training program
Screenshot of the simulation with three avatars
The program used to capture head gestures
Single user study testing the reliability of head gestures
Single user study testing the precision of avatar head gestures
The operating field in the simulator
The anaesthesia monitor in the simulator
Photo of the single user study setup
Photo of the multi user study showing three participants using the simulator

Learning to Walk

This is my Master's Thesis project, developed between 2004 and 2005 at the University of Applied Sciences Gelsenkirchen (now the University of Applied Sciences Westphalia), Germany.

It uses a simulated virtual reality environment to evolve walking patterns for virtual 3D creatures using evolution strategies and neural networks. The original idea was to investigate whether the combination of these technologies would produce locomotion patterns similar or superior to those created by animators. The results were unsatisfactory: it proved hard to define “good” locomotion in the mathematical terms that the evolutionary algorithms needed for the selection process.
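The selection process can be sketched as a minimal (μ, λ)-style evolution strategy in Python. The `simulate` and `fitness` functions below are toy stand-ins (the actual project evaluated locomotion in a physics simulation), and the weighting of distance against energy is exactly the kind of hand-tuned fitness definition that proved problematic in practice:

```python
import random


def simulate(genome):
    # Stand-in for the physics simulation of a walking creature;
    # a toy function so the example runs self-contained.
    distance = sum(genome)
    energy = sum(g * g for g in genome)
    return distance, energy


def fitness(genome):
    # Hypothetical fitness: reward forward distance, penalise energy use.
    # Choosing such a measure so that it also captures "natural-looking"
    # walking is the hard part.
    distance, energy = simulate(genome)
    return distance - 0.1 * energy


def evolve(pop_size=20, genome_len=8, generations=50, sigma=0.1):
    # Simple (mu, lambda) evolution strategy: mutate, evaluate, select.
    population = [[random.gauss(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Each parent produces four mutated offspring.
        offspring = [[g + random.gauss(0, sigma) for g in parent]
                     for parent in population for _ in range(4)]
        # Keep only the fittest offspring as the next generation.
        offspring.sort(key=fitness, reverse=True)
        population = offspring[:pop_size]
    return max(population, key=fitness)
```

In the real system, the genome encoded neural network weights driving the creature's joints, so every fitness evaluation required a full simulated walking attempt.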

3D Helmet for immersive exploration of the Virtual Environment
The 3D stereo vision projection screen
A control panel for monitoring the neural network
The Linux cluster control panel for running the genetic algorithms
A biped that has learned to walk forwards
A quadruped moving in a straight line

Publications