During my academic career, I have ventured into quite a number of scientific areas.
I started with a German “Diplom” in Microinformatics at the
University of Applied Sciences Gelsenkirchen, Germany,
transitioning seamlessly into my first industry job
where I developed electronic components and software
for mobile weighing systems used in the waste management industry.
Three years later, I went back to academia
to study for a Master of Science degree in Computer Science
with a specialisation in Human-Computer Interaction.
For my thesis, I created an immersive virtual environment
for physical simulations, using magnetic tracking
technology, VR helmets, and a 3D projection screen.
In 2007, I moved to New Zealand and continued my academic career
at The University of Auckland, where I used game technology
to create a collaborative virtual environment
for medical teamwork training
with support for nonverbal communication.
After submitting (and later successfully defending) my thesis,
I started working at the Auckland University of Technology as a lecturer,
able to pass on to students and colleagues the knowledge invested in me
and to apply my scientific curiosity to the research I am involved in.
In 2016, I advanced to senior lecturer and founded Sentience Lab,
a research space that specialises in combining 3D data visualisation and storytelling,
using modern 3D technologies such as motion capture and head-mounted displays (e.g., Oculus Rift, HTC Vive).
Research
My main research interests are:
Immersive Technologies and Applications (AR/VR/MR/XR)
3D Data Visualisation
3D Graphics
Human-Computer Interaction
Computer Science Education
Projects
Sentience Lab - Immersive VR Visualisation
I am the director of
Sentience Lab,
a research space that specialises in combining 3D data visualisation and storytelling,
using modern 3D technologies such as motion capture and head-mounted displays (e.g., HTC Vive, Microsoft HoloLens).
We have developed a hardware and software framework that can be used flexibly
to create 3D virtual worlds that the user can explore by simply walking through them
and by interacting with elements using input controllers such as motion-tracked joysticks or gestures.
Examples of interactive visualisations include:
NeuCube, an artificial 3D neural network and its connections, developed by KEDRI, the Knowledge Engineering and Discovery Research Institute of AUT
New Zealand and global earthquake data from 1900 to the present
Anatomical models, e.g., human nasal cavity
NASA atmospheric datasets, e.g., CO2, water vapour, precipitation
Robotics, e.g., workspace and movement planning
3D Sketching, e.g., rapid prototyping, modelling
Entertainment, e.g., pre-visualisation of motion capture scenes, gaming
Jetblack
This project was an investigation
into the design of a cockpit for a New Zealand land speed record vehicle.
What is the best design for controls and instruments and the flow of information
in a vehicle that travels at supersonic speeds?
How can we organise and present vital vehicle information to the pilot
in a non-intrusive, but physiologically effective manner?
For the project, a life-sized cockpit was built
that ran a physical simulation of the vehicle,
controlled with a yoke that provided force feedback.
Control centre software allowed the monitoring and manipulation
of the simulated vehicle during each run,
e.g., triggering fires or component failures
to test the pilot's reaction time
given a specific layout of the warning signals and indicators.
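To illustrate the fault-injection and reaction-time measurement described above, here is a minimal Python sketch; the class and method names are hypothetical, as the actual Jetblack control centre software is not described here in reproducible detail.

    import time

    # Hedged sketch of the fault-injection idea: the control centre triggers a
    # failure, the cockpit shows a warning, and the simulation measures how long
    # the pilot takes to acknowledge it. All names here are hypothetical.

    class FaultInjector:
        def __init__(self):
            self.pending = {}  # fault name -> time it was triggered

        def trigger(self, fault):
            """Called from the control centre, e.g., trigger("engine_fire")."""
            self.pending[fault] = time.monotonic()
            print(f"warning indicator ON: {fault}")

        def acknowledge(self, fault):
            """Called when the pilot operates the matching acknowledge control."""
            started = self.pending.pop(fault, None)
            if started is None:
                return None  # no such fault pending
            reaction = time.monotonic() - started
            print(f"{fault} acknowledged after {reaction:.3f} s")
            return reaction

    injector = FaultInjector()
    injector.trigger("engine_fire")
    time.sleep(0.8)  # stand-in for the pilot noticing the warning
    injector.acknowledge("engine_fire")

Logging the trigger and acknowledgement times per run makes it possible to compare reaction times across different layouts of the warning signals and indicators.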
Medical Teamwork Training
This project is a Serious Game that uses the Source Engine as a framework
for training medical teams to cooperate during a surgical task.
The specific focus was on incorporating nonverbal cues like head gestures
into the simulation to investigate the effect on realism and efficiency.
In a user study with 30 participants, we found that the gestures
did not significantly influence team performance,
but increased the perceived immersion and realism of the simulation.
Learning to Walk
This is my Master's thesis project, developed between 2004 and 2005
at the University of Applied Sciences Gelsenkirchen (now the Westphalian University of Applied Sciences), Germany.
It uses a physically simulated virtual environment to evolve walking patterns for virtual 3D creatures,
using evolution strategies and neural networks.
The original idea was to investigate whether
the combination of these technologies would result
in locomotion patterns similar or superior
to those created by animators.
The results were unsatisfactory:
it proved hard to define “good” locomotion in the mathematical terms
that the evolutionary algorithm needed as a fitness measure for the selection process.
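To make the fitness problem concrete, the following minimal Python sketch shows a (mu + lambda) evolution strategy of the kind used for such experiments; the gait encoding, the simulate_gait placeholder, and the weights in fitness are hypothetical illustrations, not the thesis code.

    import random

    # Minimal (mu + lambda) evolution strategy, illustrating why the fitness
    # function is the crux. The gait encoding, simulate_gait(), and the weights
    # in fitness() are hypothetical stand-ins, not the thesis code.

    MU, LAMBDA, GENERATIONS = 5, 20, 50
    GENES = 8      # e.g., parameters of a neural controller for the joints
    SIGMA = 0.1    # mutation step size

    def simulate_gait(genome):
        """Placeholder for the physics simulation of the walking creature."""
        distance = sum(genome) + random.gauss(0, 0.5)
        upright = max(0.0, 1.0 - abs(sum(g * g for g in genome) - 1.0))
        return distance, upright

    def fitness(genome):
        """A numeric stand-in for 'good' locomotion. Rewarding distance alone
        tends to favour falling forward or twitching, so extra hand-tuned
        terms (staying upright, smoothness, ...) are needed."""
        distance, upright = simulate_gait(genome)
        return distance + 2.0 * upright

    population = [[random.uniform(-1, 1) for _ in range(GENES)] for _ in range(MU)]
    for _ in range(GENERATIONS):
        offspring = [[g + random.gauss(0, SIGMA) for g in random.choice(population)]
                     for _ in range(LAMBDA)]
        # (mu + lambda) selection: keep the best of parents and offspring
        population = sorted(population + offspring, key=fitness, reverse=True)[:MU]

    print("best fitness:", round(fitness(population[0]), 3))

The selection step only ever sees the single number returned by fitness, which is exactly where the difficulty lay: no hand-tuned combination of terms fully captured what a human animator would judge to be a good walk.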