Academic Portfolio

The Virtual Environment for medical teamwork training, designed during my PhD thesis

During my academic career, I have ventured into quite a number of scientific areas. I started with a German “Diplom” in Microinformatics at the University of Applied Sciences Gelsenkirchen, Germany, transitioning seamlessly into my first industry job where I developed electronic components and software for mobile weighing systems used in the waste management industry.

Three years later, I went back to academia, studying towards a Master of Science degree in Computer Science with a specialisation in Human-Computer Interaction. For my thesis, I created an immersive virtual environment for physical simulations, using magnetic tracking technology, VR helmets, and a 3D projection screen.

In 2007, I moved to New Zealand and continued my academic career at The University of Auckland, where I used game technology to create a collaborative virtual environment for medical teamwork training with support for nonverbal communication.

After submitting (and later successfully defending) my thesis, I started working at the Auckland University of Technology as a lecturer, where I could return to students and colleagues the knowledge that had been invested in me, and apply my scientific curiosity to the research I am involved in.

In 2016, I advanced to senior lecturer and founded Sentience Lab, a research space that specialises in combining 3D data visualisation and storytelling, using modern 3D technologies such as motion capture and head-mounted displays (e.g., Oculus Rift, HTC Vive).

Research

3D realtime visualisation of global earthquake data
3D realtime visualisation of an artificial 3D neural network
Testing the Virtual Environment created during my Master's Thesis in 2005

My main research interests are:

  • Immersive Technologies and Applications (AR/VR/MR/XR)
  • 3D Data Visualisation
  • 3D Graphics
  • Human-Computer Interaction
  • Computer Science Education

Projects

Sentience Lab - Immersive VR Visualisation

I am the director of Sentience Lab, a research space that specialises in combining 3D data visualisation and storytelling, using modern 3D technologies such as motion capture and head-mounted displays (e.g., HTC Vive, Microsoft HoloLens).

We have developed a hardware and software framework for flexibly creating 3D virtual worlds that the user can explore by simply walking through them and interacting with their elements via input controllers such as motion-tracked joysticks, or via gestures. Examples of interactive visualisations include:

  • NeuCube, an artificial 3D neural network and its connections, developed by KEDRI, the Knowledge Engineering and Discovery Research Institute of AUT
  • New Zealand and global earthquake data from 1900 to the present
  • Anatomical models, e.g., human nasal cavity
  • NASA atmospheric datasets, e.g., CO2, water vapour, precipitation
  • Robotics, e.g., workspace and movement planning
  • 3D Sketching, e.g., rapid prototyping, modelling
  • Entertainment, e.g., Pre-visualisation of motion capture scenes, gaming
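The walking-based exploration described above can be sketched as a simple pose mapping: physical movement in the tracked space drives the virtual camera directly, while a controller accumulates a world offset so the user can travel beyond the physical room. This is an illustrative sketch only; the `Pose`, `camera_from_tracking`, and `apply_joystick` names are hypothetical, not the actual Sentience Lab API.

```python
# Illustrative sketch of motion-capture-driven navigation (hypothetical names,
# not the actual Sentience Lab framework).
from dataclasses import dataclass

@dataclass
class Pose:
    x: float   # position in metres
    y: float
    z: float
    yaw: float # heading in degrees

def camera_from_tracking(head: Pose, world_offset: Pose) -> Pose:
    """Physical walking maps 1:1 onto the camera; an accumulated offset
    lets the user travel beyond the boundaries of the tracked space."""
    return Pose(head.x + world_offset.x,
                head.y + world_offset.y,
                head.z + world_offset.z,
                head.yaw + world_offset.yaw)

def apply_joystick(offset: Pose, stick_x: float, stick_y: float,
                   speed: float, dt: float) -> Pose:
    # The joystick translates the world offset; walking itself stays physical.
    return Pose(offset.x + stick_x * speed * dt,
                offset.y,
                offset.z + stick_y * speed * dt,
                offset.yaw)
```

Keeping the head-to-camera mapping 1:1 and routing all artificial locomotion through a separate offset is a common way to preserve the "simply walk through it" feel while still covering large datasets.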

Navigating a Virtual Environment using the Oculus Rift and Motion Capture technology
Handheld camera for navigating through a Virtual Environment
Realtime 3D visualisation of Stonehenge with celestial simulation
Realtime 3D visualisation of a point cloud of Saint Sulpice, Paris
Realtime interactive 3D visualisation of an artificial 3D neural network with 1.5 Million neurons

Jetblack

This project was an investigation into the design of a cockpit for a New Zealand land speed record vehicle. What is the best design for the controls, instruments, and flow of information in a vehicle that travels at supersonic speeds? How can we organise and present vital vehicle information to the pilot in a non-intrusive but physiologically effective manner?

For the project, a life-sized cockpit was built, running a physical simulation of the vehicle that could be controlled with a force-feedback yoke. Control centre software allows the simulated vehicle to be monitored and manipulated during each run, e.g., by triggering fires or component failures to test the pilot's reaction time given a specific layout of the warning signals and indicators.
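The fault-injection idea behind the control centre can be sketched minimally: the operator triggers a failure at a known simulation time, and the reaction time is the delta until the pilot acknowledges the matching warning. This is a hypothetical illustration, not the actual Jetblack codebase.

```python
# Hypothetical sketch of control-centre fault injection and reaction-time
# measurement (illustrative names, not the actual Jetblack software).
class FaultInjector:
    def __init__(self):
        # fault name -> simulation time (seconds) at which it was triggered
        self.active = {}

    def trigger(self, fault: str, sim_time: float) -> None:
        """Operator injects a failure, e.g., 'engine_fire', into the run."""
        self.active.setdefault(fault, sim_time)

    def acknowledge(self, fault: str, sim_time: float) -> float:
        """Pilot responds to the matching warning; returns the reaction
        time in seconds, which is what the study layout comparisons need."""
        start = self.active.pop(fault)
        return sim_time - start
```

Logging one reaction time per injected fault is enough to compare warning layouts statistically across runs.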

Pilot's view of the Jetblack simulator
The simulator in the Unity engine

Virtual Medical Team Training

This is my PhD project, developed between 2007 and 2011 at The University of Auckland.

It is a serious game that uses the Source Engine as a framework for training medical teams to cooperate during a surgical task. The specific focus was on incorporating nonverbal cues such as head gestures into the simulation to investigate their effect on realism and efficiency. In a user study with 30 participants, we found that the gestures did not significantly influence teamwork performance, but increased the perceived immersion and realism of the simulation.
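One simple way a head-tracking front end could turn tracked head rotation into avatar gestures is to classify a short window of angles by which axis shows the larger swing: nods oscillate in pitch, shakes in yaw. This is an assumed illustration of the general idea, not the classification method used in the thesis; the function name and threshold are hypothetical.

```python
# Illustrative head-gesture classification from per-frame head angles
# (degrees). Hypothetical sketch, not the actual thesis implementation.
def classify_gesture(pitch: list[float], yaw: list[float],
                     threshold: float = 10.0) -> str:
    """Classify a short window of head angles by the dominant axis of
    motion; anything below the noise threshold counts as no gesture."""
    pitch_range = max(pitch) - min(pitch)  # nod = vertical oscillation
    yaw_range = max(yaw) - min(yaw)        # shake = horizontal oscillation
    if max(pitch_range, yaw_range) < threshold:
        return "none"
    return "nod" if pitch_range >= yaw_range else "shake"
```

In practice, such a classifier would run over a sliding window of camera-based face-tracking output, with the threshold tuned to the tracker's noise level.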

Start screen of the training program
Screenshot of the simulation with three avatars
The program used to capture head gestures
Single user study testing the reliability of head gestures
Single user study testing the precision of avatar head gestures
The operating field in the simulator
The anaesthesia monitor in the simulator
Photo of the single user study setup
Photo of the multi user study showing three participants using the simulator

Learning to Walk

This is my Master's Thesis project, developed between 2004 and 2005 at the University of Applied Sciences Gelsenkirchen (now the University of Applied Sciences Westphalia), Germany.

It uses a simulated virtual reality environment to evolve walking patterns for virtual 3D creatures using evolutionary strategies and neural networks. The original idea was to investigate whether the combination of these technologies would produce locomotion patterns similar or superior to those created by animators. The results were unsatisfactory: it proved hard to define "good" locomotion in the mathematical terms that the evolutionary algorithm needed for its selection process.
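The overall loop can be sketched as a minimal (μ + λ) evolution strategy over a creature's neural-network weights. The sketch below uses a stand-in fitness function; as noted above, defining a meaningful "distance walked well" fitness was the hard part, and the hyperparameters here are illustrative, not the values used in the thesis.

```python
# Minimal (mu + lambda) evolution strategy sketch; illustrative only,
# not the thesis implementation.
import random

def evolve(fitness, dim, mu=5, lam=20, sigma=0.1, generations=50, seed=42):
    rng = random.Random(seed)
    # Start from random weight vectors (e.g., neural-network weights).
    population = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):
            parent = rng.choice(population)
            # Gaussian mutation of every weight.
            offspring.append([w + rng.gauss(0, sigma) for w in parent])
        # (mu + lambda) selection: parents compete with their offspring.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:mu]
    return population[0]

# Stand-in fitness: pretend "locomotion quality" peaks when all weights
# equal 0.5. The real project evaluated each genome in a physics simulation.
best = evolve(lambda w: -sum((x - 0.5) ** 2 for x in w), dim=4)
```

Everything except the fitness function is generic; swapping in a physics-based evaluation (simulate the creature, score the gait) recovers the structure of the original experiment.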

3D Helmet for immersive exploration of the Virtual Environment
The 3D stereo vision projection screen
A control panel for monitoring the neural network
The Linux cluster control panel for running the genetic algorithms
A biped that has learned to walk forwards
A quadruped moving in a straight line

Publications

2021

Kruse, J., Connor, A. & Marks, S. (2021). An interactive multi-agent system for game design. The Computer Games Journal, 10(1):41-63, Springer Science and Business Media LLC

2020

Phan, T., Ramhormozian, S., Clifton, C., MacRae, G., Dhakal, R., Jia, L.J. & Marks, S. (2020). Development of a virtual construction approach for steel structures considering structural and non-structural elements, and installation equipment. In The 54th International Conference of the Architectural Science Association, pages 405-414
Alex, M., Lottridge, D., Lee, J., Marks, S. & Wünsche, B. (2020). Discrete versus continuous colour pickers impact colour selection in virtual reality art-making. In Proceedings of the 32nd Australian Conference on Human-Computer-Interaction (OzCHI 2020), pages 158-169, Association for Computing Machinery
Lee, Y., Marks, S. & Connor, A. (2020). An evaluation of the effectiveness of virtual reality in air traffic control. In Proceedings of the 4th International Conference on Virtual and Augmented Reality Simulations, pages 7-17
Marks, S. & White, D. (2020). Multi-device collaboration in virtual environments. In ICVARS 2020: Proceedings of the 2020 4th International Conference on Virtual and Augmented Reality Simulations, pages 35-38

2018

Magdics, M., White, D. & Marks, S. (2018). Extending a Virtual Reality Nasal Cavity Education Tool with Volume Rendering. In Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), 35:811-814, IEEE
Marks, S., White, D. & Magdics, M. (2018). Evaluation of a Virtual Reality Nasal Cavity Education Tool. In Proceedings of 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), 35:193-198, IEEE
Nikolai, J., Bennett, G., Marks, S. & Maynard, G. (2018). Active learning and teaching through digital technology and live performance; ‘choreographic thinking’ as art practice in the tertiary sector. International Journal of Art and Design Education, Wiley
Marks, S. (2018). Virtual Reality. In The SAGE Encyclopedia of the Internet, 3:906-911, SAGE
Sengupta, N., Ramos, J., Tu, E., Marks, S., Scott, N., Weclawski, J., Gollahalli, A., Doborjeh, M., Doborjeh, Z., Kumarasinghe, K., Breen, V. & Abbott, A. (2018). From von Neumann architecture and Atanasoff's ABC to neuro-morphic computation and Kasabov's NeuCube: Principles and implementations. In Studies in Computational Intelligence - Learning Systems: From Theory to Practice, pages 1-28, Springer, Cham

2017

Marks, K. & Marks, S. (2017). Drawing on Hope: A Virtual Reality Project.
Marks, K., Marks, S. & Brown, A. (2017). Step into my (virtual) world: An (auto)ethnographic exploration of virtual reality drawing applications for arts therapy. Australian and New Zealand Journal of Arts Therapy (ANZJAT), 12(10):99-111, Australian and New Zealand Arts Therapy Association
Marks, S., White, D. & Singh, M. (2017). Getting up your nose: A virtual reality education tool for nasal cavity anatomy. In SIGGRAPH Asia 2017 Symposium on Education Proceedings, ACM
Marks, S. (2017). Immersive visualisation of 3-dimensional spiking neural networks. Evolving Systems, 8:193-201, Springer Berlin Heidelberg

2016

Marks, S. & Marks, K. (2016). Step into my (virtual) world.
Marks, S. & Marks, K. (2016). Step into my (virtual) world. In Festival of Artful Transitions
Marks, S. & Marks, K. (2016). Step into my (virtual) world. In Artful Transitions - ANZATA 2016 Symposium Programme
Connor, A., Sosa, R., Marks, S. & Jackson, A. (2016). Problem solving at the edge of disciplines. In Handbook of Research on Creative Problem-Solving Skill Development in Higher Education, IGI Global
Foottit, J., Brown, D., Marks, S. & Connor, A. (2016). A wearable haptic game controller. International Journal of Game Theory and Technology, 2:1-19, AIRCC Publishing Corporation
Connor, A. & Marks, S. (2016). Creative Technologies for Multidisciplinary Applications. IGI Global
Shaw, L., Tourrel, R., Wünsche, B., Lutteroth, C., Marks, S. & Buckley, J. (2016). Design of a virtual trainer for exergaming. In ACM International Conference Proceeding Series, Association for Computing Machinery
Connor, A., Sosa, R., Karmokar, S., Marks, S., Buxton, M., Gribble, A., Jackson, A. & Foottit, J. (2016). Exposing core competencies for future creative technologists. In Creative technologies for multidisciplinary applications, pages 377-397, IGI Global
Kasabov, N., Scott, N., Tu, E., Marks, S., Sengupta, N., Capecci, E., Othman, M., Doborjeh, M., Murli, N., Hartono, R., Espinosa-Ramos, J., Zhou, L., Alvi, F., Wang, G., Taylor, D., Feigin, V., Gulyaev, S., Mahmoud, M., Hou, Z.G. & Yang, J. (2016). Evolving spatio-temporal data machines based on the NeuCube neuromorphic framework: Design methodology and selected applications. Neural Networks, 78:1-14, Elsevier
Foottit, J., Brown, D., Marks, S. & Connor, A. (2016). Development of a wearable haptic game interface. EAI Endorsed Transactions on Creative Technologies, 16(e5):1-10, EAI

2015

Connor, A., Marks, S. & Walker, C. (2015). Creating creative technologists: Playing with(in) education. In Creativity in the Digital Age, pages 35-56, Springer
Marks, S., Estevez, J. & Scott, N. (2015). Immersive visualisation of 3-dimensional neural network structures. In 13th International Conference on Neuro-Computing and Evolving Intelligence (NCEI) 2015
Marks, S. & Blagojevic, R. (eds) (2015). Proceedings of the Sixteenth Australasian User Interface Conference (AUIC 2015), 162, Australian Computer Society Inc
Shaw, L., Wünsche, B., Lutteroth, C., Marks, S., Buckley, J. & Corballis, P. (2015). Development and Evaluation of an Exercycle Game Using Immersive Technologies. In Proceedings of the 8th Australasian Workshop on Health Informatics and Knowledge Management (HIKM 2015), 164:75-85, Australian Computer Society Inc
Shaw, L., Wünsche, B., Lutteroth, C., Marks, S. & Callies, R. (2015). Challenges in virtual reality exergame design. In Conferences in Research and Practice in Information Technology (CRPIT), 162, Australian Computer Society Inc.

2014

Connor, A., Berthelsen, C., Karmokar, S., Marks, S., Kenobi, B. & Walker, C. (2014). An unexpected journey: Experiences of learning through exploration and experimentation. In Action!-Doing Design Education
Foottit, J., Brown, D., Marks, S. & Connor, A. (2014). An Intuitive Tangible Game Controller. In The 10th Australasian Conference on Interactive Entertainment (IE 2014)
Marks, S., Estevez, J. & Connor, A. (2014). Towards the Holodeck: Fully immersive virtual reality visualisation of scientific and engineering data. In 29th International Conference on Image and Vision Computing New Zealand (IVCNZ) 2014, pages 42-47, ACM
Wünsche, B. & Marks, S. (eds) (2014). Proceedings of the Fifteenth Australasian User Interface Conference (AUIC 2014), 150, Australian Computer Society Inc

2013

Marks, S. & Wellington, R. (2013). Experimental study of steer-by-wire ratios and response curves in a simulated high speed vehicle. In Proceedings of the Fourteenth Australasian User Interface Conference (AUIC2013), 139:123-124
Wellington, R. & Marks, S. (2013). An ethnographic study of a high cognitive load driving environment. In Proceedings of the 14th Australasian User Interface Conference (AUIC 2013), 139:121-122, Australian Computer Society Inc

2012

Marks, S., Windsor, J. & Wünsche, B. (2012). Head Tracking Based Avatar Control for Virtual Environment Teamwork Training. Journal of Virtual Reality and Broadcasting, 9.2012
Marks, S., Windsor, J. & Wünsche, B. (2012). Using Game Engine Technology for Virtual Environment Teamwork Training. In WSCG '2012 Conference Proceedings - Part 1, pages 169-177, WSCG Digital Library
Marks, S., Windsor, J. & Wünsche, B. (2012). Design and evaluation of a medical teamwork training simulator using consumer-level equipment. In Medicine Meets Virtual Reality 19, pages 273-279, IOS Press

2011

Marks, S. (2011). A Virtual Environment for Medical Teamwork Training with Support for Non-Verbal Communication using Consumer-Level Hardware and Software. PhD Thesis, The University of Auckland
Marks, S. (2011). Virtual environment for and physical simulation of a supersonic land speed record vehicle.
Marks, S., Windsor, J. & Wünsche, B. (2011). Head tracking based avatar control for virtual environment teamwork training. In GRAPP 2011 - Proceedings of the International Conference on Computer Graphics Theory and Applications, pages 257-269

2010

Marks, S., Windsor, J. & Wünsche, B. (2010). Evaluation of the Effectiveness of Head Tracking for View and Avatar Control in Virtual Environments. In 2010 25th International Conference Image and Vision Computing New Zealand, IVCNZ 2010 - Conference Proceedings, pages 1-8, IEEE

2009

Marks, S., Windsor, J. & Wünsche, B. (2009). Optimisation and comparison framework for monocular camera-based face tracking. In 2009 24th International Conference Image and Vision Computing New Zealand, IVCNZ 2009 - Conference Proceedings, pages 243-248, IEEE
Marks, S., Windsor, J. & Wünsche, B. (2009). The Impact of Non-Verbal Communication in Virtual-Environment-Based Teamwork Training. In SimTecT 2009 Health Simulation Conference
Marks, S., Windsor, J. & Wünsche, B. (2009). Enhancing Virtual-Environment-Based Teamwork Training with Non-Verbal Communication. In Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009, pages 4133-4144, AACE
Marks, S., Windsor, J. & Wünsche, B. (2009). Enhancing Virtual Environment-Based Surgical Teamwork Training with Non-Verbal Communication. In New Zealand Computer Science Research Student Conference (NZCSRSC) 2009
Marks, S., Windsor, J. & Wünsche, B. (2009). Enhancing Virtual Environment-Based Surgical Teamwork Training With Non-Verbal Communication. In GRAPP 2009 - Proceedings of the 4th International Conference on Computer Graphics Theory and Applications, pages 361-366, INSTICC Press

2008

Marks, S., Windsor, J. & Wünsche, B. (2008). Camera based face tracking for enhancing surgical teamwork training with non-verbal communication. In 23rd International Conference Image and Vision Computing New Zealand, IVCNZ, IEEE
Marks, S., Windsor, J. & Wünsche, B. (2008). Evaluation of Game Engines for Simulated Clinical Training. Canterbury University

2007

Marks, S., Windsor, J. & Wünsche, B. (2007). Collaborative Soft Object Manipulation for Game Engine-Based Virtual Reality Surgery Simulators. In 22nd International Conference Image and Vision Computing New Zealand, IVCNZ 2007, pages 205-210, University of Waikato
Marks, S., Windsor, J. & Wünsche, B. (2007). Evaluation of game engines for simulated surgical training. In Proceedings of the 5th International Conference on Computer Graphics and Interactive Techniques in Australia and Southeast Asia, pages 273-280
Marks, S. (2007). Don't Shoot Them - Heal Them. (This poster won first prize at the University of Auckland 2007 Exposure poster competition.)
Henriques, A., Wünsche, B. & Marks, S. (2007). An investigation of meshless deformation for fast soft tissue simulation in virtual surgery applications. Computer-Assisted Radiology and Surgery, 2:S169-S171

2006

Marks, S., Conen, W. & Lux, G. (2006). Evolving autonomous locomotion of virtual characters in simulated physical environment via neural networks and evolutionary strategies. In Proceedings of the ninth 3IA International Conference on Computer Graphics and Artificial Intelligence 3iA2006, pages 183-190
Marks, S. (2006). Evolving autonomous locomotion of virtual characters in a simulated physical environment via neural networks and evolutionary strategies. Master's Thesis, University of Applied Sciences Gelsenkirchen