Matter is an experience that explores user interface and interaction in virtual reality. My role in this project was the development of mobile flocking agents to roam the virtual environment. Matter was conceived as a means of experimenting with gestural controls, using musical creation as a space in which to experiment and learn: gesture would beget sound, and sound would beget a response from the agents. However, after running into technical issue after issue, the final product was considerably different from our initial vision. For more info, visit this site.
This project was my introduction to the Unity game engine, C# scripting, and artificial intelligence. From a technical learning standpoint I am satisfied with my progress on these fronts: the use of flocking algorithms drove my learning into simulating lifelike artificial movement and behaviour, and I am now comfortable scripting for Unity in C#, with plenty of progress still to make. Conceptually I am less happy with my final contribution; technical limitations and the resulting compromises left a considerable amount of potential on the cutting room floor, and I cannot say I am satisfied with what I ultimately contributed to the project as a whole.
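The flocking agents were built on the classic boids rules: cohesion (steer toward nearby agents' centre), alignment (match nearby agents' heading) and separation (avoid crowding). The project itself was scripted in C# for Unity; the sketch below is an illustrative Python version of the same idea, not the project's actual code, and the class name, weights and radius are assumptions chosen for demonstration.

```python
import math

class Boid:
    """Minimal flocking agent: 2D position and velocity."""
    def __init__(self, x, y, vx, vy):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

def step(boids, neighbour_radius=5.0,
         cohesion_w=0.01, align_w=0.05, separate_w=0.1, dt=1.0):
    """One update applying the three boids rules to every agent."""
    updates = []
    for b in boids:
        # Naive all-pairs neighbour search: O(n) per agent, O(n^2) overall.
        neighbours = [o for o in boids if o is not b and
                      math.hypot(o.x - b.x, o.y - b.y) < neighbour_radius]
        ax = ay = 0.0
        if neighbours:
            n = len(neighbours)
            # Cohesion: accelerate toward the local centre of mass.
            cx = sum(o.x for o in neighbours) / n
            cy = sum(o.y for o in neighbours) / n
            ax += cohesion_w * (cx - b.x)
            ay += cohesion_w * (cy - b.y)
            # Alignment: nudge velocity toward the local average velocity.
            avx = sum(o.vx for o in neighbours) / n
            avy = sum(o.vy for o in neighbours) / n
            ax += align_w * (avx - b.vx)
            ay += align_w * (avy - b.vy)
            # Separation: push away from each close neighbour.
            for o in neighbours:
                ax += separate_w * (b.x - o.x)
                ay += separate_w * (b.y - o.y)
        updates.append((ax, ay))
    # Apply all accelerations after computing them, so every agent
    # reacts to the same snapshot of the flock.
    for b, (ax, ay) in zip(boids, updates):
        b.vx += ax * dt
        b.vy += ay * dt
        b.x += b.vx * dt
        b.y += b.vy * dt
```

In Unity this logic would live in a MonoBehaviour's `Update` loop, but the rule structure is the same: three weighted steering forces summed per agent, per frame.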
The group dynamic for this project was very modular, with each member primarily focusing on their own tasks and interests. While this was effective for cutting down on initial creative brainstorming and discussions that led nowhere, it made it difficult to find a direction for the project as a whole. As a result, I personally spent the initial weeks somewhat unsure of what I was expected to achieve, how it fit into the greater whole, or why it was necessary. Eventually these problems were noticed and addressed, but I feel that too much time was wasted through poor communication.
The primary area of failure this project has revealed to me concerns its technical aspects. Issues with plugins and various data pipelines meant that the gesture-based control scheme had to be abandoned in favour of interactive keys. While this change still allows for a fun, interactive experience, it cut away a major conceptual piece of the project: the purpose of using it to research gesture control in VR. Scripts that were too processor-intensive meant that, to avoid significant drops in framerate (vitally important, as low framerate in VR can be incredibly nauseating), we had to remove the audio-reactive behaviours from the flocking agents. I feel this rendered my personal contribution little more than an aesthetic addition in the scope of the larger whole. The result of all these limitations turned what could have been interesting research into a vast, largely unexplored field into little more than a neat interactive novelty.
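Much of that processing cost comes from the neighbour search itself: checking every agent against every other agent is O(n²) per frame, which is what makes large flocks expensive in a framerate-critical VR scene. A common mitigation, sketched below in Python (not the project's code; the function names and cell size are assumptions for illustration), is a uniform spatial hash: agents are binned into grid cells the size of the neighbour radius, so each agent only inspects the 3×3 block of cells around it.

```python
from collections import defaultdict

def build_grid(positions, cell_size):
    """Hash each point into a uniform grid, keyed by integer cell coords."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell_size), int(y // cell_size))].append(i)
    return grid

def neighbours(i, positions, grid, radius):
    """Find points within `radius` of point i, checking only the 3x3
    block of cells around it (assumes cell_size == radius when the
    grid was built). Per-agent cost is roughly constant for a
    well-spread flock, instead of linear in the flock size."""
    x, y = positions[i]
    cx, cy = int(x // radius), int(y // radius)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid.get((cx + dx, cy + dy), []):
                if j != i:
                    ox, oy = positions[j]
                    if (ox - x) ** 2 + (oy - y) ** 2 < radius ** 2:
                        found.append(j)
    return found
```

Rebuilding the grid once per frame and querying it per agent keeps the cost closer to O(n) than O(n²), which is exactly the kind of budgeting that might have left headroom for the audio-reactive behaviours.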
I must now make it my mission to overcome the shortcomings this semester has revealed to me in both technical and conceptual development. Working on this project taught me the value of technical scope: we can only make use of limited resources, and processing power is no different. The cheaper something is to make, the more room it has to grow. I have also shown poor conceptual development throughout this project, and must stop asking myself “what do I want to make?” and instead ask “what do I want to find out?”. It is important to remember that the artefact should be a representation of a learning journey, not the final destination. In future projects I shall endeavour to manage myself so as to fulfil these goals.