If you have wandered into this section, you may be wondering what projects I've been experimenting with between jobs. Since June 2016 I've been spending my time coming up with more ways to create mobile gameplay experiences for users while implementing augmented reality (as this is my field of interest).
Note: The augmentation method used in these projects is called "User Defined Targeting", which, as the name suggests, allows the user to choose where to place the augmentation themselves. The method presented by Vuforia (the AR package used) causes the scene to be somewhat duplicated, leading to script collisions that affect gameplay. I have adapted my own version that stops this duplication process and frees up a little more processing power as a result.
Syncrosizer

From June 2016 to early August 2016 I experimented with a potential rhythm game. The general idea is that, once the scene is augmented, the user plays a capsule character on a 5x5 grid. This grid starts moving in different patterns to either push the user up into an upper barrier (causing a game over) or push the user down past a lower barrier. The objective is for the capsule character to stay alive in the middle.
The scripts I created for this game were designed to manage the grid of 25 tiles as easily as possible when creating patterns for different levels. These include tile resets (both the mode they were in and their position) and sequences for certain patterns (such as the flow pattern shown above). There were a fair few issues to begin with around the player character and how its physics were set up, usually falling at a velocity that passed straight through collisions, but that was soon fixed.
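To illustrate the idea, here is a minimal sketch of that tile-management approach. This is not the original Unity C# code; the class names, modes, and the "flow" pattern logic are all hypothetical stand-ins for the kind of structure described above.

```python
# Illustrative sketch only: a 5x5 tile grid where each tile tracks a mode
# and a vertical offset, with a full reset and a simple "flow" pattern
# that sweeps a raised column across the grid one step at a time.

GRID_SIZE = 5

class Tile:
    def __init__(self):
        self.mode = "idle"      # e.g. "idle" or "rising" (hypothetical modes)
        self.offset = 0.0       # vertical displacement from the rest position

    def reset(self):
        self.mode = "idle"
        self.offset = 0.0

class TileGrid:
    def __init__(self):
        self.tiles = [[Tile() for _ in range(GRID_SIZE)]
                      for _ in range(GRID_SIZE)]

    def reset_all(self):
        # Tile reset covers both the mode the tile was in and its position.
        for row in self.tiles:
            for tile in row:
                tile.reset()

    def apply_flow_pattern(self, step):
        # Raise one column per step, sweeping left to right across the grid.
        self.reset_all()
        column = step % GRID_SIZE
        for row in self.tiles:
            row[column].mode = "rising"
            row[column].offset = 1.0

grid = TileGrid()
grid.apply_flow_pattern(step=2)
raised = [(r, c) for r in range(GRID_SIZE) for c in range(GRID_SIZE)
          if grid.tiles[r][c].mode == "rising"]
print(raised)  # column 2 of every row is raised
```

Keeping the pattern logic in one place like this is what makes building new level sequences cheap: each pattern only needs to decide which tiles to move on a given step.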
Before taking a break from the project, owing to losing interest myself and not getting interest from others I have worked with in the past, I made a few extra functions for the tiles, such as disappearing and reappearing, and freezing the user in place, which could potentially cause trouble for players. About 20 seconds of one song was created as a demo, as well as a rather nice menu screen that lets the user access other sequence builds. The APK does exist for users to try out, but on request. (Tweet @JakeX2005 to contact me, or email if you have my CV.)
The month of August was quite busy as I was doing temporary work in a kitchen, which took time away from developing Syncrosizer.
6 second fury
6 second fury is a small collection of mini games that the user has to complete one after the other to "stay alive" until the end of the mini game list. This was a stress-relief project over the late August to October period, where I would make small mini game scripts within a day or two. It gave me a chance to look at what I love doing most: examining game mechanics and working out what fun mechanics I could make for mobile interaction. This project was designed to be mobile only, with no augmented reality aspects; the sole focus was to explore various gameplay mechanics for mobile. Listed below are the mini game ideas that were completed:
After creating one particular mini game, I took the gameplay a little further and adapted it into my next AR project, which I have been planning since late October and which is covered in the next section.
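The "complete one after the other to stay alive" structure described above can be sketched very simply. The mini games and win conditions below are hypothetical placeholders; only the chaining logic reflects the idea.

```python
# Hypothetical sketch of the 6 second fury structure: the player faces
# each mini game in turn and survives only by completing every one.

def tap_game(score):   return score >= 3   # placeholder win conditions
def tilt_game(score):  return score >= 5
def swipe_game(score): return score >= 1

def run_gauntlet(mini_games, scores):
    """Play each mini game in order; fail one and the run ends."""
    for game, score in zip(mini_games, scores):
        if not game(score):
            return False   # player "dies", run over
    return True            # survived the whole list

print(run_gauntlet([tap_game, tilt_game, swipe_game], [4, 6, 2]))  # True
print(run_gauntlet([tap_game, tilt_game, swipe_game], [4, 2, 2]))  # False
```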
Scene transitions in AR
This project is something I revisited in summer 2016 after creating the initial idea back in December 2015. The initial concept involved a certain invisible volume of space, augmented by the user, where the scene takes place. If the world map (which is invisible) comes into contact with this area, it becomes augmented, as you can see in this early version:
When revisiting the idea I wanted to create a more stable world for the user to explore, adding barriers and limitations (like not being able to fall off a cliff on the map). Applying this to AR is a little harder, since you are already dealing with game object collision meshes to handle the augmentation, on top of attempting NPC interaction and basic battle mechanics. The map was able to expand to roughly 16 times its original size and include various textures, game objects and interactions.
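The reveal logic at the heart of the concept can be sketched with a simple axis-aligned overlap test. This is only an illustration under the assumption of box-shaped regions; the actual project used Unity collision meshes, and the region names here are made up.

```python
# Minimal sketch: a map region becomes visible ("augmented") once it
# overlaps the invisible volume the user has placed. Boxes are 2D,
# axis-aligned, given as (min_x, min_z, max_x, max_z).

def overlaps(a, b):
    ax0, az0, ax1, az1 = a
    bx0, bz0, bx1, bz1 = b
    return ax0 < bx1 and bx0 < ax1 and az0 < bz1 and bz0 < az1

def visible_regions(regions, augment_volume):
    """Return the names of map regions touching the augmented volume."""
    return [name for name, box in regions.items()
            if overlaps(box, augment_volume)]

regions = {                       # hypothetical map regions
    "village": (0, 0, 4, 4),
    "cliff":   (6, 0, 10, 4),
    "lake":    (0, 6, 4, 10),
}
print(visible_regions(regions, (1, 1, 3, 3)))  # ['village']
```

The same test, run the other way around, gives you the "barriers" as well: anything outside the revealed regions is somewhere the player is not allowed to go.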
I have packed this particular project away again until there are mobile devices with slightly stronger processors to handle these expansive maps, as I have pushed my mobile hardware (a Galaxy S5) as far as it will go.
Still, I am happy with the outcome, as it taught me a fair few lessons on how to create and handle the systems responsible for the world.
AR arm adaption
This particular project is a branch off one of my previous jobs with Portsmouth University. During development I was under an NDA, but since the project's end I have been given permission to freely work on my own adaptation of the project, as well as to disclose certain information.
The project is designed for those suffering from phantom limb syndrome (those without their arms/legs who feel odd pains/signals where their appendage used to be). The user wears an EMG band further up their arm that interprets the electrical signals and drives an arm on screen, showing exactly what movements they are making. Four gestures were available at this point: spread fingers, wave in, wave out, and make a fist.
I was able to make my own augmented variation of this project where, instead of the user seeing their arm on screen, they wear a reactive armband (roughly where the EMG band would be) and this armband reacts to the webcam on the PC running the project. This reaction creates an augmentation of an arm in place of the user's forearm. In the picture below I used my own arm to demonstrate how accurately the augmented arm follows.
Note: This project only works in Unity 4.6 with Vuforia 5.0.9 or below. This is due to the webcam view script, which fails to work in Unity 5 (the only version Vuforia now supports).
AR star ship game
This currently unnamed augmented reality project is what I am working on at the time of writing (January 2017), and it is based off one of the 6 second fury mini games.
I enjoyed creating that small mini game and wished to expand upon it and attempt a full release on Google Play in mid-to-late 2017. From October until now I have been working on an adapted variant of the mini game, applying augmented reality to it along with shooting and rolling mechanics, and I have created a programmer-art level to test out the mechanics.
The level is based on a randomised sequence: the timeline is fixed, so at certain stages, say 30 seconds in, an ice wall is due to appear, but the level randomises which position the ice wall comes from, forever giving users a challenge and keeping them wary.
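That mix of a fixed schedule with randomised positions can be sketched as below. The timeline entries, lane names, and seeding are all hypothetical; only the fixed-schedule/random-position split reflects the design described above.

```python
# Hypothetical sketch: obstacle types and times are fixed per level,
# but the lane each obstacle appears from is randomised per run.

import random

TIMELINE = [             # (time in seconds, obstacle type)
    (10, "asteroid"),
    (30, "ice wall"),
    (45, "asteroid"),
]
LANES = ["left", "centre", "right"]

def build_run(seed):
    """Keep the schedule fixed; randomise positions, reproducibly per seed."""
    rng = random.Random(seed)
    return [(t, kind, rng.choice(LANES)) for t, kind in TIMELINE]

run = build_run(seed=7)
for t, kind, lane in run:
    print(f"{t:>3}s: {kind} from the {lane}")
```

Seeding the generator per run keeps every playthrough different while still making any single run reproducible for debugging.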
The biggest challenge has been sectioning out certain areas of the mobile screen so the user can move the ship but also shoot and roll without interfering with the movement. With those particular issues solved, I have been in contact with my good friend Alex, who was the modeller for Sandwich Simulator 2016, and asked for his help with this project.
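One simple way to section the screen, sketched below, is to route each touch by which zone it lands in. The exact split here (left half for movement, right half divided into shoot and roll) is my illustrative assumption, not the project's actual layout, and this is plain Python rather than Unity input code.

```python
# Illustrative sketch: split the touchscreen into zones so movement,
# shooting, and rolling never fight over the same touch.

def route_touch(x, y, width, height):
    """Map a touch at pixel (x, y) to an action, given the screen size."""
    if x < width / 2:
        return "move"                  # left half: steering the ship
    if y < height / 2:
        return "shoot"                 # top-right quadrant
    return "roll"                      # bottom-right quadrant

W, H = 1920, 1080                      # landscape Galaxy S5-class screen
print(route_touch(200, 500, W, H))     # move
print(route_touch(1500, 200, W, H))    # shoot
print(route_touch(1500, 900, W, H))    # roll
```

Because each touch resolves to exactly one zone, a shoot or roll tap can never leak into the movement handler, which is the core of the problem described above.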
I have given him design tasks while I flesh out the rest of the title: creating a title sequence and a map/level selector, as well as handling data aspects that may come into play later in the project (making certain scripts' data "omnipresent" in the program without causing conflicts). So while the outcome of the project is currently uncertain, progress is coming along.