At the AIMM After Dark presentation, I created images of the three different levels and played the music in the background to get feedback from the people in attendance. The first was a spaceship-themed level featuring an egg-shaped spaceship that lands in the middle of the night and begins to fire eggs out of a cannon at the user. The second theme was much darker, set in a graveyard with a skeleton that pops up from behind tombstones and throws bones at the user. The third was jungle-themed and featured a monkey that throws bananas at the user.
To begin this prototype I downloaded Unity 3D and Maya, as recommended by the IMM students I spoke to, and started sketching out the layout of the game. I began creating 3D models and getting acquainted with the program and its functions and features. I also got a chance to talk to Brett Taylor over email, which was very helpful because he had done a project that used the Kinect with Unity and has designed a couple of his own rhythm games. His advice was to prioritize and identify exactly what I wanted to accomplish with the game, and to be aware that as I progress through making it, my priorities might change a little.
I created three tracks that aid the gameplay and give the user a more immersive experience. I am used to creating music for the hip hop genre, but making music for a game like this was a different ball game, as I had never done anything like it before. It involved the use of synths and drum samples, as well as an 808 to give the music more low end. The three tracks were stylistically very different to reflect the different themes that will be used in the game. I played these tracks for people in the classroom and got feedback on how to design each level based on the sound.
Jesse’s talk was particularly intriguing to me because I am currently working on this thesis project, which has to do with virtual reality. Because of this, I was really listening to and considering the elements of VR design he was talking about. I agreed with most of what he said, such as the idea of managing motion sickness by designing games that don’t require the environment to move too much, but instead have the elements come to the player or exist around them, so they can interact without disorienting their vision. He also touched on the fact that the square screen is something of a setback for virtual reality, and I am definitely interested to see how this problem is solved in the future. However, I didn’t agree with his idea that AR’s biggest use is to make imaginary toys real. That actually raises psychological concerns, and I don’t think it would even be healthy for children, as they might begin to have trouble distinguishing between AR and reality. From my perspective, AR could also be used to do far more than VR. After the conference I borrowed my friend’s Oculus and played his copy of ‘I Expect You to Die’, which is a very hard game, if I might add. But I did it to get more of a feel for the VR environment and what I should be shooting for in my own game.
I spoke with Jim Margraff, who I resonated with the most about this, and he seemed to agree with me. AR could be used to edit your reality: if you stepped outside and wanted the grass to be blue today, you could change that, and it would look blue in your AR world. Jim also pointed out that with AR technology we might even be able to share each other’s edited worlds. However, after talking about it for a while, we arrived at the conclusion that it might cause psychological issues, as people might start to prefer their AR world to the real one, and it might end up becoming a kind of mental addiction.
When I told him about my thesis project he sounded pretty intrigued, as he had previously worked on some games and developed the LeapPad, and he pointed out that in most of these games and apps, music plays a huge part in setting the tone or emotion of the piece of media. He suggested I do some research on HRTF (Head-Related Transfer Function), which I did, and he recommended a couple of books for me to read, one of them being ‘Music and the Mind’, along with other philosophical books. Talking to him was very insightful because, although he was not a VR specialist, he had his feet dipped in many ponds and was very willing to help and share information.
I will create a track that aids the gameplay and gives the user a more immersive experience. I am used to creating music for the hip hop genre, but making music for a game like this will be a different ball game, as I have never done anything like it before. It will involve the use of synths and drum samples, as well as an 808 to give the music more low end.
Prototype 2 – Experimenting in Unity
To begin this prototype I downloaded Unity 3D and Maya, as recommended by the IMM students I spoke to, and have started sketching out the layout of this game. I plan on creating a small rhythm runner game that will be similar to the game I’m trying to create and might even serve as a starting point for the final project. It will only be one level. This will test my proficiency in Unity and help me decide what works within the game and what doesn’t, so I can make the best final project possible. This game will be made for Mac.
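To give myself a concrete starting point, here is a rough sketch of the kind of movement script I expect this runner prototype to begin with. Everything here is a placeholder of my own: the class name, the speed value, and the arrow-key controls, which would eventually be replaced by Kinect gestures.

using UnityEngine;

// Sketch of a constant-speed runner that can turn 90 degrees left or right.
// The keyboard input is a stand-in until the Kinect control is working.
public class RunnerController : MonoBehaviour
{
    public float speed = 5f; // forward speed; placeholder value to tune later

    void Update()
    {
        // keep the player moving forward along whatever direction it faces
        transform.Translate(Vector3.forward * speed * Time.deltaTime);

        // temporary keyboard turns at 90 degrees, like the hallway corners
        if (Input.GetKeyDown(KeyCode.LeftArrow))
            transform.Rotate(0f, -90f, 0f);
        else if (Input.GetKeyDown(KeyCode.RightArrow))
            transform.Rotate(0f, 90f, 0f);
    }
}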
Prototype 3 – Kinect Control
If I’m able to do this in time, I’ll try to add Kinect control to the game. As of right now I plan to create a simple hallway that the player has to run through, turning at the right times. For the prototype, the user will only have to decide between turning left or right with hand gestures. This will no doubt be the most challenging aspect, as it will involve the use of both the Kinect camera and Unity 3D.
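From the examples I have looked at so far, I imagine the gesture reading would look roughly like the sketch below, assuming I use the Kinect for Windows v2 Unity plugin (the Windows.Kinect namespace). The "hand raised above the spine" rule and the 0.4 m threshold are guesses of mine, not anything from the SDK.

using UnityEngine;
using Windows.Kinect; // from the Kinect v2 Unity plugin

// Sketch: read tracked bodies each frame and treat a raised left or right
// hand as a turn in that direction. Thresholds are placeholders to tune.
public class KinectTurnInput : MonoBehaviour
{
    private KinectSensor sensor;
    private BodyFrameReader reader;
    private Body[] bodies;

    void Start()
    {
        sensor = KinectSensor.GetDefault();
        if (sensor == null) return;
        reader = sensor.BodyFrameSource.OpenReader();
        bodies = new Body[sensor.BodyFrameSource.BodyCount];
        if (!sensor.IsOpen) sensor.Open();
    }

    void Update()
    {
        if (reader == null) return;
        var frame = reader.AcquireLatestFrame();
        if (frame == null) return;
        frame.GetAndRefreshBodyData(bodies);
        frame.Dispose();

        foreach (var body in bodies)
        {
            if (body == null || !body.IsTracked) continue;

            // placeholder gesture: a hand raised well above the mid-spine
            float spineY = body.Joints[JointType.SpineMid].Position.Y;
            if (body.Joints[JointType.HandLeft].Position.Y > spineY + 0.4f)
                Debug.Log("Turn left");
            else if (body.Joints[JointType.HandRight].Position.Y > spineY + 0.4f)
                Debug.Log("Turn right");
        }
    }
}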
To create this project I will need a couple of programs. I talked to a couple of people in the IMM department, and they told me that the first one, Unity 3D, would be used to design the game environment as well as the interface. I will need to learn basic C# to build the game, and this will probably be the most challenging part, as it is a whole new language for me. I will also need a Kinect camera to read the movements. As previously mentioned, I have no experience with either of these things, and I will need to get as familiar with them as I can. After doing some research I also found out I will have to download an SDK for the Kinect, as well as Unity Pro, since that’s the only way I can run the Kinect program: http://channel9.msdn.com/Series/Programming-Kinect-for-Windows-v2/04
I will also need to use FL Studio and Logic Pro X, which I’m already proficient in, to make the music for this game. I started making the music last week and have already made two of the three tracks I plan to use in the game. One of them is a scary theme made in the spirit of Halloween, featuring a dark synth and fast-moving drums, and the other is a more orchestral theme featuring instruments such as violins, marimba, and an artificial harp. They are both about a minute and a half to two minutes long.
This week I took the liberty to explore more “state of the art” projects in a field similar to the one my project will be in. Since I have already looked at trailblazing games such as the old-school PlayStation 2 classic Rez and the more recent Beat Saber, I decided to expand to more platforms and research some rhythm games that could help inspire my project. On iOS I found a pretty cool game called Geometry Dash. The game, designed by Robtop, features a 2D environment where the user plays as a box moving from left to right, with geometric obstacles that have to be jumped over as they approach. Each level of the game has its own theme song, and although the game could be played without sound, the sound definitely aids the gameplay, as there is a jump every time there is a note change in the music. Every time the user hits an obstacle, they simply start over. However, I believe the sound is the appeal of this game. It features catchy music that, once you’ve heard it a couple of times, you can actually memorize and use to help time your jumps as you progress through the game. I would like to employ a similar strategy in my project: have the music be about one to two minutes long and simple enough to catch on to, so the player can recognize when the changes are coming and make the movements necessary to progress through the level.
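One simple way I could keep track of where those changes happen is to write down, for each of my own tracks, the beats where a turn is due and convert them to times using the tempo. This is just a sketch of that idea; the 120 BPM and the list of turn beats are made-up example values, and AudioSettings.dspTime is Unity’s audio clock.

using UnityEngine;

// Sketch of a hand-made "beat map": the beats where turns should happen,
// converted to seconds using the track's tempo.
public class BeatMap : MonoBehaviour
{
    public float bpm = 120f;                    // tempo of my track (example value)
    public int[] turnBeats = { 8, 16, 24, 32 }; // beats where a change/turn happens

    private double songStart;

    void Start()
    {
        songStart = AudioSettings.dspTime;      // remember when the track began
    }

    // seconds from the start of the song to a given beat (at 120 BPM, beat 8 = 4 s)
    public float SecondsForBeat(int beat)
    {
        return beat * (60f / bpm);
    }

    // how far into the song we currently are, in seconds
    public float SongTime()
    {
        return (float)(AudioSettings.dspTime - songStart);
    }
}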
In terms of the movements the player would be making, I took seriously what Professor Ault said to me about utilizing the Kinect movements as much as I can to create a more immersive gaming experience, and started researching some Kinect rhythm games. A game that I thought does this very well is Dance Central 3. I had not been exposed to the game because, as a PlayStation fan, I felt like I couldn’t cheat on my system. However, after watching people play online, I quickly saw the appeal of the game. It uses pre-existing songs and lets the player choose a character to perform the song with. It shows the user’s actual Kinect signature in the top right, while the screen is filled with a dance environment that depends on the chosen song and the character performing the moves that go along with it. Depending on how well the Kinect signature matches the movements on screen, the game grades the user on how well they are dancing. That is a lot of movement, and I don’t know if I would be able to execute anything as flawless as this, but it definitely inspired me to add more elements to the game other than just turning left and right. I think I could have the player do more basic movements, such as raising their arms or jumping over and ducking under obstacles in the way. Combining that with the moving environment, I think I could create a very immersive and exciting experience for the user in VR.
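As a first guess at how jumping and ducking might be read from the Kinect skeleton, I could compare the head joint’s height to a calibrated standing height. This is only a sketch under my own assumptions: the 0.15 m thresholds are guesses, and the Body data would come from the same body reader as the turn sketch above.

using UnityEngine;
using Windows.Kinect;

// Sketch: classify a jump or a duck by how far the head joint has moved
// from the height recorded while the player was standing still.
public class JumpDuckDetector : MonoBehaviour
{
    private float standingHeadY; // head height at calibration
    private bool calibrated;

    // would be called once per frame with a tracked body from the body reader
    public void ReadBody(Body body)
    {
        if (body == null || !body.IsTracked) return;
        float headY = body.Joints[JointType.Head].Position.Y;

        if (!calibrated)
        {
            standingHeadY = headY; // assume the player starts standing upright
            calibrated = true;
            return;
        }

        if (headY > standingHeadY + 0.15f)
            Debug.Log("Jump detected");
        else if (headY < standingHeadY - 0.15f)
            Debug.Log("Duck detected");
    }
}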
To begin research for my game, I looked at the history of rhythm games that have made an impact on the industry and came across some pretty popular ones such as Just Dance, Rock Band, and Dance Central. I have referenced all of these games in my Research Plan post (http://ault.immtcnj.com/thesis_fall_18/2018/10/11/research-plan-2/). However, I didn’t know about one of the games that probably paved the way for rhythm games of that caliber: Rez. So that was the first piece of research I did after Professor Ault told me about the game.
The game, developed by United Game Artists, is a third-person rail shooter in which the player shoots at objects that appear around them. The hook is that every time the player destroys a target, a sound is made that adds to the music in the background, making the player the composer of the song. This is a very rare quality and one I thought was an amazing idea, and I still can’t figure out how it works so flawlessly. It also has a beautiful color quality that feels like a form of synesthesia when you are playing it. The game was such a huge success after its release in 2001 on the Sega Dreamcast and PlayStation 2 that it was re-released multiple times, first for Xbox Live and later for PlayStation VR. Rez Infinite, the PS VR version, is currently one of the most acclaimed rhythm games in the world. My next step is to speak to Professor Fishburne about his experience playing the game and how I could incorporate that into my own.
I also got a chance to talk to Brett Taylor through email, which was very helpful because he had done a project that used the Kinect with Unity and has designed a couple of his own rhythm games. His advice was to prioritize and identify exactly what I wanted to accomplish with the game, and to be aware that as I progress through making it, my priorities might change a little. So I came up with a list of priorities:
Learning Unity
Designing and animating the environment
Designing how the music interacts with the environment
Kinect support
First-person controls
A fun and smooth player experience
In researching the use of the Kinect with Unity, I came across a few examples. The problem is that this will be my first experience using the engine and animating and designing my own game, so although watching other people play their games was a pretty cool experience, I found it better to start looking up how-to videos on using the Unity engine, and I ended up finding quite a few. I also found how-to videos on adding music to games made in Unity.
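From those videos, adding a track to a level looks fairly simple: attach an AudioSource and play a clip when the level starts. Here is a minimal sketch of that idea, with the clip field being a placeholder for one of my own tracks.

using UnityEngine;

// Sketch: play a level's music track from an attached AudioSource.
[RequireComponent(typeof(AudioSource))]
public class LevelMusic : MonoBehaviour
{
    public AudioClip levelTheme; // one of my tracks, assigned in the Inspector

    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = levelTheme;
        source.loop = false;     // each level is one pass through the track
        source.Play();
    }
}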
Rhythm games were first popularized in Japan in the 1970s, with the first one being an electro-mechanical arcade game created by Kenzou Furukawa. The idea of the game was essentially for the player to lift up girls’ skirts in time to a rhythm, and it was inspired by the “Oh! Mouretsu” commercials that were popular in Japan at the time. Another major release in rhythm gaming was Simon, created by Ralph Baer and Howard Morrison, in which players repeated patterns of button presses that got harder and harder as they progressed. More rhythm games, such as Dance Aerobics, Dance Dance Revolution, BeatMania, and Guitar Hero, followed in later years, making the genre even more popular. Rhythm games got even better with the introduction of the PlayStation Move and the Xbox Kinect around 2010, which brought motion-sensing games such as Just Dance and Dance Central. In 2018, Beat Saber, which brings rhythm gaming into virtual reality, became the top-selling and highest-rated virtual reality game on Steam.
Current State of the Field
As mentioned in the previous paragraph, Beat Saber is currently the top-selling rhythm game in the industry. The game, released on Steam and Oculus by Hyperbolic Magnetism, was created in the Unity engine and soon made its way to PlayStation VR. It features a player slashing objects moving towards them in time to a rhythm. The player wields two virtual lightsabers that are essentially extensions of the motion controllers. Within each level there are different songs and different obstacles, such as mines that must not be hit and pink transparent walls that the player must avoid. Hyperbolic Magnetism is a small indie game studio based in the Czech Republic, founded by Lokiman and Split. The two decided to work together in high school in the late ’90s, after working on a few collaborative projects and discovering their love of making their own video games. Once Apple launched the App Store, they quit their jobs and began making games for mobile phones, tablets, and other devices, and eventually ended up in VR with the hit game Beat Saber.
After our class discussions, I’ve decided to create a VR platform rhythm game in which the player moves along a set path and turns where there are changes in the music. Music has been my main focus throughout my years as an IMM major, but it seemed like a cop-out to just create a music album for my thesis, especially when the project has to be interactive. Although this will be my first attempt at creating a game, let alone a VR game, a rhythm game seemed like the best way to immerse the user in the music while they are also having fun playing. My main inspiration for this game was the PS4 game Beat Saber, a similar rhythm game except that the user has to cut through objects moving towards them.
I’m currently in an interactive music class with Dr. Nakra where we are studying how the music in games affects the gameplay. In almost all of these games, the music plays a huge part in the user experience. Games like the new Spider-Man on PS4 and Assassin’s Creed have unique musical themes that immerse the user further, whether they are at a dangerous point in the game or a heroic one, while others, like The Last of Us, use the lack of music at certain points to make the mood even more eerie. What these games have in common, however, is that the visual aspect drives the music. My game would do the opposite.
The user would have to listen to the music and discern the right times to turn to avoid falling off a platform or running into a wall. I think users would enjoy the experience because not only would they be playing the game, they would also be listening to the music, learning to recognize where and how the changes happen, and they might even start noticing these patterns in other music they listen to. For this demo I’m thinking of creating one to three levels, depending on how much I can do within the time allotted to us, with around one to two minutes of music per level that I will be making myself and have already started composing.
Unlike Beat Saber, where the objects come towards you, in each level the user would be moving forward and would have to turn right or left at the right times, with the speed determined by the level they are playing. My idea is to create a hallway that the user moves through, with an artifact to collect at the end of the level. It would have to be completed in one try; if the user hits a wall, they have to restart from the beginning. Some possible prototyping would be to actually play Beat Saber to get a feel for the game and its environment, to animate the hallway environment, and to test movement through the game. I would also have to get acquainted with the Unity engine by creating a test environment, as that is the platform I’m going to use to create this VR rhythm game.
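The "one try" rule itself should be simple to prototype. Here is a sketch of how I imagine it in Unity, assuming the hallway walls are tagged "Wall" and each level is its own scene; both of those are assumptions on my part, not decisions I have finalized.

using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: if the player collides with a wall, reload the scene so the run
// (and the music) restarts from the beginning.
public class WallRestart : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        if (collision.gameObject.CompareTag("Wall"))
        {
            SceneManager.LoadScene(SceneManager.GetActiveScene().name);
        }
    }
}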