Friday, April 17, 2009

Final Week - Project Sketch 3

For the last two weeks, we have been continuously working on our Air Guitar project. I finished the presentation, which covers our project concept, research, logic flow, and our experience with the project.

Besides working on the presentation, I also met with my team to discuss the project and put everything together to get it working. The first time we met during the final weeks was to check whether the individual pieces worked with each other. Everything worked the way we wanted, with only minor tweaks needed to improve it. We also took some pictures for our presentation and documentation.


During the final week before presenting, we met up again to finalize everything. We tested the project on our own and also tried the multiplayer experience with each other. The Air Guitar we made not only acts as a single-player instrument but also offers a multiplayer experience: one person wears one glove while someone else wears the other, so one player bends the fingers to select note combinations while the other strums. A third person can also interact by playing with the pitch through the proximity sensor.


The day after our presentation, we decided to start and finish the documentation video. We gathered material from our presentation and wrote an outline and script for the video. We decided to include background info, research, the project concept, and interviews. We got two people to test our project and give feedback, recorded our script, and interviewed our users. From the material we gathered, we created the documentation video.


The presentation, Max patch and essay can be found below:

Presentation

Max Patch

Air Guitar Essay

Sunday, April 5, 2009

Final Project - Week 12-13

I've been very busy the past two weeks with the semester coming to an end, but I managed to get some work done for 320's final project.

In Week 12 we met as a team to discuss the project and each of our roles, seeing where we are overall and talking through the different areas. Each of us presented to the team what we have been working on, how much more there is to do, and so on.

The research slides I completed were to be presented to Jinsil and Greg, but I could not go because of an exam the next day, so I gave Kay the finished slides and asked her to present them. Kay sent me their feedback: they wanted research on traditional guitar playing, on the video game Guitar Hero, and on why people are so interested in playing guitars. Based on that feedback, I did more research on these topics this week.

I added two more research slides to our presentation: one on traditional guitar playing, covering what it is and what it does, and a Guitar Hero slide where I talk about the game and why people like to play it.

I have also begun work on the concept slides themselves, revealing what our project is, and on schematics for our presentation. I talked with Drew about the logic flow of the implementation/technical side of our project and how exactly it functions in Arduino and Max MSP.

I am planning to meet up with the team next week; hopefully we'll be mostly done so we can take pictures, document the project, and I can finish the presentation based on the final implementation.

Below are just some reference pictures of some of the new research that I have done:

The picture above is an acoustic guitar, the classical version of the instrument and the type originally used by musicians.

The one above is an electric guitar. We can see that it is very similar in style to the guitar controller in the Rock Band game. These games try to mimic a guitar as closely as possible so that players feel like they are really playing an actual guitar.


The picture above is a screenshot of the game Guitar Hero. Players have to match the notes as they slide down the screen toward the fret buttons.



People playing Guitar Hero. As we can see, they hold and carry the controller like a guitar, and their motions are also in line with how a real guitar is played.

Tuesday, March 24, 2009

Week 11 - Sketch 3

This week, I mainly worked on the research with Kay. We gathered a ton of resources and I went through each one thoroughly, noting down projects, theories, and so forth related to our project.

The concept of an Air Guitar has been approached in many different ways and already exists in the world, so it wasn't too hard for us to find products and projects related to ours.

Jinsil sent me a research project, CSIRO's Air Guitar. They embed sensors into a t-shirt so that it can be played like a real guitar: movements of the wearer's arms are mapped and beamed wirelessly to a computer, which interprets them and turns them into musical notes.




Another project I researched was Takara Tomy's Air Guitar, which uses an infrared string system to detect your strums and a variety of buttons on and above the fret board to produce a wide range of musical notes.



There are a lot of good projects and products we can relate our work to, which is great because we can refer to how these products were done and push our project further.

I also started working on our final presentation this week. I gathered all the information into a Word document, laid it out on slides, and started deciding what information should go on each slide and what to talk about during the presentation. Kay and I have finished most of the research at this point.

We also met up this week to go over the patch that Drew put together for the programming. We gave input on the outcome and on what we could change and add, and talked about the next few weeks, what we would be doing, and how to share the workload. We will meet again next week to finalize tasks for everyone and to discuss when our documentation video should be done and what we want to do for our essay.

Busy weeks ahead of us!

Thursday, March 12, 2009

Week 10 - Sketch 3

This week, we got into our groups (Drew, Kath, Nabil, Kay and me) and discussed some ideas for Sketch 3.

Over the week, we brainstormed quite a few ideas, but none stood out or seemed feasible in the time we were given.

A few ideas were:

1. Interactive Lamp
2. Interactive Heart Rate Monitor, where a heart-shaped LED blinks to show the rate your heart is going.
3. An environmental interface that people can interact with. There is a physical-object aspect and a spatial-environment aspect, where the participant triggers things on the screen and guides them toward the 3D object.

We discussed a lot of these ideas in class with Jinsil and Greg. However, as we continued to brainstorm, we reached a decision and gave them a finalized idea: an Air Guitar.

The invisible guitar is an abstracted version of a guitar. The user puts on gloves with various sensors embedded in them and is able to play as if holding a real guitar. The user can select a combination of three notes multiplied by a frequency, and then choose a volume to play at. The notes are selected by bending the index, middle, and ring fingers on the left hand. The pitch of the notes is based on how far out the user holds their left hand, and the volume is controlled by how fast the user strums with their right hand. The sensors needed are three bend sensors, a proximity sensor, and an accelerometer. The gloves are plugged into an Arduino board, which feeds into Max/MSP, where the sounds are generated and played. Through the invisible guitar, the user is able to generate their own sounds and songs with relative ease.
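To make that mapping a little more concrete, here is a minimal, hypothetical sketch of the Arduino side: three bend sensors give the note combination, the proximity sensor gives the pitch, and one accelerometer axis gives a rough strum-speed reading, all sent to Max/MSP over serial. The pin assignments, threshold, and message format are assumptions for illustration, not our actual firmware or patch.

```cpp
// Hypothetical glove-to-Max data path (pins and scaling are assumptions).
const int bendPins[3]  = {A0, A1, A2}; // index, middle, ring finger bend sensors
const int proximityPin = A3;           // pitch: how far out the left hand is held
const int accelPin     = A4;           // one accelerometer axis: strum speed / volume

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Reduce each bend sensor to a bent/straight bit to form the note combination.
  int noteCombo = 0;
  for (int i = 0; i < 3; i++) {
    if (analogRead(bendPins[i]) > 512) {   // threshold would need calibration per glove
      noteCombo |= (1 << i);
    }
  }

  int pitch = analogRead(proximityPin);    // 0-1023, mapped to frequency in Max
  int strum = analogRead(accelPin);        // 0-1023, mapped to volume in Max

  // One line per reading: "combo pitch strum"
  Serial.print(noteCombo);
  Serial.print(" ");
  Serial.print(pitch);
  Serial.print(" ");
  Serial.println(strum);

  delay(30);                               // roughly 30 ms between readings
}
```

On the Max/MSP side, a serial object could read these triples and route the finger combination, pitch, and strum speed to the sound generation; the exact scaling is something we would tune by ear.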

We have created a final project plan that can be found here:
http://www.sfu.ca/~ach3/finalprojectplan.pdf

Friday, February 27, 2009

Week 9 - Updating Sketch 2

For week 9, we continued to update our main patch/programming for the project. We've been having some difficulties getting everything to work because of the complexity of the concept we want to incorporate. We help our main programmer where we can, and we've been consistently testing the project and working with it.




We have been having difficulties in areas such as speed and getting certain things to trigger within a certain range (e.g. the glove triggering the monster at position x).
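The range problem boils down to checking whether the tracked glove position falls inside a window around the target, rather than matching one exact pixel. The check itself is done with objects inside the Max patch; the plain C++ sketch below is only a hypothetical illustration, and the struct names and window size are made up.

```cpp
#include <cstdlib>

// Hypothetical illustration: trigger the monster when the tracked glove
// position is within `range` pixels of the monster's position.
struct Position { int x; int y; };

bool withinRange(Position glove, Position monster, int range) {
    return std::abs(glove.x - monster.x) <= range &&
           std::abs(glove.y - monster.y) <= range;
}
```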

We've also been getting other people to test it, to record the experience of our project. We continued this week to update the project and put more details into it. After getting the primary concept to function correctly, we went on to add small details like speed, different animations, etc.



This has been a tough week since we aren't too familiar with the potential of Max MSP. We continue to strive to get our project done to the best of our ability.

Most of the presentation was done last week as we presented to Jinsil. This week I mainly just touched it up a bit in terms of design and added a little bit more detail to some of the slides.


Our presentation file can be found here:

http://www.sfu.ca/~ach3/sketchtwopresentation.pdf.pdf

The Max Patch can be found here:

http://www.sfu.ca/~yds2/iat320/SKETCH_TWO_Project.zip

Thursday, February 26, 2009

Sketch Two - Continued

Last week was a very intense week for us as we continued to work on Sketch Two. We met up and talked about what we wanted to get done, how we were going to do it, and what each person would be responsible for. Yosuke handled the main programming and we helped where we could. We also met up with Greg on Friday to ask for help with the programming.

While Yoshi was doing the programming, my main job was the research and the presentation. I researched our concept and where it came from, what projects were similar to ours, and what has been done out in the world. We also researched theories based on studies of people's real reaction times (drivers, in particular) and applied them to our concept.

A product very similar to what we wanted to achieve is the EyeToy, which I found quite interesting. It has a similar setup: a camera hooked up to the PS2 tracks the user's motion and sends the information to the console, and games made for the EyeToy let users interact with objects on the screen. The picture below shows a girl using the EyeToy (the USB camera) to play a game shown on the TV. This is very similar to our project: we use colour tracking to detect someone's movement, and the player wears a glove (our colour tracker) and uses it to hit certain objects on the screen.
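The core idea behind colour tracking can be sketched as scanning each frame for pixels close to the glove colour and taking the centroid of the matches. In our project this is handled by objects inside Max/MSP rather than hand-written code, so the version below is only a hypothetical illustration; the types, tolerance, and frame layout are assumptions.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

struct Pixel { uint8_t r, g, b; };
struct Point { int x; int y; bool found; };

// Find the centroid of all pixels within `tolerance` of the glove colour.
// `frame` is assumed to be a row-major RGB image of size width x height.
Point trackColour(const std::vector<Pixel>& frame, int width, int height,
                  Pixel target, int tolerance) {
    long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const Pixel& p = frame[y * width + x];
            if (std::abs(p.r - target.r) < tolerance &&
                std::abs(p.g - target.g) < tolerance &&
                std::abs(p.b - target.b) < tolerance) {
                sumX += x;
                sumY += y;
                ++count;
            }
        }
    }
    if (count == 0) return {0, 0, false};  // glove not visible in this frame
    return {static_cast<int>(sumX / count), static_cast<int>(sumY / count), true};
}
```

The tracked point is then compared against the positions of the on-screen objects to decide whether the player has hit one.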


For the presentation, I included everything that was required: concept research, theories, programming logic, pseudocode, a flow chart, schematics, etc. I worked with Yoshi on the slides about the coding, making sure that what I put on them matched what Yoshi was doing and what our project's programming consisted of.

We went to class prepared to show what we had. We didn't achieve everything we wanted to, but we hope to update it for our real presentation next week. This week we presented Jinsil with our finished presentation and showed Greg our work-in-progress programming. We got good feedback and hope to improve our Sketch Two for next week.

Thursday, February 19, 2009

Reading Break Week - Sketch Two

After Sketch One was done, we were introduced to Sketch Two, which is about working with cameras and Max MSP to create an environmental interaction.

During this week, we worked with Max 5 in the labs to get familiar with the program and to learn some of its capabilities and functions. We have also formed teams for Sketch Two.

During the Reading Break, my team and I began to brainstorm what we wanted to do for this project. We had a few ideas and concepts; we didn't want to create something that would just trigger an object, but to push that idea further. A few concepts that we came up with were:

1. Having the user interact with an environmental map. Since we're Super Mario fans, we wanted to create a Super Mario environment with pipes, mushrooms, and monsters that become interactive when we put our hands on an object, triggering an event to occur.

2. A concept similar to the breath analyzer and bubble idea that was talked about in class. This was a very vague concept that we thought about but didn't really elaborate on.

3. Our third concept was the game Whack-A-Mole. Everyone has played whack-a-mole or is familiar with the game; it can be played in arcades, on the computer, etc. It's a game where moles pop up from holes and you whack them with a hammer. We wanted to recreate this as an interactive environment. After brainstorming for a while, we all agreed this was the concept we wanted to produce.

With this concept in mind, we wanted to build it in Max 5, using a camera to show the map of the moles and the player interacting with it. The patch will show the pipes, the background, and the moles coming out. We have thought about using a glove and colour tracking to follow the player's motion, so the user wears the glove to play the game. We have to program the relationship between the glove and the point of interaction: if the glove reaches the mole's coordinates, it triggers a point for the user, and if the glove is not in that area, nothing happens and no points are recorded. A rough sketch of this hit test is shown below.
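As a hypothetical sketch of that scoring rule (the actual implementation will live in the Max 5 patch, and the names and region size below are made up), a hit only counts when the mole is up and the tracked glove position lands inside that mole's hole region:

```cpp
// Hypothetical sketch of the whack-a-mole scoring rule; the hole radius
// and struct names are assumptions for illustration only.
struct Mole { int x; int y; bool up; };

const int HOLE_RADIUS = 40;  // assumed size of the touch region, in pixels

bool scoresPoint(int gloveX, int gloveY, const Mole& mole) {
    if (!mole.up) return false;               // mole hidden: no point
    int dx = gloveX - mole.x;
    int dy = gloveY - mole.y;
    return dx * dx + dy * dy <= HOLE_RADIUS * HOLE_RADIUS;
}
```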

We took this idea and talked to Greg during our lab this Wednesday, he liked the idea but wanted to challenge us and push this project further. He gave us the idea of having more animals (i.e. Elephant, Rabbit) and depending on these types of animals, you must have a certain speed to gain points. So a rabbit, you can touch it at a slow speed because its a small animal, but if its an elephant, you have to go much faster (giving more strength) because its a huge animal. This challenges our initial idea a bit more because now we would have to figure out how to change the speed and getting the glove (color tracking) to track each object with our glove. Our biggest challenge is to program the mole to record the point. After achieving that goal, all we need to do is duplicate and change the speed of things.