Physical computing

Atmospheric Sound Lamp

https://vimeo.com/113624641


Atmospheric Sound Lamp is a group project with Gabriel Andrade.

It is a physical instrument that combines two sensations - sound and light - to create an atmospheric experience. It consists of three white paper mache lamps on a rack. When someone touches a lamp, it emits warm light and plays cosmic sounds. More specifically, the lamp case is covered with small cuts, so the light coming out of it casts many small shadows around the room. The volume is determined by the area of the touch, and the pitch depends on the duration of the touch. The paper mache also gives the lamps a crafty texture that encourages people to touch them. People can touch two or three lamps, or touch the lamps in different orders, to play with the sound while enjoying the warmth. For adults, these lamps bring a peaceful, meditative, warm experience; for kids, they spark curiosity about light and sound.
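A minimal sketch of how such a touch-to-sound mapping could work (the actual sound mapping lives in our MAX MSP patch; the names, ranges, and scaling here are made up for illustration):

```cpp
// Illustrative mapping only; the real sound generation lived in MAX MSP.
// touchArea: 0..1023 (e.g. a smoothed contact reading from the foil)
// touchMs:   how long the lamp has been touched, in milliseconds

int volumeFromArea(int touchArea) {
    // larger contact area -> louder, scaled to a MIDI-style 0..127 range
    return (touchArea * 127) / 1023;
}

int pitchFromDuration(long touchMs) {
    // every full second of contact raises the pitch one semitone,
    // capped one octave above the base note
    const int baseNote = 60;  // middle C
    int semitones = (int)(touchMs / 1000);
    if (semitones > 12) semitones = 12;
    return baseNote + semitones;
}
```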

Ideation: I like projects that combine multiple senses to diversify the experience, so this time I used touch as input, and light and sound as output, to create a rich and consistent interaction. In addition, I have always been attracted to lights, especially in the winter time. I always try to bring warmth and comfort to the people around me.

Implementation: We used MAX MSP to generate the atmospheric sounds. We used aluminum foil, 12V lamps, a 12V DC power supply, and an Arduino to build the circuits, and old newspapers to build the paper mache lamp cases. Here are the circuits.

IMG_3826  IMG_3805

User test: We started by encouraging users to light the lamps by singing out loud. However, in the user test most people were not comfortable doing so in public; they were more likely to touch and speak. Therefore, we changed the interaction to touching. We learned to respect the behaviors of users and create interactions that follow them.

Physical computing

PCOMP Final project documentary 2.0

For this week, Gabriel and I

  • made three lamp shapes using paper mache;
  • built a prototype with the circuits for the 12v lamp and metal foil;
  • figured out the serial communication between MAX MSP and Arduino (because we want to add effects to the sound and make sound loops in real time, we chose MAX MSP over the Wave Shield in the end);
  • read the MAX MSP tutorial and learned how to make music in MAX MSP.
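The serial link itself can stay very simple. As a sketch (the exact wire format here is an assumption, not our actual patch): MAX MSP's [serial] object reads raw bytes, so one easy scheme is for the Arduino to send each reading as a newline-terminated ASCII line that the patch can reassemble.

```cpp
#include <string>

// A guessed message format for illustration: "lampId,touchValue\n".
// On the Arduino side this would just be Serial.print() calls; in the
// MAX patch, [serial] collects the bytes until the newline arrives.
std::string encodeReading(int lampId, int touchValue) {
    return std::to_string(lampId) + "," + std::to_string(touchValue) + "\n";
}
```

For example, lamp 2 with a touch reading of 517 would go over the wire as the seven bytes of "2,517\n".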

Here are some pictures and videos.

These are our three paper mache shapes. We'll paint them once they dry out completely.

IMG_3701 IMG_3698 IMG_3694

This is the circuit with the 12v lamp and metal foil.

IMG_3668

These are three videos testing the lamp. We decided to use the 20w yellow lamp and cut more holes in the lamp case.

https://vimeo.com/111483956

https://vimeo.com/111483955

https://vimeo.com/111483957

This is a screenshot of our serial communication code in MAX MSP.

Screen Shot 2014-11-10 at 8.17.13 PM

Physical computing

PCOMP: Final Project Playtest Result

Last week, for the final project, Gabriel, Sweta and I ran a playtest. We showed users the pictures below, asked them "what would you do if you saw this," and let them try the circuit (the touch lamp with two metal foils). Final Mockup01 Final Mockup02

 

The results of the playtest were very helpful. It turns out many users think about the input and output differently from us, so we need to change our original idea.

Playtest result:

First of all, most users don't like singing in public; they preferred to talk first. I thought the singing part would be fun, but it turns out many people don't enjoy it. One or two people said they'd try to sing if someone told them to. But still, singing is a difficult part.

Secondly, they didn't feel like touching the lamp. We think the two kinds of input may have confused users. But if we have only one kind of input and make the lamp very beautiful, or instruct users to touch it and see what happens, maybe the input will become clear to users.

Because of these results, we are going to set the "singing" input aside for a while and finish the "touching" input first. If we still have time, we'll try a different way to convert sound input into visual output, like the singing input did.

----------------------------------------------------------------------

 

Here are some notes we prepared for the user test:

    What We Want Answered

  • Does the name “Piano Lamps” make sense? What does the user picture when they hear the name ‘Piano Lamps’?
  • Do we need a visible microphone?
  • Can the user easily figure out how to play with the installation?
  • Does the user get confused by being able to sing and touch in order to play the notes?
  • Does the user have to sing accurately?
  • Does the light get brighter if the user sings more accurately, or more loudly? How are we defining 'accuracy'?
  • Does the user learn that if they sing a certain note the same lamp turns on?
  • Does the piece work with multiple people?
  • Do the lamps have to light up in a particular order to make sense?

    Before the Test

  • Do you like singing? Do you think you’re a good or a bad singer?

    Instructions

We have a series of lamps here that will light up when you talk or sing. Each lamp lights up on a different note when you sing. If you touch a lamp, it will also light up and make a sound.

Don’t worry about singing, you can also talk if you’re not comfortable.

Now we’re going to sing together. Let’s see what happens if we both sing a song together.

    Ask the User - Post Test

  • Would you change the name from “Piano Lamps”? If so, what would you call it?
  • On a scale of 1 - 7, how easy or difficult was it to play with the lamps?
  • On a scale of 1 - 7, how easy or difficult was it to understand how to play the piece?
  • Did you expect the lamp to sing when you sang a note?
  • What do you think is happening when the light is dim? What do you think makes the light brighter?
  • Do you think you have to sing well to play with the lamps?
  • Would you want to sing with another person?

Physical computing

PCOMP Final Project plan, timeline and BOM

For this week, Gabriel and I did the project plan, timeline, system diagram, testing plan and bill of materials.

Project plan:

"Piano Lamp" is a physical instrument that combines the sound and the light. There are 7 lamps on the wall, the lamp will light up and play different note when user touch them. We also have a application scenery in mind. This is like a challenging game(because we think game is a perfect example for interaction and easy to engage users into it). The lamp will automatically light up in certain order and play a short piece of music first. And users will be asked to repeat the piece of music by touching the lamp in the right order. A second design is to involve a processing sketch in our project to create a interface like "guitar hero".

Timeline:

The final presentation is on Wednesday, 12/3/2014.

11.5-11.12:

  • create sounds in MAX MSP
  • serial communication between Arduino and MAX MSP
  • control sound with Arduino
  • design sketch of the lamp
  • use balloons to make two paper mache lamps
  • choose the better of the two kinds of 12v lamps

11.12-11.19:

  • build the whole system (lamp, Arduino circuit, MAX MSP)
  • adjust and confirm the final design of the lamp
  • get all materials
  • build all the seven lamps
  • user test continuously as soon as we have a prototype

11.19-11.26:

  • continue user testing and modify the design
  • polish the prototype
  • assemble the final project
  • troubleshoot and improve the stability of our system

11.26-12.3:

  • add the game scenario
  • continue user testing and redesign the system

System diagram and brief description:

IMG_3641

Testing plan:

As stated in the timeline, we want to try the first prototype (one or two lamps) in the week of November 12th, and the finished prototype (seven lamps) by November 19th. That will let us see whether any changes are needed to the installation.

Bill of Materials:

12v 20W lamp                                                    $3.00             Radio Shack

10 pre-wired 5mm 12v LED lights                         $8.96             Amazon

newspaper

Arduino board

wires

balloons

12V transformers

paints

black fabric

Physical computing

PCOMP week 8: Labs!

Lab: Using a Transistor to Control High Current Loads with Arduino

I got the TIP120, diode, and DC supply from the shop, and the 12v lamp from RadioShack. I made two silly mistakes in this lab.

Mistake 1: the lamp acted so weird; it lit up sometimes and did nothing at other times. Finally I found that it's because there was no common ground and common power on the two side rails of the breadboard.

Mistake 2: the lamp lit up immediately and was not controlled by the potentiometer. This is because I connected the lamp to the base instead of the collector. The picture on the website has this little mistake.

LabHighCurrentArduinoLamp_bb

 

This is the video with the DC motor:

https://vimeo.com/110742647

This is the video with the 12v lamp:

https://vimeo.com/110742645

Lab: Controlling a DC Motor with an H-Bridge

https://vimeo.com/110742646

Physical computing

PCOMP Final project concept

1. Singing Lamp--a combination of light and music. We'll have 7 lamps representing do, re, mi, fa, sol, la, ti. When you sing a certain note, the related lamp lights up for a moment. When you sing a song, the lamps light up following the song.

This is one way to interact with the lamps. The other way is to touch a lamp: when you touch it, it plays a note. You can even arrange the order of the lamps and touch them in sequence; this way, you can make your own song.

Jellyfish-Lamps-1 osram-lamps

These are two pictures I found on the internet. The project will be presented in a dark environment, and the light from the lamps will create a warm and sweet atmosphere. And the jellyfish lamp is amazing! Maybe when a jellyfish lamp sings a note, the jellyfish could get a little bigger, or pop out a little bit. Actually, I started loving this idea after I saw these two pictures. I hope this project can be fun and beautiful.

2. teach you how to dance

Sometimes when I learn a dance, I watch the dance video and follow the teacher's moves, but it's hard to know whether I'm doing it right or not.

So, in this project, there will be a professional dancer on a projection. She will teach you some moves. At the same time, you can see your body's shadow on the projection, so you can follow the teacher and repeat the moves. Your shadow and the teacher's are overlapped, so you can easily see whether you're doing it right or not.

3. teach you how to play a song using a keyboard.

When you need to hit a piano key, the key lights up in advance, so the user knows the next key. This way, anyone can play a song on a piano keyboard.

 

Physical computing

PCOMP Midterm project documentary

The Jurassic Park Remote is a collaborative PCOMP Midterm project with Cole Orloff. IMG_3446

The idea

We spent a lot of time developing an idea that we were satisfied with.

Here’s our idea. We want to build a “Book Remote”. Users can control the movie ”Jurassic Park” (play, pause, fast forward, rewind) by manipulating a physical “Jurassic Park“ book (opening the book, closing the book, turning the pages). We chose “Jurassic Park” because the movie follows the book closely.

We put a flex sensor on the spine of the book, and a force sensor on the back cover. When the book is closed, the flex sensor is bent; when the book is open, it’s flat. When different weights are put on the force sensor, the force sensor tells which chapter is opened. We used Arduino to collect the data from the sensors, used serial communication to connect Arduino to Processing, and used Processing to play the movie on a big screen.

This idea comes from the fact that sometimes when we read a book, we find a certain chapter very tempting, and we want to watch the movie version of it. So we thought it'd be fun to build a book that can control a movie. The fun part is that we can pick one interesting page in the book, and the movie will play from that page.

The circuit

Photo Oct 04, 8 01 38 PM Photo Oct 04, 8 20 42 PM (1)

The code

In Arduino, we did two things: passed the flex sensor and force sensor values to Processing, and controlled the LED along with changes in the force sensor.

屏幕快照 2014-10-22 下午1.03.27

In processing

We imported a video library, which lets us use the "play()" and "jump()" functions to play the movie from a certain point.

屏幕快照 2014-10-22 下午1.04.28 屏幕快照 2014-10-22 下午1.05.03

We did three things to make the FSR more stable.

1) We used millis() to keep time inside Processing and slow down how often Processing reads the force sensor's value.

2) We used floor() to round the force sensor's value; as long as the value doesn't increase or decrease by more than 20, the change will not affect the movie.

3) We also set a variable called "lastChapter"; the movie changes only when chapter != lastChapter.
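Put together, the three stabilization steps might look like this (a simplified C++ re-sketch with guessed constants, not our actual Processing code):

```cpp
#include <cmath>

// The three FSR stabilization steps in one place. The 500 ms sample
// interval and the bucket size of 20 are illustrative values.
struct ChapterFilter {
    long lastReadMs = 0;   // time of the last accepted reading
    int  lastChapter = -1; // last chapter we switched the movie to

    // Returns the new chapter if the movie should jump, or -1 otherwise.
    int update(int sensorValue, long nowMs) {
        // 1) throttle: only sample the sensor every 500 ms (millis()-style)
        if (nowMs - lastReadMs < 500) return -1;
        lastReadMs = nowMs;

        // 2) quantize: floor the raw value into buckets of 20, so jitter
        //    smaller than 20 never changes the chapter
        int chapter = (int)std::floor(sensorValue / 20.0);

        // 3) change detection: only react when the chapter differs from
        //    the one currently playing
        if (chapter == lastChapter) return -1;
        lastChapter = chapter;
        return chapter;
    }
};
```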

The prototype

laser cutter

IMG_3434 IMG_3433

 IMG_3439 IMG_3442

Physical computing

Physical Computing: Week 4 Labs!

Here's my question: is Call-and-Response always better than Punctuation?

Punctuation or Call-and-Response

 

During the labs, I made some mistakes. They seem very easy to fix, but also very easy to make.

1. When using CoolTerm, make sure to choose the right port (USB); otherwise CoolTerm can't receive the data. There's also a Bluetooth port in the "serial port" list.

2. The notes say that port number 12 means “/dev/tty.usbmodem1421″. I didn't understand why at first. Later I figured out it's just the index in the computer's list of ports, ranging from 0 to 12; every computer has a different port list.

3. Connect the components correctly. I carelessly connected the potentiometer's power leg to ground, and the potentiometer controlled the shape in Processing in a weird way.

I kept some notes on concepts mentioned in the labs.

1. what's serialEvent?

The serial library has a special method called serialEvent(). Every time a new byte arrives at the serial port, serialEvent() is called. That way we don't need to call it ourselves from draw() or setup().

2. What's the difference between Serial.println and Serial.write?

For example, imagine that analogValue = 32:

  • Serial.println(analogValue) results in “32” with a linefeed and carriage return
  • Serial.write(analogValue) results in ” “, the space character, which has the ASCII value 32.

3. How many bytes does Serial.println(analogValue) send when analogValue = 32?

Serial.println(analogValue) actually sends FOUR bytes! It sends a byte to represent the 3, a byte to represent the 2, a byte to move the cursor all the way to the left (carriage return), and a byte to move the cursor down a line (newline). The raw binary values of those four bytes are 51 (ASCII for “3”), 50 (ASCII for “2”), 13 (carriage return), and 10 (newline).
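A small C++ model of this byte sequence (a mental model only; on a real Arduino, println sends the digits, then the carriage return, then the newline):

```cpp
#include <string>
#include <vector>

// What Serial.println(value) puts on the wire: the ASCII digits of the
// number, then '\r' (ASCII 13) and '\n' (ASCII 10).
// Serial.write(32), by contrast, would send the single raw byte 32.
std::vector<unsigned char> printlnBytes(int value) {
    std::vector<unsigned char> out;
    for (char c : std::to_string(value))
        out.push_back((unsigned char)c);  // e.g. '3' is 51, '2' is 50
    out.push_back('\r');                  // carriage return, ASCII 13
    out.push_back('\n');                  // newline, ASCII 10
    return out;
}
```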

4. What does trim() do?

There’s a command that removes whitespace from a string, called trim().

The first line trims the whitespace characters off; the second splits the string into three separate strings at the commas, then converts those strings to integers:

myString = trim(myString);
int sensors[] = int(split(myString, ','));

5. What's myPort.bufferUntil('\n');?

This line tells the sketch not to call serialEvent() unless an ASCII newline byte (value 10) comes in the serial port. It will save any other bytes it gets in the serial buffer, so you can read them all at once when the newline arrives.
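The same buffer-then-parse step can be sketched in plain C++ (an illustrative equivalent of the Processing trim()/split() code above, not the lab's actual code):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Once bufferUntil('\n') hands serialEvent() a complete line such as
// "341,267,1\n", the sketch splits it on commas and converts each piece
// to an integer (stoi stops at the trailing newline on the last token).
std::vector<int> parseSensorLine(const std::string& line) {
    std::vector<int> values;
    std::stringstream ss(line);
    std::string token;
    while (std::getline(ss, token, ','))   // split on commas
        values.push_back(std::stoi(token)); // convert each piece to int
    return values;
}
```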

Lab: Serial-to-processing

CoolTerm screenshot showing incoming bytes as hexadecimal values

cooltime

Here's the code in Arduino, reading the potentiometer value from the circuit.

lab1-arduino

Here's the code in Processing. I imported the processing.serial library, found the right port in Serial.list(), and set up an object named myPort of the Serial class. In the serialEvent() function, I used myPort.read() to get the data, then drew a line corresponding to the incoming value.

lab1-proces

This is my video:

https://vimeo.com/107435685

 

 

Lab: Two-way (Duplex) serial communication using Call-and-Response and Punctuation methods

This is the code in Arduino, reading data from A0, A1, and D2 -- the accelerometer's X and Y values and a switch state.

week 4 lab2-arduino

This is the code in Processing. It saves the serial buffer in myString, uses trim() to delete whitespace, uses split() to get sensors[0], sensors[1], and sensors[2], and draws an ellipse based on those three numbers.

week 4 lab2-process

This is my video:

https://vimeo.com/107435843

 

 

Physical computing

Physical computing week3: Labs!

I had a question about the labs. I know the analog input ranges from 0-1023 and the analog output ranges from 0-255, but I didn't know why. I supposed it had something to do with bits and bytes. -----Oh, I know the reason now. The analog input uses a 10-bit converter, and the 10th power of 2 is 1024; the analog output uses 8-bit PWM, and the 8th power of 2 is 256.
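Arduino's map() function does exactly this rescaling from one range to the other; its integer formula can be reproduced (and checked) in a few lines:

```cpp
// The integer-arithmetic formula behind Arduino's map(): rescale x from
// [inMin, inMax] onto [outMin, outMax]. Note the integer division
// truncates, which is why midpoints land on 127 rather than 127.5.
long arduinoMap(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}
```

So analogRead's 0-1023 maps onto analogWrite's 0-255, e.g. a potentiometer at 512 becomes a PWM duty of 127.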

During the labs, I made two mistakes.

The first one: my servo motor didn't work. I checked the circuit and the program several times, but the motor just didn't work. So I asked a second-year student for help. She told me that before using the motor in my program, I should test the motor alone with the example code in Arduino to see if it works. It turned out my motor was broken; once I changed it, it worked. Thanks to my classmate!

The second problem: I did not import the pitches.h file. When I compiled the code, Arduino couldn't recognize the notes from pitches.h. So I looked it up on the Arduino website and found out I should add pitches.h myself, by creating a new tab and copying the pitches.h code into it. The link about how to import pitches.h is here.

1. Lab: Servo motor control

Here's my code.

lab1


Here's a video:

[embed]https://vimeo.com/106985065[/embed]

2. Lab: Tone output

Here's my code.

屏幕快照 2014-09-20 下午9.14.19

[embed]https://vimeo.com/106985067[/embed]

I imported the pitches.h to make a little piece of music.

屏幕快照 2014-09-20 下午9.13.54

https://vimeo.com/106985068

 

I couldn't find the force sensors, so I replaced them with flex sensors, touching each sensor to make a different note.

屏幕快照 2014-09-20 下午10.25.50

https://vimeo.com/106985066
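The note-selection logic behind that sketch is just a threshold check per sensor. A hedged re-sketch in C++ (the frequencies come from Arduino's pitches.h; the threshold of 400 is a guess that depends on the voltage divider):

```cpp
// Frequencies from Arduino's pitches.h
const int NOTE_C4 = 262;
const int NOTE_D4 = 294;
const int NOTE_E4 = 330;

// Given three flex-sensor readings (0..1023), return the note frequency
// to pass to tone(), or 0 for silence. The threshold is illustrative.
int noteForSensors(int s0, int s1, int s2) {
    const int kThreshold = 400;
    if (s0 > kThreshold) return NOTE_C4;
    if (s1 > kThreshold) return NOTE_D4;
    if (s2 > kThreshold) return NOTE_E4;
    return 0;  // no sensor bent far enough: stay silent
}
```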


Physical computing

Physical computing Week 3: Observation

This week I picked the subway Ticket Vending Machine in New York City as my observation object. I started paying attention to the Ticket Vending Machine because it was designed by an ITP graduate. The old Ticket Vending Machine interface was very, very confusing; people didn't know where to start. But the new interface is very simple. Everything starts with the "start" button, and users can go through the whole process by making choices on the screen. The picture below shows the old and new interfaces. design_overview_2014_nn-026

This is the interface of the Ticket Vending Machine. There's a good design idea in the new machine: the designer uses color purposefully. The green region is where the green money goes, the yellow region is where the yellow MetroCard goes, and the blue region is for credit cards, most of which are blue. I think this color matching helps users instinctively, even though many users never notice the design idea behind it.

IMG_3235

I think another clever thing about the new design is that instead of showing users everything the machine can do, it hides the confusing details and simply leaves clear clues, so that different users can get what they want without needing to know the other functions. The new interface has a clear branch logic to help users go through the process, as the picture below shows.

Untitled-4

Here are the steps I take to use the machine to refill my SmartLink card (a type of card for riding the PATH train unlimited times in a month).

  • First step: click "start" button;
  • Second step: choose a language;
  • Third step: choose the type of the card;
  • Fourth step: choose the type of service (refill the card / look up the card's information)
  • Fifth step: put your card in the right place and take it back
  • Sixth step: choose the payment method
  • Seventh step: dip your debit card/insert the cash
  • Eighth step: confirm the payment.

OK! I've said a lot of good words about the machine; it's time to complain about certain features. :P

Difficulties and the easy parts:

The first time I used this machine to refill my card, I encountered a somewhat serious problem -- I couldn't dip my debit card to make a successful transaction. Maybe I dipped the card too fast or too slow; I tried four times in total to make the payment. My Chase account showed four payments, but my card was only filled once. So apparently the machine recognized my card each time; if not, my Chase account wouldn't have reported the transactions. This is a problem that should be fixed.

The easy part is touching the "start" button and making a choice at each stage to go through the process.

Time:

I observed people using the machine at the 9th St PATH station on Sept. 22nd. I saw 6 people use the machine. Three of them seemed familiar with it; they bought their tickets within 1:30 minutes. The rest were much slower: they hesitated over the information on the screen. One person looked around, wanting to ask for help, and I went over. She couldn't insert cash successfully, and I didn't know what to do either, so she used her credit card. It seems to me that most difficulties happen when users make the payment; the other parts are just making choices on the screen, which should be easy for most people.

After reading Crawford's and Norman's articles, I became more aware of the importance of usability. Well-designed objects are very easy to interpret and understand. That's why I like the new Ticket Vending Machine, which hides the technical details from users and makes the process easy to use and understand.

Physical computing

P-com: week 2 Reading

Norman, Design of Everyday Things: Norman emphasizes in the first chapter that well-designed objects are very easy to interpret and understand, and that the principle behind them is to provide a good conceptual model and make things visible. He gives a lot of everyday examples to support his view, and I agree with him. To me, humans are sensitive and adaptive, while machines follow a series of rigid rules, which makes the interaction between humans and machines difficult. (Look at how we use programming software to debug a program: when the program goes wrong, the interpretation the software gives us is so different from what we think.) So an interaction designer plays a very important role in the whole process. It's a little ironic that it takes so much effort to design a product that seems so easy and simple to use.

 

Norman, Emotional Design, “Attractive Things Work Better”:

Pleasing things work better, are easier to learn, and produce a more harmonious result. --Norman

This is the first time I've heard this kind of opinion, and I don't entirely agree. For many objects, yes, preference is subjective: even if they're all very usable, some users prefer one and others prefer another, and the preference depends on the occasion, the context, and above all, my mood. However, there are other designs that almost everyone likes, like Apple products. I don't think Apple fans dislike these products in some contexts or some moods. Why so many people are obsessed with Apple products is still a mystery to me, although I'm a fan too.

 

Igoe, Physical Computing’s Greatest Hits (and misses):

This article gave me so much inspiration. I had the idea that many things have been done before, so they're not original. This article told me that there's a lot I can add to these themes through my variations on them. I was also worried about what to do for the P-com final project, and these themes opened my eyes to fields I hadn't considered before.

I think the Floor Pads are a smart idea, because they're simple to make and give users a lot of fun, and the floor cubes correspond to salsa dance steps. However, I think something could be added to this project. Dancing is not all about the steps (except tap dance, haha); it also involves movements of other parts of the body. I think we could use a Kinect to capture the user's body movement and teach users how to dance in a more comprehensive way!

salsa-floor    drum-glove

I also like the Gloves project, because it's instinctive to tap on something with your fingers to make a rhythm. A designer should also build a very good rule system for users, such as "moving your finger this way makes this kind of music." I saw a friend's project that is similar to the Gloves project; I think its rule system was pretty strange for many people, and it almost ruined the user's instinct. I'd also like to find a way that doesn't require users to wear sensors on their hands. Maybe we could use a Leap Motion device to achieve this.

Physical computing

What is Interaction?

What is Interaction? Chris Crawford said interaction means conversation, requiring three steps: listen, think, and speak. It's a brilliant way to describe interaction. It reminds me of another example that is pretty similar: two people playing chess. One person sees his opponent's last move, thinks about what to do with it, and then makes his own move, and the other person continues and does the same three things. But I realized there's an obvious difference between playing chess and interaction. The final goal of playing chess is to defeat your opponent; everything you think and every move you make while playing chess serves that one single goal.

But how about interaction? What's the goal of interaction? I started to think about the definition of interaction from this question. Since nowadays everybody talks about interaction, I was wondering why interaction is so important, and why people want to interact with other people or with computers. I'm not sure if this is a silly question. I couldn't figure out the answer, so I went back to the beginning. Interaction shares much in common with conversation (listen, think, and speak), and I believe the goal of conversation is to exchange dynamic ideas or feelings, so I think the goal of interaction is the same. Exchange requires two people to understand each other and express their ideas and feelings clearly, which is pretty much the same thing as listening and speaking. And when I say the ideas and feelings are dynamic, I mean that different input causes different thinking, and that thinking causes different speaking. The ideas and feelings change throughout the whole process of interaction.

By the way, I had so much fun reading Chris's article. He gave many interesting examples that make the definition and the features of interaction much easier to understand. I also learned some important things, such as the difference between interaction design and interface design: the interactivity designer considers both form and function and is stronger in the arts/humanities. I also learned that interactivity is new and revolutionary, yet tried and true; it can make us more engaged in activities than any other medium; it's the essence of the computer revolution; and it's unknown territory.

And Bret Victor’s rant argues that future interaction should be more powerful than Pictures Under Glass. Future interaction should use our hands, because our hands can intuitively feel and manipulate things in rich ways. So Victor proposes a more intuitive interface.

 

How would you define physical interaction?

I think physical interaction embraces the richness of human interaction with the physical world. Actually, I think anything that jumps out of the graphical interface (the 2D world) can become a kind of physical interaction. Not just our hands: we can use our mouths, our ears, our whole bodies to explore all the possibilities of physical interaction.

There's one thing I want to talk about. People may think that graphical interaction (like Pictures Under Glass) is not as good as physical interaction. I don't agree. At first, I liked physical interaction more than graphical interaction. But there's no clear evidence in the human-computer interaction research field that physical interaction is better than graphical interaction. The situation is subtler: studies only show that a physical interface might be better for certain situations, or more specifically, for certain learners in certain phases of the learning process. And from my point of view, a graphical interface can offer richer visual information. So I think it's necessary to combine physical and graphical interfaces to offer users the flexibility to select the most appropriate interface for a given situation.

 

What makes for good physical interaction?

Intuitive. Humans have already learned many ways to feel and manipulate objects, so we can just use this advantage. In addition, it's hard for users to learn a brand-new interaction method; it takes time and patience, which decreases the fun of using the tool.

 

Are there works from others that are good examples of digital technology that is not interactive?

I think a good example is the telegram. It was a great invention for its time, but as Chris said about spreadsheet programs on big computers, it takes too much time and carries too little information. One person went to the telegraph office, sent a few words to another person, and waited and waited until he got a very short message a couple of days later. I wouldn't say there's no interaction in it at all, but the interactive method in this case is too simple and slow to be called interaction.