
Archive for the ‘Accelerometers’ Category

EmoCapsule.com is off to SIGGRAPH

Tuesday, July 8th, 2008

SIGGRAPH 2008

It has been a long while since we have posted anything on our blog, so here is an exciting update: our group has been ACCEPTED to take part in SIGGRAPH 2008 in Los Angeles. We will be presenting the week of August 11-15 at the Los Angeles Convention Center. This is big news. For those unfamiliar with it, SIGGRAPH (short for Special Interest Group on GRAPHics and Interactive Techniques) is an international conference and exhibition on computer graphics and interactive techniques. Around 25,000 people attended the 2005 conference, so being accepted is something to be pumped about.

EmoCapsule is a study of interactivity. Participants influence the installation by inputting their emotions at emocapsule.com. The website provides users with emotional statistics and trends based on frequency, location, and weather information collected. The installation is dynamically updated to reflect the dominant emotional mood.

Visitors interact with the current emotional state of the EmoCapsule installation, depicted through sound, text, and colour. Participants move and catch emotion words using their own silhouettes and other objects, and they can make loud noises that trigger the installation to emit reactive sounds and display new words. Together these form a sort of conversation between the user and the installation space.
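
To give a rough sense of how the noise-triggered words could be prototyped, here is a minimal Processing sketch. It assumes the Minim audio library and a made-up word list, and the 0.3 level threshold is arbitrary; it is only an illustration of the idea, not the installation code.

// Hypothetical sketch: show a new emotion word when the room gets loud
// (illustration only, not the actual installation code)
import ddf.minim.*;

Minim minim;
AudioInput in;
String[] words = { "joy", "anger", "calm", "fear" };  // placeholder word list
String current = "calm";

void setup() {
  size(400, 400);
  textAlign(CENTER, CENTER);
  textSize(48);
  minim = new Minim(this);
  in = minim.getLineIn(Minim.MONO, 512);
}

void draw() {
  background(0);
  // in.mix.level() returns the current microphone level between 0 and 1
  if (in.mix.level() > 0.3) {   // threshold is arbitrary
    current = words[int(random(words.length))];
  }
  text(current, width/2, height/2);
}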

So please check out Emocapsule.com and/or the SIGGRAPH website. With love, JSHAW and the EmoCapsule Team

Messy sketches

Monday, August 13th, 2007

I was talking to Eric about the limitations of accelerometers — we’d only be able to use a few suits, which would severely limit the fun of the installation. One possible solution I thought about was to divide the installation into phases. In the first phase, users would be given instructions to put on the accelerometers and go in. They would see various poses displayed on the screen to represent emotions, mimic those poses, and acquire the emotion-rings as seen in Eric’s sketch. Once they’d had their fun and acquired many rings, they would take off the suit and leave - sort of. They would go out the exit door and through some kind of hallway. I had visions of dark hallways with graffiti-style light following users as they went through (using another webcam or Arduino sensors, which could also light up some floor panels or something cool like that). Then bam! The user enters a whole other installation. They still have their emotion rings (somehow; this we’d have to sort out). On the wall, since we probably wouldn’t need to show the emotion poses anymore, we could instead display the whole web-inputted network of emotion compatibilities. It would be cool if people could visit the site, upload a photo of themselves, and choose their emotions; then a circle would be generated for them and interact with other people’s representations (much the same way the live-installation ones do on the floor).

I sketched it up very roughly as I was thinking it through so it might not be that clear, but I’m including it anyway. Ask me any questions.

Installation sketch

Obviously it’s just one of many possible solutions so feel free to tear it apart or suggest other directions!

Re: Eric’s Cool Sketches

Monday, August 13th, 2007

I think the idea of the rings around the user on the ground is awesome, and probably not too tough either from what I’ve been looking at (haha, relatively…). I think we could definitely create a simple installation around this, or work it into something more complex, although I even like the idea of just simple white panels with all of the focus being on the floor. The biggest unresolved issue is how we assign an emotion to each person. Is it based on their movement? Something they touch, or type in, or say? Their body position (which could be difficult, but super cool if it worked)? Alternatively, we could define poses for each emotion and flash or display them in some other way so that people would know to "do a pose" in order to acquire an emotion ring, which would be totally doable with the current accelerometer code. Or it could be based on where they walk initially when they enter, or the colour of their clothes? Or some combination of these, or something else entirely! What are you guys leaning towards? I’d really love to work in the accelerometers, as I think the pose detection is still sort of novel and it works surprisingly well. I’m also keen on RFID and/or touch/light/sound sensors with Arduino in some kind of subtle way (embedded into the floor or wall panels, maybe). It would be cool if we could use some combination of things (it would also help our project come across as less "simple/easy"), but I think combining them is easier said than done.

What does everyone else think?

Story-telling gestures

Wednesday, June 20th, 2007

While it might be difficult to interpret gestures with simple accelerometer straps, I think it might be possible to use them in conjunction with the video processing. If there is high acceleration (the person is flapping their arms or body around), this corresponds to an "intense" action. We don’t know whether it’s positive or negative, but we can use the facial and sound information to help derive the intended gesture. If the users are telling their emotion-filled story one at a time, putting on some straps (or however we choose to do this) should happen only during the first "stage" of the installation and shouldn’t be too cumbersome or interfering.
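
As a rough illustration of the "intense action" reading, here is a minimal Processing sketch that uses the proControll sliders the same way the code posts below do. The device name and the 0.5 cutoff are placeholders, and in practice this reading would be combined with the face and sound information.

// Rough sketch: flag "intense" motion from the accelerometer slider values
// (device name and threshold are placeholders)
import procontroll.*;

ControllIO controll;
ControllSlider sliderX, sliderY, sliderZ;

void setup() {
  size(200, 200);
  controll = ControllIO.getInstance(this);
  ControllDevice device = controll.getDevice("Seng");  // same device name as the other sketches
  sliderX = device.getSlider(2);
  sliderY = device.getSlider(1);
  sliderZ = device.getSlider(0);
}

void draw() {
  float ax = sliderX.getValue();
  float ay = sliderY.getValue();
  float az = sliderZ.getValue();
  // overall "energy" of the movement, ignoring direction
  float magnitude = sqrt(ax*ax + ay*ay + az*az);
  boolean intense = magnitude > 0.5;   // arbitrary cutoff, would need calibration
  background(intense ? color(255, 0, 0) : color(0, 255, 0));
}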

SenToy

Friday, May 25th, 2007

I have uploaded an article, SenToyArticle.pdf. Read it if you can. The main idea is that they use a doll, and participants make it perform emotion-actions. These actions are recognized through accelerometers (placed on the doll) and then translated into character actions in a game. We could simplify this and have users make the doll perform an emotion, which is translated into an emotion word. Guaranteed it wouldn’t work 100%, but I think it would be fun, it wouldn’t involve any kind of crazy suiting-up, and since it’s already been done it shouldn’t be impossible.
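
As a very crude sketch of the simplified version, here is a Processing fragment that counts direction changes in one accelerometer axis over a one-second window and treats lots of flips as shaking. The device name and the shaking-means-"angry" rule are made up purely for illustration.

// Crude "doll gesture" sketch: count direction changes to guess if the doll is being shaken
// (device name and gesture rule are placeholders)
import procontroll.*;

ControllIO controll;
ControllSlider sliderX;
float lastX = 0;
int flips = 0;          // direction changes seen in the current window
int windowStart = 0;
String emotion = "neutral";

void setup() {
  size(300, 100);
  textSize(24);
  controll = ControllIO.getInstance(this);
  ControllDevice device = controll.getDevice("Seng");  // placeholder device name
  sliderX = device.getSlider(2);
}

void draw() {
  background(0);
  float x = sliderX.getValue();
  // a sign change in the x reading counts as one "flip" of the doll
  if ((x > 0 && lastX < 0) || (x < 0 && lastX > 0)) {
    flips++;
  }
  lastX = x;
  // once per second, decide: many flips means the doll was being shaken
  if (millis() - windowStart > 1000) {
    emotion = (flips > 6) ? "angry" : "neutral";
    flips = 0;
    windowStart = millis();
  }
  text(emotion, 20, 50);
}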

Stepmania

Friday, May 25th, 2007

Stepmania is an open source version of DDR. I play it at home with a standard PS2 dance mat (~$25 at Future Shop) and a PS2 -> USB adapter (also ~$25 at video-game-type stores). I was thinking that the dance mat might be an interesting form of gesture input. There are 8+ buttons on the mat, and each arrow could be replaced (covered) with an emotion. I’ve read in a few places that emotions can be classified as combinations of a few main categories of emotions. In this way, the user could jump on multiple emotions simultaneously to produce a resulting emotion, or simply step on one of the emotion categories. It would be dead simple to implement with pure Processing, but I still think it might be fun. It would give kids (and probably adults too) the chance to rapidly influence the overall mood — think of someone standing on the mat repeatedly jumping on the buttons over and over and watching their mood ‘grow’ on the screen. I think it would give the installation more direct feedback/interactivity, and if it turned out to be annoying or to ruin the overall experience, we could always implement a delay (only one reading per minute or something).
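
Here is a rough sketch of what the mat version could look like with proControll. The device name, button indices, category labels, and the one-reading-per-minute delay are all placeholders.

// Rough dance-mat sketch: step on one or more "emotion" buttons to submit a mood
// (device name, button indices, labels, and the delay are placeholders)
import procontroll.*;

ControllIO controll;
ControllButton[] buttons = new ControllButton[4];
String[] categories = { "happy", "sad", "angry", "calm" };  // placeholder categories
int lastReading = 0;

void setup() {
  size(400, 200);
  controll = ControllIO.getInstance(this);
  ControllDevice mat = controll.getDevice("USB Gamepad");  // placeholder device name
  for (int i = 0; i < 4; i++) {
    buttons[i] = mat.getButton(i);
  }
}

void draw() {
  background(0);
  // optional delay so one person can't flood the mood (one reading per 60 s)
  if (millis() - lastReading < 60 * 1000) return;

  String combined = "";
  for (int i = 0; i < 4; i++) {
    if (buttons[i].pressed()) {
      combined += (combined.length() > 0 ? "+" : "") + categories[i];
    }
  }
  if (combined.length() > 0) {
    println("mood reading: " + combined);  // e.g. "happy+calm"
    lastReading = millis();
  }
}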

Today I also tried playing Stepmania with accelerometers. This also works, but I think it might be more difficult.

Processing: Accelerometers: Continuous Graphing

Friday, May 18th, 2007

// Continuously Graphing Acceleration in the X, Y, and Z Axes
// Hannah Johnston
// May 2007

import procontroll.*;
import java.io.*;

ControllIO controll;

int[] xvals;
int[] yvals;
int[] bvals;
ControllSlider sliderX, sliderY, sliderZ;
float totalX = 0, totalY = 0, totalZ = 0;

void setup(){
size(400,400);
xvals = new int[width];
yvals = new int[width];
bvals = new int[width];

controll = ControllIO.getInstance(this);
controll.printDevices();

// Device name as reported by printDevices(); change it for a different device
ControllDevice device = controll.getDevice("Seng");

println(device.getName() + " has:");
println(" " + device.getNumberOfSliders() + " sliders");
println(" " + device.getNumberOfButtons() + " buttons");
println(" " + device.getNumberOfSticks() + " sticks");

device.printSliders();
device.printButtons();
device.printSticks();

sliderX = device.getSlider(2);
sliderY = device.getSlider(1);
sliderZ = device.getSlider(0);

}

void draw()
{
background(0);

// Accumulate the slider readings and scale them into pixel coordinates.
// Only index 0 of the history arrays is written so far.
totalX += sliderX.getValue();
totalY += sliderY.getValue();
totalZ += sliderZ.getValue();
xvals[0] = (int)(totalX)*10;
yvals[0] = (int)(totalY)*10;
bvals[0] = (int)(totalZ)*10;
noStroke();

int x = xvals[0];
int y = yvals[0];
int z = bvals[0];   // z isn't drawn yet

fill(250);
ellipse(x, y, 50, 50);
}
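
A note on the sketch above: the xvals/yvals/bvals arrays are sized to the window width, but only index 0 ever gets written, so nothing actually scrolls yet. If the goal is a true scrolling graph, one possible draw() (an untested sketch, not verified against the hardware) would shift the history each frame and plot it:

// Possible draw() for an actual scrolling graph (untested sketch):
// shift the history left each frame, append the newest reading, and plot it.
// Only the x axis is shown; y and z would follow the same pattern,
// and the *10 scaling would still need tuning.
void draw() {
  background(0);
  totalX += sliderX.getValue();
  for (int i = 1; i < width; i++) {
    xvals[i - 1] = xvals[i];              // shift old readings left
  }
  xvals[width - 1] = (int)(totalX) * 10;  // newest reading on the right
  stroke(250);
  for (int i = 0; i < width; i++) {
    point(i, height/2 - xvals[i]);
  }
}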

Processing: Accelerometers: Colour Picker

Friday, May 18th, 2007

// Accelerometer Remote Colour Selection and Drawing
// Hannah Johnston
// May 2007

import procontroll.*;
import java.io.*;

ControllIO controll;

color userColour;
float paintX = 200;
float paintY = 200;

ControllSlider sliderX, sliderY, sliderZ;
ControllButton button0, button1;
float totalX = 0, totalY = 0, totalZ = 0;

void setup(){
size(400,400);
smooth();
background(0);
colorMode(HSB, 4);
controll = ControllIO.getInstance(this);
controll.printDevices();

// Change the device name when using a different device
ControllDevice device = controll.getDevice("Seng");

// Using slider x, y, z to get the accelerometer value
// The slider values don’t correspond on the same scale as those used in C
// There is no calibration here yet either
sliderX = device.getSlider(2);
sliderY = device.getSlider(1);
sliderZ = device.getSlider(0);

button0 = device.getButton(0);
button1 = device.getButton(1);
}


void draw()
{
float myX, myY;

// Multiplied by a fairly arbitrary value to get it into the 0-4 range
totalX = sliderX.getValue()*20;
totalY = sliderY.getValue()*15;
totalZ = sliderZ.getValue()*10;
println(" " + totalX + " " + totalY + " " + totalZ);

float x = totalX;
float y = totalY;
float z = totalZ;

// Left Button: Draw with the current tool selection
// Hold it down when drawing
if(button0.pressed()==true){
myX = sliderX.getValue()*10;
myY = sliderY.getValue()*10;

paintX += myX;
paintY += myY;

fill(userColour);
ellipse(paintX, paintY, 10, 10);
println("Painting " + paintX + " " + paintY);
}

// Right Button: Clear screen (it gets messy)
else if(button1.pressed()==true){
fill(0);
rect( 0, 0, 400, 400 );
}

// No Button Press: Display the colour-picker rectangle
else{
noStroke();
userColour = color(abs(x), abs(y), abs(z));
fill(userColour);
rect(340, 340, 50, 50);
}
}
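
The comments above mention that there is no calibration yet. One simple approach (a sketch only, assuming the accelerometer is held still for the first second or so) is to average the first readings as a zero offset and subtract it from later values:

// Possible zero-offset calibration (sketch only): average the first N frames
// while the accelerometer is held still, then subtract the offsets from later readings.
// Uses the sliderX/sliderY/sliderZ objects declared in the sketch above.
int calibFrames = 60;   // roughly one second at the default frame rate
int framesSeen = 0;
float offsetX = 0, offsetY = 0, offsetZ = 0;

void calibrate() {
  if (framesSeen < calibFrames) {
    offsetX += sliderX.getValue() / calibFrames;
    offsetY += sliderY.getValue() / calibFrames;
    offsetZ += sliderZ.getValue() / calibFrames;
    framesSeen++;
  }
}

// Call calibrate() at the top of draw(), then use
// (sliderX.getValue() - offsetX) etc. instead of the raw values.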

First Steps with Processing - Accelerometer Controller and Dance Mat

Tuesday, May 15th, 2007

I was playing around with a single accelerometer. For another project, it had been configured with two buttons and one 3-axis accelerometer. Using the ProControll library with Processing, I made a simple program that lets you choose a colour (hue, saturation, and brightness) based on the accelerometer's orientation. By holding down a button, the current colour is selected and you can paint with it (sort of like painting in air, except with 3 axes of rotation, so it's a bit tricky). And finally, the other button clears the screen. Starting from nothing, the whole project took a couple of hours.

After installing the ProControll library, I got to thinking that we could really use ANY kind of input device. I tried hooking up my dance mat (using a PS2 -> USB converter) and modified one of the example ProControll programs to use the dance mat arrows to rotate a cube (using the OpenGL library in Processing). It took literally minutes.

I’ll post the source for both tomorrow.
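
In the meantime, a minimal version of the cube part might look something like this; the device name and button indices are guesses that depend on the mat and adapter.

// Minimal sketch of mat-driven cube rotation
// (device name and button indices are guesses)
import processing.opengl.*;
import procontroll.*;

ControllIO controll;
ControllButton left, right, up, down;   // mat arrows
float rotX = 0, rotY = 0;

void setup() {
  size(400, 400, OPENGL);
  controll = ControllIO.getInstance(this);
  ControllDevice mat = controll.getDevice("USB Gamepad");  // placeholder device name
  left  = mat.getButton(0);
  right = mat.getButton(1);
  up    = mat.getButton(2);
  down  = mat.getButton(3);
}

void draw() {
  background(0);
  if (left.pressed())  rotY -= 0.05;
  if (right.pressed()) rotY += 0.05;
  if (up.pressed())    rotX -= 0.05;
  if (down.pressed())  rotX += 0.05;
  lights();
  translate(width/2, height/2);
  rotateX(rotX);
  rotateY(rotY);
  box(120);
}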

If you have any kind of interesting hardware, I highly recommend hooking it up and playing with it. In a very short amount of time you can get stuff working in Processing. If nothing else, it will be good for mock-ups and prototypes. There are sound and video libraries too so check it out.

Fiiinally, one last thing. I just got a book out of the Carleton Library, "Emotion and Adaptation" by Lazarus. It offers a somewhat scientific, psychology-based background on emotions, and it seems like useful theory to help us make and support our design decisions. Time permitting, and depending on the assigned readings, I will try to make some notes on the useful bits of the book.

That was long.