EmoCapsule is off to SIGGRAPH

Tuesday, July 8th, 2008


It has been a long while since we have posted anything on our blog, so here is an exciting update with great news: our group has been ACCEPTED to take part in SIGGRAPH 2008 in LA. We will be presenting the week of August 11 - 15 at the Los Angeles Convention Center. This is big news! For those unaware, SIGGRAPH (short for Special Interest Group on GRAPHics and Interactive Techniques) is an international conference and exhibition on computer graphics and interactive techniques. In 2005 about 25,000 people attended the conference, so being accepted is something to be pumped about.

EmoCapsule is a study of interactivity. Participants influence the installation by inputting their emotions on the website, which provides users with emotional statistics and trends based on the frequency, location, and weather information collected. The installation is dynamically updated to reflect the dominant emotional mood.

Visitors interact with the current emotional state of the EmoCapsule installation - depicted through sound, text, and colour. Participants move and catch emotion words using their own silhouette and other objects. Participants make loud noises, triggering the installation to emit reactive sounds and display new words. These form a sort of conversation between the user and the installation space.

So if you will, please check out our site and/or the SIGGRAPH website. With love, JSHAW and the EmoCapsule Team

Processing it up

Friday, September 7th, 2007

Hey guys, for the past day or two I’ve been hacking away at Processing… the more I use it, the more I love it. Amazon was very quick: I got the Processing textbook the very next day after I ordered it!

I have been looking into sound stuff and have found some awesome libraries that we can definitely leverage for our project. Check out this URL:
Taken from the site: “The Sonia Library provides advanced audio capabilities such as multiple sample playback, realtime sound synthesis, realtime FFT (frequency) analysis of the microphone input, and writing .wav files from samples.”
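
Sonia’s realtime FFT hands you an array of frequency bins; which frequency each bin represents is just arithmetic. Since Processing sketches are Java, here is that arithmetic as a plain-Java sketch (the numbers — 44.1 kHz input and a 512-point FFT — are assumed for illustration, not taken from the library):

```java
public class FftBins {
    // Centre frequency (Hz) of FFT bin k, for a given sample rate and FFT size.
    static double binFrequency(int k, double sampleRate, int fftSize) {
        return k * sampleRate / fftSize;
    }

    public static void main(String[] args) {
        // Assumed numbers: 44.1 kHz input, 512-point FFT.
        System.out.println(binFrequency(1, 44100, 512));  // bin width: ~86 Hz
        System.out.println(binFrequency(3, 44100, 512));  // a low "bass" bin, ~258 Hz
    }
}
```

So with these assumed settings, the bottom handful of bins is where bass energy would show up.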

I’ve already got some visual output based on line-in and microphone input.

I’ve also been studying the book and playing around with some tutorials… check out this screenshot I made. Basically it is animated and rotates around in 3D… pretty bleh, but a start, hehe.

Safari Tech Books Online!!

Sunday, August 5th, 2007

Safari: if you haven’t already heard about or used it, it’s amazing. Many, many technical books are available to read online. If you can stand reading off your computer screen, it is well worth it.

I’m currently reading a book called Physical Computing: Sensing and Controlling the Physical World with Computers by Dan O’Sullivan and Tom Igoe. I think it could be verrrrry helpful for getting into the microcontrollers and sensors and stuff.

I know I already posted about Arduino, but I found a booklet online that describes how to get started using it. I know I’ll end up going through it and thought some of you might also be interested. I also found some ultrasonic range finders that work basically like bats (haha). They can detect the distance of objects anywhere from a couple of cm to several metres away. They are a bit more expensive than other sensors (in the $30-each range) but could come in handy in a low-light environment. I’m in the process of tracking down an Arduino microcontroller and some sensors that I can start playing with. I will post what I find!
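
For reference, the arithmetic behind those range finders is simple: the sensor reports the round-trip time of an ultrasonic ping, and distance follows from the speed of sound. A quick Java sketch of the conversion (the 343 m/s figure assumes room temperature, and real sensors each have their own scaling, so treat this as a ballpark):

```java
public class Ultrasonic {
    // Speed of sound is roughly 343 m/s at 20 °C, i.e. 0.0343 cm per microsecond.
    static final double SOUND_CM_PER_US = 0.0343;

    // The sensor measures the round-trip time of the echo;
    // halve it to get the one-way distance to the target.
    static double echoToCm(double echoMicros) {
        return echoMicros * SOUND_CM_PER_US / 2.0;
    }

    public static void main(String[] args) {
        System.out.println(echoToCm(1000));  // a 1 ms echo means roughly 17 cm
    }
}
```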

Arduino and Myron

Friday, August 3rd, 2007

So after a WinXP reinstall, complete with a new version of Processing, WinVDig, and Myron, I’ve now got some of the Myron video tracking samples working with my webcam. Things worth noting: It is sooooo light-dependent. Granted, my camera isn’t very good, but I still think it might be tricky in a dark installation. I’m thinking about solutions.

In the meantime, I’ve been looking into Arduino, a sister project of Processing. I think it’s worth taking a look at the site. Essentially, these are relatively inexpensive microcontrollers that accept a wide variety of sensor input (temperature, light, movement) over a USB interface. The programming language is all open source and apparently works well with Processing. It sounds pretty good.
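
To give a sense of how simple the sensor side is: on the classic Arduino boards, analogRead() returns a 10-bit value (0–1023) against the default 5 V reference, so turning a raw reading into a voltage is one line. Sketched in plain Java:

```java
public class Adc {
    // Classic Arduino boards: 10-bit ADC against the default 0-5 V reference,
    // so analogRead() yields 0..1023.
    static double adcToVolts(int raw) {
        return raw * 5.0 / 1023.0;
    }

    public static void main(String[] args) {
        System.out.println(adcToVolts(512));  // roughly mid-scale, about 2.5 V
    }
}
```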

Some Sound Stuff for Processing

Thursday, June 28th, 2007

Hey all, I went out to Best Buy to buy myself a camera for my 3+ year old PowerBook and was surprised to find that none of those cameras support it. Other than the iSight, which is discontinued — and even if I had the opportunity to purchase one, it would be 200 bucks. So I’ve taken the liberty to delve a bit into sound in Processing (sorry Alicia, I know I am intruding into your zone, but I thought I would get acquainted with it :) ). So far I’ve downloaded some open source libraries that enable Processing to stream and play a variety of sound formats such as .mp3, .wav, .aif, etc. Going through some tutorials, I created a very simple spectrum meter thingy. What I would like to do is figure out if there is a way to isolate a specific frequency range for, let’s say… heavy tones… or bass tones… If we can isolate that, then we are rollin’~ Ok, time for me to dream. Here is the link to the open src sound stuffs:
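
Isolating bass from a spectrum meter amounts to averaging the FFT bins whose frequencies fall in the band you care about. Here is one way that bookkeeping could look in plain Java — the helper and the toy numbers are mine, not from any particular library:

```java
public class SpectrumBand {
    // Average magnitude of the spectrum bins whose centre frequency falls
    // between loHz and hiHz. `mags` is assumed to cover 0 Hz..Nyquist,
    // as a real FFT of size 2 * mags.length would produce.
    static double bandAverage(double[] mags, double sampleRate, double loHz, double hiHz) {
        int fftSize = mags.length * 2;
        double sum = 0;
        int count = 0;
        for (int k = 0; k < mags.length; k++) {
            double f = k * sampleRate / fftSize;  // centre frequency of bin k
            if (f >= loHz && f <= hiHz) {
                sum += mags[k];
                count++;
            }
        }
        return count == 0 ? 0 : sum / count;
    }

    public static void main(String[] args) {
        double[] mags = {1, 2, 3, 4, 5, 6, 7, 8};  // toy 8-bin spectrum
        // With a 16 kHz sample rate and a 16-point FFT, each bin is 1 kHz wide,
        // so "bass" is just the bottom bin or two.
        System.out.println(bandAverage(mags, 16000, 0, 2000));
    }
}
```

The same loop, fed each frame with a real FFT’s output, would give a running bass-energy number to drive visuals from.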

Processing + MySQL = Success

Saturday, June 9th, 2007

Sup guys,

An update on my Processing stuff.
Some good news: I’ve just got Processing to connect to a MySQL database.
We can now create / insert / update / read data from within Processing.

I have a MySQL database installed on my PowerBook; I’ve tested it and it works beautifully.
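
For anyone reproducing this: since Processing sketches are Java, the connection goes through JDBC — the MySQL Connector/J jar just needs to be on the sketch’s classpath. A minimal sketch of the calls involved (the `emocapsule` database and `emotions` table names are hypothetical, and the live connection lines are left commented out):

```java
import java.sql.*;

public class EmotionDb {
    // Build the JDBC connection URL for a given host/database.
    static String jdbcUrl(String host, int port, String db) {
        return "jdbc:mysql://" + host + ":" + port + "/" + db;
    }

    // Insert one record; the same calls work inside a Processing sketch.
    static void insertEmotion(Connection conn, String emotion) throws SQLException {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO emotions (word) VALUES (?)")) {
            ps.setString(1, emotion);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) {
        System.out.println(jdbcUrl("localhost", 3306, "emocapsule"));
        // A live run against a real server would then do:
        // try (Connection c = DriverManager.getConnection(
        //          jdbcUrl("localhost", 3306, "emocapsule"), "user", "pass")) {
        //     insertEmotion(c, "joy");
        // }
    }
}
```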

Motion Text

Monday, May 28th, 2007

Sup all,

I’ve been playing around with Processing and getting to understand its programming methodologies, as they differ from my Flash programming background :). Nonetheless, I’m getting pretty comfortable with the coding and am already implementing classes.

I’ve been playing around with text in motion (basically just simple text that moves linearly across the screen), but what I plan to do is implement a class that is dynamic enough to set the size and get width / height values, so it will be all self-contained. Hopefully this class will be useful for populating the environment with textual data.
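
One way such a self-contained class could keep its bookkeeping — this is a hypothetical sketch of the wrap-around logic in plain Java, not the actual class from the post:

```java
public class Ticker {
    float x;
    final float speed, textWidth, screenWidth;

    // Hypothetical scrolling-text helper: tracks one line of text
    // moving linearly across the screen.
    Ticker(float screenWidth, float textWidth, float speed) {
        this.screenWidth = screenWidth;
        this.textWidth = textWidth;
        this.speed = speed;
        this.x = screenWidth;  // start just off the right edge
    }

    // Advance one frame; once the text is fully off-screen to the left,
    // wrap back around to the right edge.
    float step() {
        x -= speed;
        if (x < -textWidth) x = screenWidth;
        return x;
    }

    public static void main(String[] args) {
        Ticker t = new Ticker(400, 100, 5);
        System.out.println(t.step());  // 395.0: one frame in from the right edge
    }
}
```

In a sketch, draw() would call step() each frame and render the text at the returned x.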

I’ve also started looking at sound as an external input for our interactive environment. There are some pretty cool open source sound libraries available, so I will begin to investigate them.

Processing: Accelerometers: Continuous Graphing

Friday, May 18th, 2007
// Continuously Graphing Acceleration in the X, Y, and Z Axes
// Hannah Johnston
// May 2007

import procontroll.*;

ControllIO controll;

int[] xvals;
int[] yvals;
int[] bvals;
ControllSlider sliderX, sliderY, sliderZ;
float totalX = 0, totalY = 0, totalZ = 0;

void setup(){
  size(400, 400);

  xvals = new int[width];
  yvals = new int[width];
  bvals = new int[width];

  controll = ControllIO.getInstance(this);

  // Change the device name when using a different device
  ControllDevice device = controll.getDevice("Seng");

  println(device.getName() + " has:");
  println("  " + device.getNumberOfSliders() + " sliders");
  println("  " + device.getNumberOfButtons() + " buttons");
  println("  " + device.getNumberOfSticks() + " sticks");

  sliderX = device.getSlider(2);
  sliderY = device.getSlider(1);
  sliderZ = device.getSlider(0);
}

void draw(){
  // Accumulate the slider (accelerometer) readings
  totalX += sliderX.getValue();
  totalY += sliderY.getValue();
  totalZ += sliderZ.getValue();

  xvals[0] = (int)(totalX) * 10;
  yvals[0] = (int)(totalY) * 10;
  bvals[0] = (int)(totalZ) * 10;

  int x = xvals[0];
  int y = yvals[0];
  int z = bvals[0];

  ellipse(x, y, 50, 50);
}

Processing: Accelerometers: Colour Picker

Friday, May 18th, 2007
// Accelerometer Remote Colour Selection and Drawing
// Hannah Johnston
// May 2007

import procontroll.*;

ControllIO controll;

color userColour;
float paintX = 200;
float paintY = 200;

ControllSlider sliderX, sliderY, sliderZ;
ControllButton button0, button1;
float totalX = 0, totalY = 0, totalZ = 0;

void setup(){
  size(400, 400);
  colorMode(HSB, 4);

  controll = ControllIO.getInstance(this);

  // Change the device name when using a different device
  ControllDevice device = controll.getDevice("Seng");

  // Using slider x, y, z to get the accelerometer value
  // The slider values don't correspond on the same scale as those used in C
  // There is no calibration here yet either
  sliderX = device.getSlider(2);
  sliderY = device.getSlider(1);
  sliderZ = device.getSlider(0);

  button0 = device.getButton(0);
  button1 = device.getButton(1);
}

void draw(){
  float myX, myY;

  // Multiplied by a fairly arbitrary value to get it into the 0-4 range
  totalX = sliderX.getValue() * 20;
  totalY = sliderY.getValue() * 15;
  totalZ = sliderZ.getValue() * 10;
  println("  " + totalX + "  " + totalY + "  " + totalZ);

  float x = totalX;
  float y = totalY;
  float z = totalZ;

  if(button0.pressed()){
    // Left Button: Draw with the current tool selection
    // Hold it down when drawing
    myX = sliderX.getValue() * 10;
    myY = sliderY.getValue() * 10;

    paintX += myX;
    paintY += myY;

    fill(userColour);
    ellipse(paintX, paintY, 10, 10);
    println("Painting " + paintX + "  " + paintY);
  }
  else if(button1.pressed()){
    // Right Button: Clear screen (it gets messy)
    rect(0, 0, 400, 400);
  }
  else {
    // No Button Press: Display the colour-picker rectangle
    userColour = color(abs(x), abs(y), abs(z));
    fill(userColour);
    rect(340, 340, 50, 50);
  }
}

First Steps with Processing - Accelerometer Controller and Dance Mat

Tuesday, May 15th, 2007

I was playing around with a single accelerometer. For another project, it had been configured with two buttons and one 3-axis accelerometer. Using the ProControll library with Processing, I made a simple program that lets you choose a colour (hue, saturation, and brightness) based on the accelerometer’s orientation. By holding down a button, the current colour is selected and you can paint with it (sort of like painting in air, except with three axes of rotation, so it’s a bit tricky). And finally, the other button clears the screen. Starting from nothing, the whole project took a couple of hours.

After installing the ProControll library, I got to thinking that we could really use ANY kind of input device. I tried hooking up my dance mat (using a PS2 -> USB converter) and modified one of the example ProControll programs to use the dance mat arrows to rotate a cube (using the OpenGL library in Processing). It took literally minutes.

I’ll post the source for both tomorrow.

If you have any kind of interesting hardware, I highly recommend hooking it up and playing with it. In a very short amount of time you can get stuff working in Processing. If nothing else, it will be good for mock-ups and prototypes. There are sound and video libraries too so check it out.

Fiiinally, one last thing. I just got a book out of the Carleton Library “Emotion and Adaptation” by Lazarus. It seems to offer a somewhat scientific background about emotions relating to psychology. It seems like it might be useful theory to help us make and support our design decisions. Time permitting and depending on the assigned readings, I will try to make some notes on the useful bits of the book.

That was long.