
The Reactable

February 9, 2009

As a fan of electronic music, I thought this device was particularly interesting.  Today’s device is the Reactable, a musical instrument: a synthesizer that translates all the functions of a typical synthesizer into an intuitive interface of blocks moved around the surface of a table.  The shape of each block represents its function: a beat generator, for example, or some type of audio manipulation.  The blocks interact with one another to produce modified sounds, and their interactions are represented on the table by a projector mounted underneath, which draws the audio interaction as a visual link between the two blocks.  Rotating a block adjusts its output, increasing or decreasing the frequency or pitch of its sound, for example.  There is a good video of the device online.
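To make that interaction model concrete, here is a minimal sketch in Python (not anything the Reactable actually runs) of the idea described above: blocks are nodes, blocks close to each other form links the projector could visualize, and rotating a block scales a control parameter.  All the names here (Block, Table, link_distance) are hypothetical, purely for illustration.

```python
import math
from dataclasses import dataclass, field


@dataclass
class Block:
    kind: str              # e.g. "oscillator", "filter", "beat-generator"
    x: float                # position on the table surface
    y: float
    rotation: float = 0.0   # radians; turning the block adjusts its parameter

    @property
    def parameter(self) -> float:
        """Map rotation to a 0..1 control value (pitch, cutoff, etc.)."""
        return (self.rotation % (2 * math.pi)) / (2 * math.pi)


@dataclass
class Table:
    blocks: list = field(default_factory=list)
    link_distance: float = 0.3   # how close two blocks must be to connect

    def links(self):
        """Yield pairs of blocks close enough to interact; the projector
        would draw each pair as a visible connection on the tabletop."""
        for i, a in enumerate(self.blocks):
            for b in self.blocks[i + 1:]:
                if math.hypot(a.x - b.x, a.y - b.y) <= self.link_distance:
                    yield a, b


# Usage: place an oscillator near a filter and give the filter a half-turn.
table = Table()
osc = Block("oscillator", 0.4, 0.5)
flt = Block("filter", 0.6, 0.5, rotation=math.pi)  # half-turn -> 0.5 control
table.blocks += [osc, flt]

for a, b in table.links():
    print(f"{a.kind} -> {b.kind}, control = {b.parameter:.2f}")
```

The point of the sketch is just that the "instrument" reduces to a small spatial graph: position decides what connects to what, and rotation decides how much.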

As a guy with no musical training whatsoever, I really dig this project because it lets an untrained hack pick up music-making as easily as picking a block up off a table.  With a similarly intuitive computer connected to the table, one that could feed in samples the user is interested in, this concept for human/computer interaction could become huge, not to mention make this a successful product on its own (and it looks like they are planning to start a business to sell this thing).

This project piques my interest as it relates to distributed collaborative design as well.  My graduate class in design focused quite a bit on distributed collaborative engineering, and the Reactable is an excellent illustration of how clever physical design, merged with an understanding of epistemology, can make complex tasks far easier through innovative technology.  This project takes a very abstract task, music creation, and turns it into a physical activity that a small child could pick up in moments.  This is powerful, and technologies designed to stimulate thought through visualization are only the beginning.  Humans are more visual creatures than anything else, in that vision is how most information reaches our brains.  The next most important sense to leverage for this kind of technology would probably be hearing, and then touch.  It’s hard to envision technologies useful to collaborative engineering that stimulate taste or smell.  Touch might be hard to implement, but maybe that’s a conceptual project I could work on: how can the human tactile sense be used to enable collaborative design technology?  But I think visual is the most promising; that’s where a lot of ‘low hanging fruit’ ought to be for technologies like this.  The study of how people interpret images, the field of perception, the ‘variables’ of sight (color, brightness, shadow, depth, etc.)…  There are a lot of things to ‘play with’ when creating visual interfaces for people.

The area of Human-Computer Interaction is something I need to look into.  I believe Georgia Tech has a group that specializes in it.
