[liveblog][PAIR] Rebecca Fiebrink on how machines can create new things

At the PAIR symposium, Rebecca Fiebrink of Goldsmiths, University of London asks how machines can create new things.

NOTE: Live-blogging. Getting things wrong. Missing points. Omitting key information. Introducing artificial choppiness. Over-emphasizing small matters. Paraphrasing badly. Not running a spellpchecker. Mangling other people’s ideas and words. You are warned, people.

She works with sensors. ML can allow us to build new interactions from examples of human action and computer response. E.g., recognize my closed fist and use it to play some notes. Add more gestures. This is a conventional supervised training framework. But suppose you want to build a new gesture recognizer?
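Here's a rough sketch of that supervised framing, using a hypothetical nearest-neighbor classifier over made-up hand-pose features; it's illustrative only, not Wekinator's actual implementation (Wekinator is a standalone Java application):

    # Sketch of the supervised framing: pose features -> gesture label -> notes.
    # Feature values, labels, and the note mapping are made up for illustration.
    from sklearn.neighbors import KNeighborsClassifier

    # Made-up feature vectors describing hand poses, with their labels.
    examples = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1],   # closed fist
                [0.1, 0.9, 0.7], [0.2, 0.8, 0.9]]   # open hand
    labels = ["fist", "fist", "open", "open"]

    model = KNeighborsClassifier(n_neighbors=1)
    model.fit(examples, labels)

    # At performance time, classify an incoming pose and trigger its notes.
    notes = {"fist": [60, 64, 67], "open": [62, 65, 69]}   # MIDI note numbers
    gesture = model.predict([[0.85, 0.15, 0.05]])[0]
    print(gesture, notes[gesture])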

The first problem is the data set: there isn't an obvious one to use. Also, would a 99% recognition rate be great or not so great? It depends on the context. If it goes wrong, you modify the training examples.
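The fix-by-example loop she describes might look roughly like this (hypothetical feature vectors again): when the mapping misreads a gesture, you demonstrate it again with the right label and retrain, rather than fiddling with the algorithm.

    # Sketch of the fix-by-example loop: edit the examples, then retrain.
    from sklearn.neighbors import KNeighborsClassifier

    def retrain(examples, labels):
        model = KNeighborsClassifier(n_neighbors=1)
        model.fit(examples, labels)
        return model

    examples = [[0.9, 0.1], [0.8, 0.2]]
    labels = ["fist", "fist"]
    model = retrain(examples, labels)

    # A new gesture gets misread, so demonstrate it with the correct label
    # and retrain immediately.
    examples.append([0.2, 0.8])
    labels.append("open")
    model = retrain(examples, labels)
    print(model.predict([[0.25, 0.75]])[0])   # now recognized as "open"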

She gives a live demo — the Wekinator — using a very low-res camera (10×10 pixels maybe) image of her face to control a drum machine. It learns to play stuff based on whether she is leaning to the left or right, and immediately learns to change if she holds up her hand. She then complicates it, starting from scratch again, training it to play based on her hand position. Very impressive.
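Feeding a camera into Wekinator might look roughly like this: grab a frame, downsample it to 10×10 grayscale, and stream the 100 pixel values over OSC. Port 6448 and the /wek/inputs address are, as far as I know, Wekinator's defaults (check your input settings); this is a sketch, not her demo code, and assumes the opencv-python and python-osc packages.

    # Sketch: stream a 10x10 grayscale webcam image to Wekinator as 100 OSC floats.
    # Port 6448 and /wek/inputs are assumed Wekinator defaults.
    import cv2
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 6448)
    cap = cv2.VideoCapture(0)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        tiny = cv2.resize(gray, (10, 10))              # very low-res, as in the demo
        features = (tiny.flatten() / 255.0).tolist()   # 100 floats in [0, 1]
        client.send_message("/wek/inputs", features)   # Wekinator maps these to sound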

Ten years ago Rebecca began with the thought that ML can help unlock the interactive potential of sensors. She plays an early piece by Anne Hege that uses PlayStation golf controllers to make music.

Others make music with instruments that don’t look normal. E.g., Laetitia Sonami uses springs as instruments.

She gives other examples, e.g., a facial-expression-to-meme system.

Beyond building new things, she asks, what are the consequences?

First, faster creation means more prototyping and wider exploration, she says.

Second, ML opens up new creative roles for humans. For example, Sonami says, playing an instrument now can be a bit wild, like riding a bull.

Third, ML lets more people be creators and use their own data.

Rebecca teaches a free MOOC on Kadenze: Machine learning for artists and musicians.
