BeatStreet

BeatStreet is a project that we built in 36 hours for PennApps X at the University of Pennsylvania. It turns dancers' movements into music, using an array of pressure sensors, accelerometers, and a Myo, all linked to a computer over Bluetooth.


The first dancer wears six pressure sensors. Each of her shoes has a sensor under the sole and another on the back of the heel; her left shoe has an auxiliary sensor on the toe, and her right knee has a sensor on the kneecap. All of these pressure sensors are wired to an Arduino on her belt, which reads the raw analog values, filters them to detect impacts, and sends serial packets to a Bluetooth transmitter. The transmitter is an Adafruit Bluefruit EZ-Key, which emulates a Bluetooth keyboard and delivers our bytes to the computer as ASCII key presses.
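The Arduino's job is deliberately simple. Below is a minimal sketch of the idea behind the pressure-sensor firmware: poll each analog channel, treat a reading above a threshold as an impact, debounce it, and send one byte per hit out the UART to the EZ-Key. The pin assignments, threshold, debounce window, baud rate, and per-sensor characters are illustrative placeholders, not the values from our actual code.

```cpp
// Minimal sketch of the pressure-sensor firmware. Pin numbers, the threshold,
// the debounce window, the baud rate, and the characters sent are placeholders.
const int NUM_SENSORS = 6;
const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5};
const char SENSOR_KEYS[NUM_SENSORS] = {'a', 'b', 'c', 'd', 'e', 'f'};
const int IMPACT_THRESHOLD = 600;        // raw ADC reading that counts as a hit
const unsigned long DEBOUNCE_MS = 80;    // ignore re-triggers inside this window

unsigned long lastHit[NUM_SENSORS];      // time of each sensor's last impact

void setup() {
  Serial.begin(9600);                    // UART wired to the EZ-Key's RX pin
}

void loop() {
  unsigned long now = millis();
  for (int i = 0; i < NUM_SENSORS; i++) {
    int value = analogRead(SENSOR_PINS[i]);
    // A reading above the threshold is an impact, but only if this sensor
    // hasn't already fired within the debounce window.
    if (value > IMPACT_THRESHOLD && now - lastHit[i] > DEBOUNCE_MS) {
      Serial.write(SENSOR_KEYS[i]);      // the EZ-Key types this byte as a key press
      lastHit[i] = now;
    }
  }
}
```

The debounce window is what keeps a single stomp from registering as a burst of hits while the sensor is still compressed.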

The second dancer wears two 3-axis accelerometers, one taped to the back of each hand. Again, an Arduino interprets the raw accelerometer data and sends commands to the computer via a second, identical Bluetooth module.
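The firmware on this board follows the same pattern. A rough sketch is below, assuming analog-output accelerometers; again the pins, thresholds, and characters are placeholders. Each hand is sampled at a fixed rate, and a sharp jump in the summed axis readings between samples is treated as a scratch gesture.

```cpp
// Sketch of the hand-accelerometer firmware, assuming analog-output 3-axis
// accelerometers. Pins, thresholds, and output characters are placeholders.
const int HAND_PINS[2][3] = { {A0, A1, A2},    // left hand: x, y, z
                              {A3, A4, A5} };  // right hand: x, y, z
const char HAND_KEYS[2] = {'g', 'h'};          // one character per hand
const int MOTION_THRESHOLD = 150;              // summed change in ADC counts
const unsigned long SAMPLE_MS = 10;            // sample each hand at ~100 Hz
const unsigned long COOLDOWN_MS = 120;         // one trigger per flick

int lastReading[2][3];
unsigned long lastSample = 0;
unsigned long lastTrigger[2];

void setup() {
  Serial.begin(9600);                          // second EZ-Key, same interface
  for (int hand = 0; hand < 2; hand++)
    for (int axis = 0; axis < 3; axis++)
      lastReading[hand][axis] = analogRead(HAND_PINS[hand][axis]);
}

void loop() {
  unsigned long now = millis();
  if (now - lastSample < SAMPLE_MS) return;    // fixed-rate sampling
  lastSample = now;

  for (int hand = 0; hand < 2; hand++) {
    // Sum the change on all three axes since the last sample; a large jump
    // means the hand accelerated sharply, i.e. a scratch-style flick.
    long change = 0;
    for (int axis = 0; axis < 3; axis++) {
      int value = analogRead(HAND_PINS[hand][axis]);
      change += abs(value - lastReading[hand][axis]);
      lastReading[hand][axis] = value;
    }
    if (change > MOTION_THRESHOLD && now - lastTrigger[hand] > COOLDOWN_MS) {
      Serial.write(HAND_KEYS[hand]);           // becomes a key press on the laptop
      lastTrigger[hand] = now;
    }
  }
}
```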

The last dancer wears a Myo gesture-detection armband. This compact device uses a set of electrodes on the dancer's upper forearm to sense the movement of her hand and fingers, and contains a 9-axis IMU for detecting the position and movement of her arm as a whole. The Myo streams all of its raw data to the laptop via a third Bluetooth connection, and a C++ program on the laptop interprets this data.
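A stripped-down sketch of that interpreter is below, following the Myo C++ SDK's usual pattern of a DeviceListener subclass receiving pose and IMU callbacks while a Hub pumps events. Which gesture triggers which sample, the acceleration threshold, and the way triggers are handed off to the mapping script are all simplified placeholders here.

```cpp
// Sketch of a Myo event listener (myo.hpp ships with the Myo SDK).
// Gesture-to-sample mapping and trigger hand-off are placeholders.
#include <iostream>
#include <myo/myo.hpp>

class BeatListener : public myo::DeviceListener {
public:
    // Called when the band classifies a hand pose from its EMG electrodes.
    void onPose(myo::Myo* myo, uint64_t timestamp, myo::Pose pose) override {
        if (pose.type() == myo::Pose::fist) {
            trigger('x');                        // e.g. a bass drop
        } else if (pose.type() == myo::Pose::fingersSpread) {
            trigger('y');                        // e.g. a heavy kick
        }
    }

    // Raw IMU data; a sharp spike in acceleration can also trigger a sample.
    void onAccelerometerData(myo::Myo* myo, uint64_t timestamp,
                             const myo::Vector3<float>& accel) override {
        if (accel.magnitude() > 2.5f) trigger('z');  // threshold in g, placeholder
    }

private:
    void trigger(char sample) {
        // Stand-in for handing the event to the mapping script.
        std::cout << sample << std::flush;
    }
};

int main() {
    try {
        myo::Hub hub("com.beatstreet.pennapps");     // app identifier (placeholder)
        if (!hub.waitForMyo(10000)) {                // wait up to 10 s for a band
            std::cerr << "No Myo found" << std::endl;
            return 1;
        }
        BeatListener listener;
        hub.addListener(&listener);
        while (true) {
            hub.run(50);                             // pump events 50 ms at a time
        }
    } catch (const std::exception& e) {
        std::cerr << e.what() << std::endl;
        return 1;
    }
}
```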

Finally, a Python script running on the laptop gathers the data from all three dancers and maps each movement to a sound or beat. Each pressure sensor is mapped to a unique drum or synth hit. The accelerometers are mapped to DJ turntable scratching sounds, and the Myo motions are mapped to heavier drum beats and bass noises sampled from dubstep songs.

One of the most difficult aspects of this project was reducing latency: the delay between when a foot struck the ground and when the corresponding drum beat played from the speakers. We began with about half a second of latency, which sounded awful and completely disoriented the dancers by making it impossible to keep a beat. Over the course of the 36 hours, we cut the latency down by streamlining our Arduino code, maxing out baud rates, and minimizing the buffer sizes and bit rates of our audio samples. By the time we finished the project, we had eliminated almost all of the latency, to the point where there was no noticeable lag.
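To see why the audio buffer matters so much, consider some illustrative numbers (not our measured values): at a 44.1 kHz sample rate, a 4,096-sample output buffer holds roughly 93 ms of audio that has to drain before a newly triggered sound reaches the speakers, whereas a 256-sample buffer holds under 6 ms. By comparison, a single trigger byte at 9,600 baud takes only about a millisecond to transmit, so under these assumptions most of the remaining delay lives in the audio pipeline rather than in the serial link.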