Converting motion and heart beats to midi tones on a scale?


Post by AndyBuru »

Hi,
First I must say that I'm a supernoob and this is my first post here, so I'm sorry if it's in the wrong place, or a silly question.

Anyhow, I'm working on a coding project: building a soundscape based on motion and heart beats. My setup is built on Ubuntu, JACK, DIN, and some home-made code. My question is mainly: has this been done before, and if so, where can I find more information? And secondly, am I misunderstanding how the audio architecture in Linux works, and therefore making some serious design flaws?

Here is a description of my setup, so far.

(1) I have written a Python script that grabs frames from a web camera and uses OpenCV to detect changes between pictures. The amount of change is converted into MIDI CC messages, which are sent on a MIDI port using JACK as the backend. I control the FPS read from the camera and the number of MIDI CC messages sent per second independently; I'm currently running it at 60 FPS and 5 MIDI CC messages per second.
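To make the motion-to-CC step concrete, here is a stdlib-only toy sketch of the frame-differencing idea. The real script uses OpenCV; here a "frame" is just a flat list of grayscale values, and the change threshold of 25 is an arbitrary guess of mine:

```python
# Toy sketch of frame differencing -> MIDI CC, standard library only.
# The real pipeline uses OpenCV frames; the threshold here is a guess.

def changed_fraction(prev, curr, threshold=25):
    """Return the fraction of pixels that changed by more than `threshold`."""
    changed = sum(1 for a, b in zip(prev, curr) if abs(a - b) > threshold)
    return changed / len(curr)

def motion_to_cc(fraction):
    """Map a 0.0-1.0 changed-pixel fraction to a MIDI CC value (0-127)."""
    return max(0, min(127, round(fraction * 127)))
```

In the real pipeline the resulting CC value would then be sent out at the chosen rate (here, 5 messages per second) over a JACK MIDI port.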

(2) I have also written another Python script that connects to a heart beat sensor over BTLE and translates the pulse rate to a MIDI CC value, again sent on another MIDI port with JACK as the backend. I reverse engineered the conversion table between pulse and MIDI CC using Ableton, by sending 0 to 127 to its BPM controller and checking which BPM each MIDI CC value mapped to. I stopped using Ableton because it was slow under Wine, and I'm looking for a replacement.
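The pulse-to-CC conversion can be sketched as a simple clamped linear mapping. The BPM range below is a guess of mine; the real table was measured empirically against Ableton's BPM controller:

```python
# Hypothetical linear pulse -> CC mapping. The BPM range is an assumption;
# the actual table was reverse engineered from Ableton.

BPM_MIN, BPM_MAX = 40, 200

def bpm_to_cc(bpm):
    """Clamp the pulse to the assumed range and scale it onto CC 0-127."""
    bpm = max(BPM_MIN, min(BPM_MAX, bpm))
    return round((bpm - BPM_MIN) / (BPM_MAX - BPM_MIN) * 127)
```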

(3) The MIDI CC output from the motion-detection script is fed into a third script that converts each CC value to a note on a scale, currently a three-octave pentatonic scale. A higher CC value gives a higher note with a higher velocity; a lower CC value gives a longer note. It's fully (but poorly) multithreaded to support parallel notes. The notes are sent out as MIDI messages on yet another MIDI port.
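For illustration, a CC-to-note mapping like the one described could look roughly like this. The scale, root note, and velocity/duration ranges are guesses for the sketch, not the values from the repo:

```python
# Sketch: CC value -> note on a three-octave pentatonic scale.
# Scale, root, and ranges are illustrative guesses, not BioMIDI's values.

PENTATONIC = [0, 3, 5, 7, 10]   # minor pentatonic, semitones from the root
ROOT = 48                        # MIDI note number for C3
OCTAVES = 3

def cc_to_note(cc):
    """Higher CC -> higher note: spread CC 0-127 over 15 scale degrees."""
    degrees = len(PENTATONIC) * OCTAVES
    step = min(cc * degrees // 128, degrees - 1)
    octave, degree = divmod(step, len(PENTATONIC))
    return ROOT + 12 * octave + PENTATONIC[degree]

def cc_to_velocity(cc):
    """Higher CC -> higher velocity; the floor of 32 keeps quiet notes audible."""
    return 32 + cc * 95 // 127

def cc_to_duration(cc):
    """Lower CC -> longer note, in seconds (range is a guess)."""
    return 2.0 - cc * 1.5 / 127
```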

(4) Right now I'm using DIN (https://dinisnoise.org/about/) to pick up the notes and make some sound, simply because I like the user interface and the sound of DIN. But in the long run I would prefer something more lightweight and headless.

You can find the source code here: https://github.com/andyburu/BioMIDI

I use Cadence and Catia to handle JACK and the connections.

Any feedback?
CrocoDuck

Re: Converting motion and heart beats to midi tones on a scale?

Post by CrocoDuck »

Hi there! Very interesting project!

It is hard to say anything about it, really, before seeing it in operation. Just two things off the top of my head:

Python: good stuff, but I see in your repo that at some point you would like to turn everything into LV2. I'm not sure you can keep the Python code for that, but it should work for prototyping, probably well. How is the performance doing? Have you tried Cython?

Synthesizer: not sure which one to recommend. Maybe zynaddsubfx has a headless mode? I would suggest having a look at the Faust programming language. It is a functional programming language that you can use to create complex DSP programs with a very succinct syntax. You could use it to cook up your own super lightweight headless JACK synth. Faust generates C++ code (which can be compiled directly into console or GUI applications, or plugins), so it can be included in LV2 plugins in a straightforward way. You might even have your Python scripts generate Faust code based on the sensor inputs. That would be neat. You could have a look at the codebase of fastbreeder for genetic-programming synth generation, or use any other optimization technique; Python should have plenty of those.
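For example, generating Faust source from Python is really just string templating. A toy sketch, with the oscillator and gain picked arbitrarily:

```python
# Toy sketch: Python emitting a minimal Faust program as a string.
# The oscillator choice (os.osc) and the 0.2 gain are arbitrary.

def faust_sine(freq):
    """Return a Faust program playing one sine oscillator at `freq` Hz."""
    return (
        'import("stdfaust.lib");\n'
        f'process = os.osc({freq:.1f}) * 0.2;\n'
    )
```

A sensor script could write such a string to a `.dsp` file and hand it to the Faust compiler to produce a JACK synth.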

As for similar projects to use as a base, I am not sure whether there is something similar out there, but have a look at shows/concerts/workshops/theses at CCRMA, and maybe also at a CCRMA academic like Romain Michon.

As headless seems to be your goal, I would suggest having a look at bash-based JACK session management. This is an old but perhaps not (too) outdated tutorial.

Good luck, and keep us posted!
AndyBuru

Re: Converting motion and heart beats to midi tones on a scale?

Post by AndyBuru »

CrocoDuck wrote: Python: good stuff, but I see in your repo that at some point you would like to turn everything into LV2. I'm not sure you can keep the Python code for that, but it should work for prototyping, probably well. How is the performance doing? Have you tried Cython?
So far it's just straight Python, because the performance has been okay running on a simple i5 laptop. I think that's because I'm only running one synth, on one scale, plus the background heart beat. Right now I'm focusing on making the existing components stable; then I want to add more complexity, either by adding more cameras, or more synths, or some combination. Also, the webcam I use is very poor, so maybe I'll upgrade to a Kinect that supports IR and depth analysis. Another idea is to use accelerometers, but the Bluetooth standard for those is either not very easy to work with or not widely followed.

The work kind of moves between two modes of operation: (1) adding more input, and (2) doing something artistic with what I have. I think experimenting with more scales and synths is the first step, after making things very stable. That might also help to detect performance issues. My estimated latency right now, from, say, a hand movement to a sound, is 100 ms, so it translates more of a feeling than a direct movement. I think it gets very interesting when it becomes comparable to a human improvising based on visual input. Yet another aspect is that the machine will probably be able to build a more adaptive soundscape.

I'm studying theatre and directing, so that's how this connects to stage performance for me.
CrocoDuck wrote: Synthesizer: not sure which one to recommend. Maybe zynaddsubfx has a headless mode? I would suggest having a look at the Faust programming language. It is a functional programming language that you can use to create complex DSP programs with a very succinct syntax. You could use it to cook up your own super lightweight headless JACK synth. Faust generates C++ code (which can be compiled directly into console or GUI applications, or plugins), so it can be included in LV2 plugins in a straightforward way. You might even have your Python scripts generate Faust code based on the sensor inputs. That would be neat. You could have a look at the codebase of fastbreeder for genetic-programming synth generation, or use any other optimization technique; Python should have plenty of those.
Thanks for the tips!

I've seen Faust pop up here and there, but as you said prototyping in python is very agile and easy.
CrocoDuck wrote: As for similar projects to use as a base, I am not sure whether there is something similar out there, but have a look at shows/concerts/workshops/theses at CCRMA, and maybe also at a CCRMA academic like Romain Michon.

As headless seems to be your goal, I would suggest having a look at bash-based JACK session management. This is an old but perhaps not (too) outdated tutorial.

Good luck, and keep us posted!
Yes, scripting my setup is also definitely on the todo list. Thank you again for the feedback.

There is a video of my first prototype of this, but it's very laggy, due to Ableton under Wine with a budget built-in soundcard. Just be a little bit warned: the project involves rope suspensions and bondage. https://andyburu.se/project-rawmotions/