This is a Python-based MIDI controller that gets its data from an Xbox Kinect.
The video is best with headphones or speakers with some bass.
I am running the program on a Debian Linux distribution, but since it is written in Python it should hopefully work on any operating system that has Python installed.
The program uses the OpenKinect python wrapper. The GitHub page has detailed instructions on how to install the library on your computer.
https://github.com/OpenKinect/libfreenect
The OpenKinect wrapper connects you to your Kinect, but to interpret the depth and picture information I used OpenCV.
If you have python-pip installed on your computer, OpenCV is easy to install; simply follow the instructions at the link below.
pip install opencv-python
https://pypi.org/project/opencv-python/
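To give a rough idea of how these two libraries fit together, here is a minimal sketch (not the project's actual code) of grabbing a depth frame with the OpenKinect wrapper and scaling it for display with OpenCV. The helper name `depth_to_8bit` is mine, not the program's:

```python
import numpy as np

def depth_to_8bit(depth):
    """Scale the Kinect's 11-bit depth values (0-2047) down to 8 bits for display."""
    return np.clip(depth >> 3, 0, 255).astype(np.uint8)

# With a Kinect attached you would grab and show a frame like this:
#   import cv2, freenect
#   depth, _ = freenect.sync_get_depth()  # one depth frame as a numpy array
#   cv2.imshow("depth", depth_to_8bit(depth))
#   cv2.waitKey(0)
```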
Then, to create and handle the MIDI interface, I installed python-rtmidi.
Again, you can install this library with pip as shown at the link below.
pip install python-rtmidi
https://pypi.org/project/python-rtmidi/
To create and send the note information from the program through the interface I used the mido library
pip install mido
https://pypi.org/project/mido/
The program is not finished and is still quite messy, with a few unnecessary implementations and an unrefined structure, but it works. I thought I would share it before it gets refined so that different branches could spur off if desired.
The program will complain if there is something in the field of view of the Kinect when starting up, so make sure to run the program ("python midi-main.py") with an empty space in front of the Kinect.
Now this is where it will get OS-specific. The next stage depends on how your system handles its MIDI. It's mindlessly simple on Debian with the Jack audio server, Qjackctl and the amsynth synthesizer.
First I open amsynth three times (you can just click the same start icon and it opens another instance of amsynth in another window).
I then go to the config and change the MIDI channel on each instrument to 3, 9 and 10 respectively. These are the three channels the Kinect program sends its MIDI information out on. Each amsynth instance can have a different sound loaded into it, so I get to control the different sounds with my movements.
One instrument's pitch is controlled by one hand's movement along the X axis of the Kinect's field of view.
Another instrument's pitch is controlled by movement along the Y axis of the field of view.
The third instrument is controlled by placing both hands into the field of view; its pitch is determined by the distance between the two hands.
The program automatically snaps the position of your hand to a note in a scale. The scale used in this program is the Dorian mode. I will provide a key for using other scales at a later date, when I have completed it. The process is simple: if you wish to implement a different scale yourself, you just enter the number of steps between the notes into an array. The array is called "dorian" in this program, but it should really have been named "scale".
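Here is a sketch of how such a step array can drive the quantization. The helper names and the exact mapping are my guesses, not the program's, but the Dorian step pattern itself is standard:

```python
# Semitone steps between successive notes of the Dorian mode
# (the array the program calls "dorian"; swap in another pattern,
# e.g. major [2, 2, 1, 2, 2, 2, 1], for a different scale).
dorian = [2, 1, 2, 2, 2, 1, 2]

def build_scale(root, steps, octaves=2):
    """Expand a step pattern into MIDI note numbers starting at `root`."""
    notes = [root]
    for _ in range(octaves):
        for step in steps:
            notes.append(notes[-1] + step)
    return notes

def position_to_note(x, width, notes):
    """Quantize a hand position 0..width-1 to a note in the scale.
    Hypothetical helper; the real program's mapping may differ."""
    index = min(int(x / width * len(notes)), len(notes) - 1)
    return notes[index]

scale = build_scale(62, dorian)          # D Dorian, two octaves
print(position_to_note(0, 640, scale))   # leftmost hand position -> 62 (D)
print(position_to_note(639, 640, scale)) # rightmost -> top of the scale
```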
After setting up the three amsynth instances I run the Python program ("python midi-main.py").
The program opens a single window showing you an outline of what the kinect sees.
Then I open Qjackctl, open the Connect menu, and connect the three separate instances of amsynth (drag and drop, or right-click) to the RtMidi output device the Python program creates when you start it up.
The video at the top shows what happens next.
I will also attach the three amsynth preset files for the instruments I used in this video.
I'm not great at presenting stuff (i.e. I don't care enough about coming off all polished and crisp), so if my description of the installation process is not clear enough to follow, please feel free to message me and ask for clarification.