Ataraxia and Power of People (PoP) are two interactive works presented under the Energy for Life project, run by the non-profit organisation Agoni Grammi Gonimi. It is an environmental awareness program that calls at fifteen remote destinations, spreading ideas, thoughts, and insights of ecological interest to students as well as to the inhabitants of the places it visits.

During this project I have demonstrated the aforementioned interactive systems in various primary and high schools in rural locations across Greece. The workshops mainly explore concepts of new digital instruments and interactive music compositions, such as Ataraxia, as well as sound art installations that use environmental information in sonification systems, like PoP, both of which were developed as research projects.

Students have the opportunity to experiment with the systems, look into the aesthetic and technical concerns of the projects, and gain a concise introduction to this particular field of research, interactive music composition. Photo material and further information about the visits can be found here.




Sound Sculpt: SuperCollider + ChucK workshop at Ionian University.

In this workshop, we will use contact mics and the Leap Motion free-hand gesture controller, together with SuperCollider and ChucK, to "sculpt sound" in real time through live coding.

Sound input and gestural data will be used to trigger sounds and control sound parameters. Furthermore, the sound input serves as source material that is "sculpted" in real time through filters, spectral transformations, and other sample-based techniques. For further information on the workshop and its technical requirements, follow this link.
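As a rough illustration of what "sculpting" means here (a sketch written for this post, not the workshop patch), the SuperCollider snippet below runs a contact-mic signal through a resonant low-pass filter and lets an incoming Leap Motion value, delivered over OSC, move the cutoff. The address /leap/palm/y and the 0–1 value range are assumptions; any Leap-to-OSC bridge could supply them.

    // Live input from a contact mic "sculpted" by a resonant low-pass filter;
    // the cutoff follows hand-height data arriving over OSC.
    s.waitForBoot {
        ~cutoff = Bus.control(s, 1).set(1000);

        SynthDef(\sculpt, { |cutBus|
            var in  = SoundIn.ar(0);               // contact mic on input 0
            var cut = In.kr(cutBus).lag(0.1);      // smooth the incoming control
            var sig = RLPF.ar(in, cut.clip(80, 8000), 0.2);
            Out.ar(0, sig ! 2);
        }).add;

        s.sync;
        Synth(\sculpt, [\cutBus, ~cutoff]);

        // map palm height (assumed 0..1) to cutoff frequency in Hz
        OSCdef(\leapY, { |msg|
            ~cutoff.set(msg[1].linexp(0, 1, 80, 8000));
        }, '/leap/palm/y');
    };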




Further development and creation of the instruments to interact with live through SuperCollider. The instruments are built on the waveguide physical-modelling unit generators in ChucK, which can be found here. Code snippets and further examples can be found in the Sculpt repository, as usual.




Some further developments on Sculpt: ChucK and SuperCollider are now bound together. We have created the code snippets needed for the two to communicate via OSC, so that the waveguide instruments in ChucK can be controlled from SuperCollider patterns. Having done so, we implemented a more convenient way to bind the instruments' control parameters to the patterns programmatically.

This means the user does not have to hardcode all the control parameters in ChucK for them to be accessible from SuperCollider. For more details, see the Sculpt repository, which includes the methodology and a 'how to' for running the code snippets. More to follow.
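A minimal sketch of the idea on the SuperCollider side (not the repository code): a pattern streams its event values to ChucK over OSC, so the waveguide instrument follows the pattern without its parameters being hardcoded in ChucK. The port (6449), the address /sculpt/modalbar, and the parameter names are placeholders; the actual bindings live in the Sculpt repository.

    (
    ~chuck = NetAddr("127.0.0.1", 6449);    // where ChucK is assumed to listen

    Pbind(
        \type, \rest,                        // no Synth on scsynth; OSC only
        \freq, Pseq([220, 330, 440, 550], inf),
        \stickHardness, Pwhite(0.1, 0.9),
        \dur, 0.25,
        // evaluated once per event, after the keys above are filled in
        \send, Pfunc { |ev|
            ~chuck.sendMsg('/sculpt/modalbar', ev[\freq], ev[\stickHardness]);
        }
    ).play;
    )

Because Pbind processes its key/value pairs in order, the sending function is placed last so that every parameter it needs is already present in the event.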




ModalBar is a unit generator belonging to the waveguide (physical modelling) synthesis family in ChucK. Waveguide synthesis will be used as the synthesis engine of Sculpt. This is the first experiment using ChucK and waveguide synthesis controlled by SuperCollider. It includes two files, a ChucK document and a SuperCollider document, which implement OSC communication between the former and the latter. In the current example, SuperCollider controls the parameters of the ModalBar class. You may find these two documents in the ChucK folder of the Sculpt repository, as usual.
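For a rough idea of what the SuperCollider document does (the port, addresses, and parameter names below are placeholders, not the ones in the repository), here is a sketch that strikes a ChucK ModalBar over OSC and varies a couple of its parameters from a Routine:

    (
    ~chuck = NetAddr("127.0.0.1", 6449);    // where the ChucK OscRecv is assumed to listen

    Routine {
        var i = 0;
        loop {
            ~chuck.sendMsg('/modalbar/freq', [220, 277, 330, 440].wrapAt(i));
            ~chuck.sendMsg('/modalbar/stickHardness', 1.0.rand);
            ~chuck.sendMsg('/modalbar/strike', 0.8);    // excite the bar
            i = i + 1;
            0.5.wait;
        };
    }.play;
    )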




ChucK & SC code

Some coding to test the OSC communication between ChucK and SuperCollider, following the IXI tutorial here.
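The test boils down to something like the following (a reconstruction, not the tutorial code; the ports and address are assumptions): sclang listens on its default port 57120 for anything ChucK sends, and fires a test message back at the port ChucK is listening on.

    (
    // print whatever ChucK sends to '/test' (sclang listens on 57120 by default)
    OSCdef(\fromChucK, { |msg, time, addr|
        ("received from ChucK: " ++ msg).postln;
    }, '/test');

    // send a test message to ChucK (assumed to listen on port 6449)
    ~chuck = NetAddr("127.0.0.1", 6449);
    ~chuck.sendMsg('/test', 440, "hello");
    )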




Includes:

  • SuperCollider classes and workspace files.
  • ChucK code files.
  • Config files for LeapMotion.
  • Design and help doc.

Hardware:

  • ca. 5 contact piezos, attached to the surface of any object, big or small. Note: the piezos must be soldered to a common audio (mic) cable leading to an Arduino (soldering/breadboard) or to a sound card (1/4" TS input jack).
  • 1 Arduino board for triggering from the piezos.
  • 1 sound card for audio input from the piezos, and for sound output.
  • Laptops / Mac minis.




Working in a pair-programming fashion, we touched upon mapping and interfacing with the Leap Motion device.
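The kind of mapping we experimented with looks roughly like this (a sketch with assumed OSC addresses and parameter ranges, not the project code): each incoming palm coordinate gets its own scaling before being written to a parameter of a running synth.

    (
    ~synth = { |freq = 300, amp = 0.1, pos = 0|
        Pan2.ar(SinOsc.ar(freq.lag(0.1)), pos.lag(0.1), amp.lag(0.1))
    }.play;

    // one mapping per incoming value: OSC address -> [parameter, scaling function]
    ~maps = [
        '/leap/palm/x' -> [\pos,  { |v| v.linlin(0, 1, -1, 1) }],
        '/leap/palm/y' -> [\freq, { |v| v.linexp(0, 1, 100, 2000) }],
        '/leap/palm/z' -> [\amp,  { |v| v.linlin(0, 1, 0, 0.4) }]
    ];

    ~maps.do { |assoc|
        OSCdef(assoc.key, { |msg|
            var param = assoc.value[0], scale = assoc.value[1];
            ~synth.set(param, scale.(msg[1]));
        }, assoc.key);
    };
    )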




Performing with Leap Motion

Sculpt project, day one. The project centres on using a Leap Motion controller integrated with bespoke software for sound synthesis, developed in SuperCollider and ChucK, in order to create a computer-based musical environment for live improvisation informed by the concept of "tangibility".




Some footage from the workshop I delivered at the AudioVisual department of the Ionian University on networked coding using Utopia and JITLib (the SuperCollider library for live coding). The workshop was held under the 'artist in residence' program at IU and culminated in a performance of ataraxia, a live electroacoustic piece using the Leap Motion interface. Video snippets of ataraxia (K. Vasilakos, 2014) performed at "Aithrio Skarpa" can be seen here.
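For readers unfamiliar with JITLib, the flavour of the live-coding side is roughly this (a generic sketch, not the ataraxia or workshop code): named node proxies keep sounding while their definitions are replaced on the fly, which is what makes shared, networked coding with Utopia practical.

    // a named proxy that starts playing...
    Ndef(\drone, {
        Splay.ar(SinOsc.ar([100, 101, 150] * LFNoise2.kr(0.1).range(0.98, 1.02))) * 0.2
    }).play;

    // ...and can be redefined live, crossfading to the new definition
    Ndef(\drone).fadeTime = 4;
    Ndef(\drone, { Splay.ar(Pulse.ar([100, 150, 201], 0.3)) * 0.1 });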