I like to study the writings of computer music practitioners and try to implement their ideas in Csound. This expands my Csound skillset and exposes me to different compositional strategies. For example, I spent a few weeks this fall closely reading composer James Tenney’s essay “Computer Music Experiences, 1961-1964,” published in 1969. This essay documents Tenney’s experiences working with the MUSIC-N computer synthesis program at Bell Labs from September 1961 to March 1964.
“Dialogue” is one of the pieces Tenney produced at Bell Labs. You can hear how he explores the use of noise and the randomization of musical parameters like note duration, frequency, and amplitude envelope shape. In his essay he explains the philosophical and aesthetic foundations of his compositional practice at the time. From a technical standpoint, however, he illustrates his ideas only with prose, schematics, and graphs, so it was up to me to figure out how to translate them into Csound code.
At the start of this project I was more familiar with Python than Csound, so I wrote a Python program to generate the entire Csound score. A simple Csound instrument (a sine wave oscillator with an amplitude envelope) would then perform the score. To write the score-generating program I had to figure out how to 1) randomize several musical parameters and 2) control and direct that randomness in interesting ways.
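To make this concrete, here is a minimal sketch, not my original code, of the kind of setup involved: a sine oscillator shaped by a simple amplitude envelope, playing a few hand-written score lines that stand in for the randomized i-statements the Python program would emit.

```
<CsoundSynthesizer>
<CsOptions>
-odac
</CsOptions>
<CsInstruments>
sr     = 44100
ksmps  = 32
nchnls = 2
0dbfs  = 1

instr 1
  ; p4 = peak amplitude (0-1), p5 = frequency in Hz
  kenv linseg 0, p3*0.2, p4, p3*0.6, p4, p3*0.2, 0  ; simple trapezoidal amplitude envelope
  asig poscil kenv, p5                              ; sine oscillator
       outs   asig, asig
endin
</CsInstruments>
<CsScore>
; hand-written stand-ins for the randomized score lines the generator produced
;   ins  start  dur   amp   freq
i   1    0.0    0.8   0.40  440.0
i   1    0.7    0.3   0.15  523.3
i   1    1.4    1.6   0.55  311.1
i   1    2.1    0.5   0.30  196.0
e
</CsScore>
</CsoundSynthesizer>
```

The generator's job was essentially to write out many i-statements like these, with the start times, durations, amplitudes, and frequencies drawn from controlled random choices rather than typed by hand.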
Above is a recording I made of my initial Tenney instrument. You’ll notice that I didn’t explore noise or complex waveforms like he did; it’s all sine waves. Here are a few things to listen for:
The major drawback to generating the score in Python was that I couldn’t interact with the music in real time. The entire score was generated at the outset and then performed by Csound, and each time I ran it the score was different because of the randomization. All I could do was sit back and listen to the results, which was thrilling in its own way. I suppose this was the more historically accurate way to approach this project given that Tenney also had to generate his pieces ahead of time and then listen to the results.
To interact with the music in real time I needed to create a virtual interface with buttons, sliders, and checkboxes. For this I used CsoundQt, a free Csound development environment maintained by Tarmo Johannes. One big advantage of CsoundQt is that it makes building such interfaces very easy. Before I could create an interface, however, I needed to rewrite my Python score-generation program in Csound. This was a significant challenge in itself, but it was a necessary deep dive into Csound that made me much more comfortable with the language.
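To illustrate the kind of real-time approach this opened up, here is a rough sketch, again not my actual code, of how a Csound orchestra can read values from CsoundQt widgets over named channels and use them to generate randomized notes on the fly. The channel names (baseFreq, freqDev, baseDur, durDev) are hypothetical slider names, and the scheduling logic is deliberately simplified.

```
<CsoundSynthesizer>
<CsOptions>
-odac
</CsOptions>
<CsInstruments>
sr     = 44100
ksmps  = 32
nchnls = 2
0dbfs  = 1

; Note generator: reads slider values from named channels (which CsoundQt
; widgets can write to) and schedules randomized notes while it runs.
instr 1
  kBaseFreq chnget "baseFreq"   ; hypothetical slider: center frequency in Hz
  kFreqDev  chnget "freqDev"    ; hypothetical slider: max random deviation in Hz
  kBaseDur  chnget "baseDur"    ; hypothetical slider: base note duration in seconds
  kDurDev   chnget "durDev"     ; hypothetical slider: max random deviation in seconds

  kTrig metro 2                 ; attempt a new note twice per second
  if kTrig == 1 then
    kFreq random kBaseFreq - kFreqDev, kBaseFreq + kFreqDev
    kDur  random kBaseDur - kDurDev, kBaseDur + kDurDev
    kFreq limit  kFreq, 20, 10000       ; keep the values in a sane range
    kDur  limit  kDur, 0.05, 10
    event "i", 2, 0, kDur, 0.3, kFreq   ; start a note on instr 2
  endif
endin

; The sounding instrument: a sine oscillator with an amplitude envelope.
instr 2
  kEnv linseg 0, p3*0.2, p4, p3*0.6, p4, p3*0.2, 0
  aSig poscil kEnv, p5
       outs   aSig, aSig
endin
</CsInstruments>
<CsScore>
i 1 0 3600   ; keep the generator running
e
</CsScore>
</CsoundSynthesizer>
```

With an arrangement like this the sliders become performance controls: the base sliders shift the music's center of gravity, while the deviation sliders decide how far the randomness is allowed to stray from it.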
My Tenney instrument quickly took on a life of its own once I began building an interface. It began to feel less like a program and more like an instrument I could learn to play, and the more I played it, the more ideas I had for enhancing it. Below is a screenshot of the interface I created.
Below is a recording of a live performance I did with the Tenney instrument. Performing this instrument consists of manipulating the base frequencies and durations and then sculpting the degree of random deviation around them. I was pleased with the harmonic, rhythmic, and dynamic variation this instrument was able to achieve.
This was as far as I went with the Tenney project, but it served its purpose well. I learned a tremendous amount about Csound and felt empowered to move on to my next project: building a drum machine in Csound. More on that in my next post.