Back in the Midwest, in Cincinnati in 1966 or 1967, a random event sparked off another branch of do-it-yourself electronics. Qubais Reed Ghazala had been looking for something in a drawer, given up on his search, and shoved it closed. "The air was suddenly filled with cascading electronic sounds! I couldn't believe it! I looked around, but saw nothing to give me a clue. Could it be the drawer?" When he opened the drawer back up the sound stopped, but when he started poking around in the mess of wires and electronics the sound reemerged from somewhere in the clutter. At first he couldn't see what was making it. "Then I saw it. A palm-sized transistor amplifier, left turned on with its back panel off and circuitry exposed, was shorting out amidst the decaying trinkets and salvaged parts." The exposed circuit was a lucky accident for Ghazala and his ticket to a lifelong pursuit and exploration of electronics. A junk drawer really is the best friend of someone who wants to tinker. In Ghazala's case it gave him the necessary voltage to start making his own alien instruments, and in turn inspired the worldwide movement known as circuit bending.

He was thirteen when he heard the joyful noise emerging from his "broken" 9-volt transistor amplifier as it scraped up against another metal object in his desk. He had left it on, and the unusual sound it made when it shorted against something struck his fervent imagination. It reminded him of the early synthesizers, the kind that were only available for ungodly sums, such as the machine at Columbia-Princeton which had cost the institute $250,000. For a broke teenager from Ohio, an aspiring artist and musician interested in all things unusual, his junk drawer hit the right note, and for the right price: scrounged up and dirt cheap. He immediately had two ideas. "If these sounds are being created by accident, what could be done by purpose? If this can be done to an amplifier, meant to amplify a sound but to make NO SOUND itself, what would happen to SOUND-MAKING electronics when purposely shorted-out in the same way?" It was as if the keys to the kingdom of electronic music had just been given to him, and he got them without having to go to one of the specialized electronic music studios. It was something he could do at home.

Ghazala is rich with curiosity, and his natural inclinations led him down a path of tinkering and benevolent mad science. So he started playing. "Working with this toy I discovered many really wonderful things! I found lots of these creative short-circuits, with many different responses to be had. I found that just touching the circuit with bare fingers allowed electricity to flow through the body, further shaping the sounds. I found points that would illuminate lights, and began adding other electronic components to the path of the short-circuits... capacitors, variable resistors; whatever I could find. I discovered also that when the line-output of the now circuit-bent amplifier was fed into a real stage amplifier, one of those big Vox or Fender stacks, the sound projected had nothing anymore to do with toys." The art of circuit bending grew out of this chance happening, though Ghazala didn't give it that name until 1992. The technique typically uses low-voltage gadgetry, so the person bending the circuits to their own whims doesn't get fried.
Battery-powered toys and sound-making devices of 9 volts or less are typical, and there is a plethora of disposable toys available from bargain bins and the secondhand market. The ease of access to materials makes circuit bending affordable even to those musicians and makers on the tightest of budgets. Like Gordon Mumma before him, Ghazala uses the surplus of his time, in his case cheap plastic toys housing sophisticated circuits, to make music.

The process is relatively straightforward, and can be considered a kind of audio hardware hacking. First he removes the plastic panels housing the toy to expose its electronic nervous system. With batteries inserted, the toy or device is turned on and allowed to make its usual sounds. Using an insulated test wire, Ghazala touches the exposed end to various points on the circuit board and makes notes about the sounds that interest him as different connections are made. This list of compelling contact points becomes Ghazala's map for making a unique musical instrument. Using the map he can creatively rewire the salvaged toy in ways it wasn't wired when he first opened it up, often adding additional knobs and switches in the process. One of the key elements of many of Ghazala's instruments is body contact points. These are typically little metal plates that connect the musician directly in-circuit to the instrument. Because the instruments are low voltage, and because the human body is a natural conductor of electricity, a player can touch two of the contact points and become part of the instrument, altering the flow of the bent circuit and making the instrument burble out a surprising variety of sounds. These sounds are variable because each player conducts the instrument differently, and even the same player will get different reactions from the instrument at different times. Within these general parameters of practice whole worlds were waiting to be born. Reed says that "beyond the obvious and delightful giddiness associated with toys being transformed into capable and outlandish synthesis equipment, when stripped of their target-sales housings and names all that remains of these toys is an electronic circuit lying there. And in many cases, these circuits contain sophisticated electronics capable of very high quality voices, just waiting to be nudged toward circuit-bending's anti-theory edge."

Reed Ghazala has created a whole suite of circuit bent instruments. The main families are based on the kinds of musical and sound-making toys he most often found out in the wilds of the thrift stores and secondhand shops of Ohio, where he spent time hunting down items that could be mutated into true artifacts. The prominent groupings are the Incantors, Aleatrons, Insectaphones, Morpheums, and Photon Clarinets. Numerous prototypes, one-offs, and a variety of commissions have also come out of Reed's Anti-Theory workshop. The Incantor, Morpheum and Aleatron series I look at here offer a window into the praxis and philosophy of circuit bending from the man who pioneered this strategy. The Incantors are a group of aleatoric electronic instruments made by deliberately mis-wiring and short-circuiting the once common electronic Speak & Spell toy. The Speak & Spell was a product of Texas Instruments, first introduced to the public in 1978 at the Consumer Electronics Show.
This high-tech toy consisted of a TMC0280 linear predictive coding speech synthesizer, an alphabetic keyboard, and a receptor slot that accepted one of a collection of ROM game library modules. The toy had originated as an outgrowth of Texas Instruments' research into speech synthesis. The Speak & Spell used trademarked Solid State Speech technology that stored full words in a solid state memory format similar to the way calculators from the same era stored numbers. The expansion modules could be inserted through the battery receptacle to load new libraries and games. It was the first educational toy to reproduce speech without relying on tape or phonograph recordings; words could be punched in and spoken in a way similar to how Texas Instruments calculators could solve a math problem. The original intention of the unit, as advertised, was as a tool for helping kids around age seven and up learn the spelling and pronunciation of difficult and commonly misspelled words. The phoneme data for the synthesis of speech was stored on two 128 kbit ROMs, the largest capacity ROMs then in use. The word libraries were created from recordings of professional speakers brought in by Texas Instruments to speak the words. Once the voices were captured they needed to be further processed to fit the limited memory of the ROMs, and this was accomplished using a computer. Once processed, the words often needed further editing because of the sharp reduction of the original data rate. Information had been lost and noise had been introduced into the system. Some of the recorded words had become completely unintelligible.

All the hard work of the technicians and engineers at Texas Instruments got bent to other purposes when Reed Ghazala got his hands on a Speak & Spell. A Speak & Spell out of the box is already musical; after one of the unit's terminals gets cross-wired, after additional electronic components such as potentiometers are installed, after the normal functioning is completely disrupted, it becomes an Incantor, capable of moving from the basic parameters of letters, words, phonemes and vowels to speaking in tongues and talking in alien languages. These modifications overwhelm the unit's keyboard switch matrix and trigger an effect known in the field of electronics as key jamming or ghosting. This is something that happened on older matrix keyboards when three keys were pressed at once, causing a fourth keypress to be erroneously registered by the keyboard controller. In this manner a glitch became a feature when repurposed for music. Once rewired, all of Ghazala's instruments get the beauty treatment. They are repainted and made into true one-of-a-kind art objects, the equivalent of a luthier applying the final stains and varnish to a violin or guitar.

The Trigon Incantor

Bolstered by the success of their Speak & Spell, Texas Instruments came out with a few variations, the Speak & Read and the Speak & Math. With these selling along quite nicely, the company made a toy for younger kids, aiming for the toddler market. What they came up with, sick creepers be warned, was called the Touch & Tell. Where the games for a Speak & Spell might ask a youngster, "Can you spell the word CAR?", the Touch & Tell would just give the kid a prompt: "Can you find the CAR? Press a picture." The pictures in the Touch & Tell can be switched out and interchanged.
As Ghazala wrote in an article for the journal Experimental Musical Instruments (EMI), "Each displays a variable number of images and is hole-punched along the edge so as to reset hidden switches when located in place. In this way, alternate switch settings program the synthesizer's computer relative to each picture sheet. Although regarded as toys, the only thing childish about these curious talking boxes is their vocabulary. But with circuit-bending ..." The Touch & Tell thus became subject to Reed's recombinant techniques and, once transformed, became the basis for the Trigon Incantors. The "Trigon" part of the name came from the three metal balls that he would roll across the surface of the circuit-bent Touch & Tell. Ghazala explains, "With voice characteristics being somewhat similar, the standard and Trigon Incantor differ greatly in the playing techniques (keypad vs. steel balls). Both are capable of producing short as well as on-going streams of finely delineated digital sounds. These sounds, which range from percussive to melodic to vocal, and are constantly re-evolving through abstraction after abstraction, can be initiated on each instrument through various data entries involving the new circuit-bending switches, steel ball positioning, as well as standard keypad actuation." Silence the Tongues of Prophecy is one of the songs Ghazala has made with his Incantors. It also features an erhu, or Chinese spiked fiddle, and another instrument he created called R.A.P. (Readily Available Phonemes). This latter instrument predates the birth of hip-hop and rap music and is another of his instruments that utilizes synthetic speech.

The Morpheum series was distinguished by the use of children's toys that produced animal and railroad sounds. The extensive use of conductive flesh-to-circuit contact points made these instruments extra raw, and yet dreamy, and they are perfect for patching into an array of effects pedals. Ghazala recalls that first experience of playing with body contacts: "When I felt the jolts of electricity coursing through my body back in '67 as I began playing the Odor Box body-contacts, it struck me that I had become part of the instrument's circuitry, as in-the-matrix as any other component on the board. The circuit no longer was limited to dead matter. It didn't stop at its 'ends' anymore... and neither did I. This is definitely a new creature, it lives and shares electricity... the same electricity that, if taken away, would cause each to die." The Morpheums can be outfitted with strap locks so they can be worn like an accordion. Only this accordion makes truly bizarre howls. Reed suggests using it as a lead instrument.

The Aleatrons are a series of circuit-bent Casio keyboards that combine many of the properties Reed had previously explored in his other bending excursions: body-contact control, human voice synthesis, digital samples, the equal-tempered scale and aleatoric instruments. The Aleatrons, and specifically the SA-2 Aleatron, combine all of these techniques into one hallmark instrument. Since the SA-2 is a thirty-two note keyboard with a built-in sound library, the equal-temperament box gets ticked first. The aleatoric element comes from a trigger switch that disrupts the normal programming routines of the instrument. The SA-2 does not have voice synthesis, but it does have a number of human-like sounds, fractal chants that emerge from chance. It also contains a plethora of digital samples in its bank of percussion sounds.
"Whereas analog signals tend to deteriorate into static-like decay when exposed to certain circuit-bending applications, digital signals break-down into distorted routines rather than distorted tones. The tones can therefore remain sharp while their harmonic content, envelope and assembly behavior is altered. Likewise, just as it is with the musical notes, digital percussion sequences are similarly transformed. Cymbals become backward gongs, kick-drums blend into bass lines, snare drum decays are frozen into crystalline seas of sizzling metallic hiss.” Some instruments Reed built from the ground up like his Photon Clarinet and a variety he calls the Insectaphone. But even these latter that he built from scratch, he would then alter with circuit bending techniques to further alienize the sound. Requiem for a Radio Ghazala is no stranger to the creative power of destruction. One of his musical projects and recordings was called Requiem for a Radio. For this piece he took a small plastic transistor radio, forcibly pried it apart, crushed it to bits into an electronic grinder, then melted all the pieces into a disc before sawing the disc into forty small pieces. He recorded sounds from throughout the whole process. Each of the four movements, Kyrie, Dies Irae, Sanctus and Agnus Dei are each composed exclusively from the sounds of one of the four processes the transistor radio underwent in its transformation. The resultant work is musique concrete as only Ghazala could have imagined it. With the initial distribution of the recording he sent some of the sawed pieces of the disc out along with the CD. Part of the beauty of the circuit bending technique is that it brings instrument making and basic electronics into the hands of anyone who wants to do it. As Ghazala says, “Working with toys has advantages beyond the eccentricities and power of the final voices: No knowledge of electronic theory is needed whatsoever to circuit-bend. Toys open themselves to the process. Anyone can do it. Simply, a wire is used to make connections between arbitrary points on the circuit while the toy is making its usual sounds. A switch is then wired between points discovered that produce an interesting sound so that the effect can be turned on at will. This procedure will usually result in a number of switches that can often be mounted on the toy's housing. If you learn to solder and can drill holes in which to mount your switches, you can circuit-bend.” What better way to get into the electronics hobby than by reclaiming what other people no longer want and transforming them into new works of art. Reed Ghazala has built a life out of doing just that. In the process he has created a folk art for the electronic age, as he wrote articles on his instruments, and taught others how to go about bending. He even wrote a whole book on the subject. “The discards of our society pile up around us like coconuts in the surf. Picking up an abandoned toy, picking up a coconut, rewiring the toy, poking holes in the coconut, flipping the new switches on the toy, blowing over the new holes in the coconut, letting the toy's new music direct you to it, letting the coconut's new music direct you to it... these things are part of us. This is how musical thought systems are born.” Reed Ghazala’s Anti-Theory Workshop is a true laboratory of radiophonics and alchemical imagination. References:
Circuit-Bending: Build Your Own Alien Instruments, by Q. Reed Ghazala. Wiley, 2005.
Gravikords, Whirlies and Pyrophones: Experimental Musical Instruments, written and produced by Bart Hopkin. Ellipsis Arts, 1998. [Book and CD set.]
http://www.anti-theory.com/ (various pages within Ghazala's website)
http://www.furious.com/perfect/emi/reedghazala.html
https://en.wikipedia.org/wiki/Speak_%26_Spell_(toy)
https://cdm.link/2018/07/the-strange-cartridge-powered-speech-of-ti-touch-tell/

.:. .:. .:.

Read the rest of the Radiophonic Laboratory: Telecommunications, Electronic Music, and the Voice of the Ether.
The first time Chris Brown heard the League of Automatic Music Composers was on KPFA as he was driving to a piano-tuning appointment in 1981. The music was wild, unified as an organism, yet with divergent tentacles or strands wiggling off in multiple directions like a psychedelic octopus. It was Chris's first exposure to networked computer music, and the wriggling tentacles had put their first hooks into his brain.
Five years later Chris was working with a group that had dubbed itself Ubu, Incorporated, named after the 1896 play Ubu Roi by Alfred Jarry. This group had members from the LAMC and was at work organizing experimental music concerts at galleries and community music spaces. One of the concerts the group decided to organize was called THE NETWORK MUSE – Automatic Music Band Festival. Held in an old church, it brought together four different groups working with homebrewed computer music and presented performances over a few days. One of these groups was the duo of The Hub, then consisting of just Tim Perkis and John Bischoff. At the concert Bischoff and Perkis were using a KIM-1 as a mailbox to post data used in controlling their individual music systems. This information then became available to the other player to use however and whenever he chose as they performed their combined system. The Hub had been their answer to the often messy tangle of wires and electronics that had been common during the LAMC years. Their interface was an elegant solution into which a variety of computers and their users could plug.
In 1987 composers Phill Niblock and Nicolas Collins instigated the formation of an expanded ensemble when members of The Hub were invited to New York to give a performance at two separate locations linked together by a modem. This required additional players, and they were readily pooled from the other groups that had participated in the Network Muse festival. The two locations to be linked were both performance spaces: Experimental Intermedia (XI), run by Niblock, and the Clocktower (now MoMA PS1). The idea was to have a trio play at each location which, when connected via the modem, became a sextet.
Bischoff and Perkis had already started playing as a trio with Mark Trayle in a group called Zero Chat Chat in the aftermath of the Automatic Music Band Festival, so it was a simple matter to recruit Chris Brown, Phil Stone, and Scott Gresham-Lancaster, who had all played in different groups at the festival, to form a second trio. This expanded sextet became the Hub. They designed three pieces to play over the network, with the modem dividing the sextet acoustically into two trios that were still joined by wires of information. These pieces were "Borrowing and Stealing", "Simple Degradation" and "Vague Notions". They also played three other pieces that were improvised independently, local to each group. As Kyle Gann wrote in a review of the piece for the Village Voice at the time, "Equally peculiar (for those who attended a different space each night) was the oblique correspondence of identical pieces between the Clocktower and EIF, for the two audiences did not hear the same sounds. Each group fed information into the others' performance, but basic materials differed, making each piece a kind of sonic conceptual butterfly: same body, wildly different wings."

To many people, having a group playing in two different physical locations was just a neat technological stunt. While interesting to promoters, it wasn't the main interest of the band, though the performance did help congeal the Hub, and the six composers continued to work together under that rubric. The idea of the modem concert continued to haunt them, and it was a spectacle they were asked to repeat in different forms. Their interest, however, wasn't in the distances that separated them, but in the interactivity of the network itself, and in the sounds of each musician's iconoclastic music programming as it was influenced by the musical programs of the others.
The Hub also kept up with the new computers that continued to hit the market. The next iteration of the Hub device was based on the SYM-1 single-board computer made by Synertek. The processor ran at 1 MHz, and the board had 8K of RAM and a hexadecimal keypad for programming in machine language, like the KIM. What made this an upgrade for the computer music chamber ensemble was the expansion board they built onto the SYM, carrying four 6850 ACIAs (asynchronous communication interface adapters). These had connections to the 8-bit data bus, seven address lines, the system clock, and the read and write controls. This bit of hacked-together gear gave them options for connecting, interacting, and communicating musically.
The homebrewed circuits were housed inside a box of clear plastic underneath the SYM, with connectors on the outside. Three of the connectors were used to network three players over 1200 baud RS232 serial connections. The fourth connector went to an identical SYM-HUB they had built to host the other trio, the other half of the six-piece band. These two Hubs could communicate with each other quite speedily at 9600 baud, even though most modems in that era couldn't send information that fast. Phil Stone and Tim Perkis wrote a program in assembly language to receive and transmit messages from the players, each with their own serial port, to the Hub. The program also constantly copied stored data to the second Hub so that both memory areas had data from all the members of the group. Stone and Perkis wrote in the program's comments, "Devices connected to each channel make requests to write to the HUB processor table memory, and to read it. Each makes its request by sending command bytes of which the high four bits form a command field (CF) and the low four a data field (DF). In the HUB processor there are three variables kept for each channel: a current WRITE.ADDRESS (12 bits); the current READ.ADDRESS, (12 bits) and the current WRITE.DATA (8 bits). These variables for each channel can be set only by commands from that channel. All channel commands are dedicated to setting these variables, or initiating a read or write to the HUB table memory." (A rough sketch of this command scheme appears at the end of this section.)

The music of the Hub is in its way just as cerebral as the means used to make it. Having assembled their gear and membership, they set about playing the endless game of composition, programming and recombination. The group were musicians first, technologists a close second. Where most musicians work from a score, the Hub works from a spec. Individual notes are not preordained; instead, the specifications for how a piece is to be constructed are all put in the spec. The spec can be read closely along with the schematics of the Hub. Like the blueprint for a house, the spec gives an outline or structure to the game of networked music. Even though the spec is often designed by one composer, the details of how it is realized are left up to the individual programmers.
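To make the channel-command scheme in Stone and Perkis's comments more concrete, here is a minimal sketch in Python of how a table-memory hub of this kind could decode 4-bit command and data fields. The opcode values and nibble-by-nibble address loading are assumptions made purely for illustration; they are not the actual SYM-1 assembly routines.

```python
# Hypothetical sketch of a HUB-style shared table memory with per-channel
# command bytes: high nibble = command field (CF), low nibble = data field (DF).
# The specific opcode assignments below are invented for illustration only.

TABLE_SIZE = 4096  # 12-bit addresses give a 4K shared table

class Channel:
    """Per-channel state: 12-bit write/read addresses and an 8-bit data latch."""
    def __init__(self):
        self.write_address = 0
        self.read_address = 0
        self.write_data = 0

class Hub:
    def __init__(self, channels=3):
        self.table = bytearray(TABLE_SIZE)
        self.channels = [Channel() for _ in range(channels)]

    def command(self, ch, byte):
        """Decode one command byte arriving on channel `ch`."""
        cf, df = (byte >> 4) & 0xF, byte & 0xF
        c = self.channels[ch]
        if cf == 0x1:      # load low nibble of the 12-bit write address
            c.write_address = (c.write_address & 0xFF0) | df
        elif cf == 0x2:    # load middle nibble of the write address
            c.write_address = (c.write_address & 0xF0F) | (df << 4)
        elif cf == 0x3:    # load high nibble of the write address
            c.write_address = (c.write_address & 0x0FF) | (df << 8)
        elif cf == 0x4:    # latch low nibble of the write data
            c.write_data = (c.write_data & 0xF0) | df
        elif cf == 0x5:    # latch high nibble and commit the byte to the table
            c.write_data = (c.write_data & 0x0F) | (df << 4)
            self.table[c.write_address] = c.write_data
        elif cf == 0x6:    # read the byte at this channel's read address
            # (commands to load the read address would mirror 0x1-0x3)
            return self.table[c.read_address]
        return None
```

The point of the sketch is simply that every player sees the same table: what one channel writes, another channel can read whenever it chooses, which is the mailbox behavior the band relied on.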
HubRenga
Being based in the Bay Area, having a history with CCM and Mills College, and being part of the experimental music and arts scene meant there was a great deal of overlap between people, and a lot of potential for fruitful collaborations. Several members of the Hub knew Ramón Sender. During the Hub years Sender had gotten interested in the collaborative possibilities that computer networks opened up for writing. A collaboration was cooked up between the Hub, Ramón Sender and the poetry group on the WELL, the Whole Earth 'Lectronic Link, one of the oldest virtual communities and a regular online hangout spot for members of the counterculture. The first version of HubRenga was performed over the air on KPFA's Music Special radio show hosted by Charles Amirkhanian on September 7, 1989. In this transmission the Hub was joined by novelist and musician Ramón Sender and by poets from another network, the poetry conference of the WELL, a pioneering electronic community that operated in the Bay Area to facilitate communication between people interested in the arts and alternative lifestyles. The poetry conference was a forum which subscribers to the WELL could join to exchange ideas and work collaboratively; Sender was one of its hosts for a number of years.

For HubRenga the computer network of the Hub was connected to the network of the WELL, and the Japanese poetry game called renga was used as a format for the textual aspect of the work. Renga is a genre of collaborative Japanese poetry in which alternating stanzas are linked in succession by multiple poets, typically composed live when a group of poets are gathered together. For HubRenga Ramón acted as moderator inside the KPFA studio, browsing the poetic submissions as they came into the poetry conference forum on the WELL and reading them aloud as part of the music, accompanied by an unnamed female reader. The WELL poetry group had been working with the Hub, through Sender, for a few months before the big date at KPFA. In keeping with traditional renga practice, the poets worked around a theme; in departure from that practice, the theme was non-traditional. Usually the themes are based on the season in which the renga is composed: summer, spring, autumn, winter. In this case the poets chose Earth as the theme. The poets came up with a common list of set words to use throughout the performance, and this was given to the composer-programmers, who wrote programs that used these words as triggers. When a Hub member received a text from the WELL on his computer, his program filtered it for specific keywords, determined in advance from the list, to trigger specific musical responses. The keywords chosen by the Hub as triggers were: embrace, echo, twist, rumble, keystone, whisper, charm, magic, worth, Kaiser, schlep, habit, mirth, swap, split, join, plus, minus, grace, change, grope, skip, virtuoso, root, bind, zing, wow, earth, intimidate, outside, phrase, honor, silt, dust, scan, coffee, vertigo, online, transfer, hold, message, quote, shimmer, swell, ricochet, pour, ripple, rebound, duck, dink, scintillate, old, retreat, non-conformist, flower, sky, cage, synthesis, silence, crump, trump, immediate, smack, blink.

This was the kind of interactive system the Hub thrived on, and HubRenga was performed again in Los Angeles, along with Bonnie Barnett, an original member of Pauline Oliveros' Women's Ensemble, who declaimed the power words. In this iteration Ramón Sender and members of the WELL Poetry Conference participated via modem from the Bay Area.
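The keyword-trigger mechanism is easy to picture in code. The following is a minimal sketch, not the Hub's actual software: it scans incoming text for words from the agreed list and hands each hit to a stubbed-out musical response, whose contents are entirely hypothetical.

```python
# Minimal sketch of a HubRenga-style keyword filter (illustrative only).
# A few trigger words are taken from the list above; the musical response
# is a hypothetical stub standing in for each player's own music program.

TRIGGER_WORDS = {
    "embrace", "echo", "twist", "rumble", "whisper", "magic", "earth",
    "grace", "shimmer", "ripple", "silence", "vertigo",
    # ... the rest of the agreed word list would go here
}

def musical_response(word):
    """Hypothetical stub: change some parameter of the running music program."""
    print(f"trigger fired: {word}")

def handle_incoming_text(line):
    """Scan one line of poetry arriving from the WELL and fire any triggers."""
    for token in line.lower().replace(",", " ").replace(".", " ").split():
        if token in TRIGGER_WORDS:
            musical_response(token)

handle_incoming_text("An echo of earth, a whisper under the ripple of magic")
```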
The Hub Goes MIDI

In 1990 the Hub brought their wrecking ball to the world of MIDI music, a technical standard and communications protocol that was then less than a decade old. Scott Gresham-Lancaster had been tasked with exploring its possibilities for the group. MIDI, which stands for Musical Instrument Digital Interface, allows a plethora of electronic instruments, synthesizers, computers and other audio devices to be connected together to play, record and edit music. A single MIDI link on one cable can carry up to sixteen discrete channels of information, and these can be routed to different instruments or devices, say a synth, drum machine or computer. The information carried on those channels includes musical instructions for pitch, velocity or attack, vibrato, panning within the stereo field, and clock signals that allow one device to control the tempo of the other devices in the MIDI network. As a musician plays a MIDI-equipped instrument, the performance is converted into data that is commonly used to control other sound-producing modules: a person playing a synthesizer might trigger an external drum machine, sequencer, or other digital sound module. MIDI is also used for recording and writing music. A player can hook a MIDI-capable instrument up to a computer, which then records the data. This information can be assigned to different voices in a digital audio workstation, modified, and edited.

This typical way of using MIDI, with one musician controlling an array of other instruments from a single station, held no appeal for the members of the Hub. They wanted to break MIDI and use it for their own purposes. Scott beta-tested the then-new Opcode Studio 5 MIDI interface. It was a single-box unit that functioned as computer interface and MIDI patchbay with 15 inputs and outputs, processor and synchronizer. Scott played around with the hardware and learned how to program it so it could work as a MIDI version of their namesake Hub. The new protocol would give them a faster messaging system that was also more flexible than their homebrewed system. Another advantage was that by using a standardized platform they would be able to share their working methods with other musicians in a way that was more accessible and closer to open source. Yet the switch to MIDI meant a drastic change from the system they had been using. In the world of electronic music a new system means a new sound, and they would either have to alter their existing pieces to fit with MIDI or start writing brand new pieces. It also changed the operational mode they had become accustomed to. Instead of the common memory shared between members, where data in any customized format could be deposited, the MIDI-HUB worked as a switchboard. Each player's musical data was now tagged in a way that identified its sender. "No longer was it up to each musician to specifically look at information from other players, but instead information would arrive in each player's MIDI input queue unrequested. Information about current states had to be requested from players, rather than being held on a machine that always contained the latest information. This networking system was more private, enabling person-to-person messaging, but making broadcasting more problematic. To send messages to everyone, a player would need to send the same message out individually addressed to each player.
If a player failed to handle the message sent, its information was gone forever. And messages were sent more quickly under the MIDI-HUB, leading to an intensity of data traffic that was new in the music. The MIDI-HUB pieces reflected the nature of this new aspect of the band's network instrumentation."

Waxlips was the first piece written for the MIDI-HUB. Designed by Tim Perkis as a simple way of exploring the architecture of the network, it ended up becoming a "tune up" piece for the ensemble in their performances and tours, a way to test the system and get it up to speed before tackling other pieces from their repertoire. It was written to be simple, with minimal musical structure. Each player sends and receives requests to play one note. Once a request is received, the note message gets transformed in a fixed way and is sent on to someone else. The message can be modified by any musical rule; the only limiting factor was that within each section of the piece, specified with signals from a lead player, the same rule must be followed, so a new message in is always answered by the same kind of new message out (a toy sketch of this passing game follows below). The lead player "jump-starts the process by spraying the network with a burst of requests." Tim Perkis writes in the liner notes to the Wreckin' Ball CD, which contains recordings of Waxlips, "The network action had an unexpected living and liquid behavior: the number of possible interactions is astronomical in scale, and the evolution of the network is always different, sometimes terminating in complex (chaotic) states, including near repetitions, sometimes ending in simple loops, repeated notes, or just dying out altogether. In initially trying to get the piece going, the main problem was one of plugging leaks: if one player missed some note requests and didn't send anything when he should, the notes would all trickle out. Different rule sets seem to have different degrees of 'leakiness', due to imperfect behavior of the network, and as a lead player I would occasionally double up, sending out two requests for every one received, to revitalize a tired net."

One of the ways the MIDI-HUB enabled the ensemble to collaborate was by receiving the output data from another musician's setup. For Alvin Curran's composition Electric Rags III, Curran improvised on his Yamaha Disklavier piano. The MIDI output of his improvisation was sent through the Hub system and the ensemble players used it in whatever ways they wished.
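The passing game at the heart of Waxlips can be caricatured in a few lines of code. This is a toy simulation, not the Hub's software: the transposition rule, the random routing, and the five-percent "missed request" rate are all invented to illustrate the receive-transform-forward idea and the leakiness Perkis describes.

```python
# Toy simulation of a Waxlips-style note-passing network (illustrative only).
import random

NUM_PLAYERS = 6

def rule(note):
    """One fixed, invented transformation: up a fifth, wrapped into a MIDI-ish range."""
    return 36 + (note + 7 - 36) % 48

def run(burst, steps=100):
    # queues[i] holds pending note requests addressed to player i
    queues = [[] for _ in range(NUM_PLAYERS)]
    # the lead player "jump-starts the process by spraying the network" with requests
    for note in burst:
        queues[random.randrange(NUM_PLAYERS)].append(note)
    for _ in range(steps):
        for me in range(NUM_PLAYERS):
            if not queues[me]:
                continue
            note = queues[me].pop(0)
            if random.random() < 0.05:
                continue                      # a missed request: one of Perkis's "leaks"
            print(f"player {me} plays note {note}")
            # transform in the fixed way and send the request on to someone else
            queues[random.randrange(NUM_PLAYERS)].append(rule(note))

run(burst=[60, 64, 67, 72])
```

Run long enough, the leaks win and the notes trickle out, which is exactly why the lead player in the real piece would sometimes send two requests for every one received.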
They used a similar setup again for Scott Gresham-Lancaster's Vex, a take on Erik Satie's proto-minimalist and extremely long piano piece Vexations. For this version they took Satie's score and fed it into the HUB for a synchronized performance of the piece by Alvin Curran and the Rova Saxophone Quartet. As each note arrived in their systems, the Hub players used it to create an electronic embellishment around the acoustic players they were working with.
Curran was a frequent collaborator, and they worked with him on a studio version of his Erat Verbum (1993 iteration). This was a six-part radio composition made for the Studio Akustische Kunst of the WDR, and they worked with him on the Delta section. The piece utilizes recordings of John Cage's famous Norton Lectures, also known as I-VI, which were fed into the HUB. The members of the group perused these and retranslated them on the fly into Morse code. Curran then live-mixed the dots and dashes into a stunning fantasia.

The stamp John Cage left across various musical subcultures and musicians was also evident in the work of The Hub. His spirit seemed to hover in the background as they went about their work. "One of the strands in the musical philosophy of The Hub was the interest in defining musical processes that generated, rather than absolutely controlled, the details of a musical composition. An acknowledged influence on this interest was the work of John Cage, and it seemed a natural extension to us to try to automate the indeterminate processes used in his work. Many of these processes are extremely time-consuming and tedious; and given that Cage was himself involved for a long time in live electronic performance, we felt a real-time realization of these processes during the progress of a performance was not only feasible, but aesthetically implied." In 1995 they got the opportunity to do a live realization of Cage's Variations II at Mills College for a happening put together by David Bernstein called "Here Comes Everybody: A Conference on the Music, Writing, and Art of John Cage". As part of the activities, one evening of concerts was devoted to Cage's electronic music, and The Hub performed their version of his iconic composition.
Disconnectivity
Ever since the Hub's XI/Clocktower premiere in NYC, played in two separate locations connected by modem over the telephone wires, there had been pressure on the group from the many techies interested in their music to switch from their serial communications network to Ethernet. There had also been pressure on them to do further concerts where the musicians played in different locations but connected via a network. In a way they had done this with the HubRenga concerts, where the poets connected to the Hub via the WELL, yet they hadn't played together as a spatially separated group since that first concert. It was something that was expected of them, even if they really preferred to be in each other's company while playing. The public fascination with the idea of musicians playing together though separated by vast distances in physical space remained constant, even though they had never repeated the experiment or incorporated it as a regular part of their practice as a network ensemble. They preferred the local area network of being in each other's company as they played. They sought a balance between the spontaneous interactions of the electronic systems they set up and the reciprocal feedback between themselves as humans making music together, an inherently social activity. Chris Brown writes, "Since that event we have continued to receive requests for concerts to be performed remotely, that is, without all of us being physically in the same space, but have always declined, in part because we really prefer to be in the space where we can hear each other's sound directly and to see each other and communicate live. The Hub is a band of composers who use computers in their live electronic music, and our practice has been to create pieces that involve sharing data in specific ways that shape the sound and structure of each piece. We are all programmers, and instrument builders in the sense that we take the hardware and software tools available to us and reshape them to realize unconventional musical ideas."

Eventually, however, The Hub succumbed to the pressure to produce another concert with the members separated in different locations. "Points of Presence" was produced in 1997 by the Institute for Studies in the Arts (ISA) at Arizona State University (ASU), and linked members of The Hub at Mills College, the California Institute of the Arts and ASU over the internet. The piece nearly spelled the end of the Hub after a decade of cooperative engagement in network music composition. "Now in 1997 new tools have become available that allow us to reapproach the remote music idea - telharmonium, points-of-presence - in a new way. Personal computers are now fast enough to produce high-quality electronic sound in real-time, allowing instrument-builders like Mike Berry to choose a purely software environment to produce home-made musical instruments. His Grainwave software, a shareware application for MacOS PowerPCs, was adopted by the group for this piece because it allows each of us to design our own sounds, and these sounds/instruments can be installed at any physical location that has a PC on which they can play - we can be independent of the hardware that produces our music, our instruments have become data which can be replicated easily in any place. At the same time we, along with the rest of our culture, have been spending more and more time in our lives and our work communicating and collaborating on the internet.
Why should we not extend our musical practice into this domain? Can we retain here the ability to define our own musical worlds, avoiding the commercial, prefab, and controlling musical aesthetics of the technological culture?"

Yet the performance itself was plagued by technical failures. They ran into many issues with the software and couldn't debug it easily on the fly with a room full of people expecting to hear a concert. Because they weren't in the same place they had to rely on internet chat and telephone calls to try to fix the issues. And with the different parts unable to work together as a network, the music never got off the ground. They were only able to play for ten minutes as a full network, and they had to supply those who came to hear them with clumsy explanations of what they were trying to do. "The technology had defeated the music. And after the concert, one by one, the Hub members turned in their resignations from the band." It wasn't to be the very end of the band. Having been built as an ad hoc network, they eventually found themselves reassembled again, ready for action, and all of the members of the Hub have lively musical activities they are involved with outside of the network, bringing new information and new ideas into their working methods.

References:

The League of Automatic Music Composers: 1978-1983. New World Records No. 80671, released 2007. Collection compiled by Jon Leidecker (Wobbly).
The Hub: Boundary Layer. Tzadik 8050-3. Three-CD set with extensive liner notes and CD-ROM text files.
At a Distance: Precursors to Art and Activism on the Internet. Edited by Annmarie Chandler and Norie Neumark. MIT Press, Cambridge, Massachusetts, 2005.

.:. .:. .:.

Read the rest of the Radiophonic Laboratory: Telecommunications, Electronic Music, and the Voice of the Ether.
As the musical computers at Bell Labs in New Jersey were winding down in the late 70s, people in the California homebrew microcomputer scene were just starting to get wound up. DIY computers had arrived, and a group of electronic music experimentalists in the San Francisco Bay Area were writing programs, networking them together and seeing how they sounded in various configurations. The group was known as the League of Automatic Music Composers (LAMC), active from 1977 to 1983 before being reassembled into another musical configuration known as The Hub. The LAMC can rightly be considered the first computer music group, and the first network music group.
The League had its beginnings at the CCM, Mills College's Center for Contemporary Music, during the time when Robert Ashley was its director. It was also the time when the first fruits of Silicon Valley were beginning to ripen and could be plucked off the shelf by hackers and hobbyists. At the CCM these hackers and hobbyists were also experimental musicians. Because the CCM allowed open access to its studio, it drew a large crowd of people from outside strictly academic art music through its doors, where they were all able to freely mix and mingle. Rock musicians met hackers, and hackers met free improvisers and jazz heads, who all met those studying the radical end of western classical music as it had evolved in the 20th century. One of the mottos of the CCM was "if you're not weird, get out!" It became a home for an assortment of musically inclined misfits, a place where they could fit in. Part of this already strange and heady brew was the homebrew tradition, which was very active at the Center due in part to its proximity to the new integrated circuits being produced in Silicon Valley, in part to its history as the place where the Buchla Box had been invented, and in part to its association with the original composers who had formed the San Francisco Tape Music Center (SFTMC). Many luminaries, such as David Tudor, came to lecture and give concerts at the CCM.

The students had taken to the idea that building and designing circuits was part and parcel of the compositional process. The schematic diagram was seen as directly related to the graphic scores that had been innovated by the likes of John Cage, Morton Feldman and Karlheinz Stockhausen. David Tudor and Gordon Mumma had already paved the way in their creation of electronic musical systems that, once designed and built, could be turned on to produce the music. These cybernetic systems were often autonomous and required little intervention from the composer as player after the system had been set up. Tudor had spent time at the CCM as a composer in residence, and his influence permeated the atmosphere there, particularly his idea that the job of the composer was to listen rather than to dogmatically determine every last note of a piece of music. This emphasis on listening is a theme that runs through contemporary musical practice and can be traced to the rich heritage left to us by Cage, Oliveros, and Tudor. In Tudor's case he emphasized the setting up of autonomous, or automatic, networks of electronics: systems made up of phase shifters, attenuators, amplifiers, and filters, as in his Untitled piece from 1972. The aesthetic beauty of such a piece lies in the enjoyment of listening deeply to the complex interactions of the system. This system music presents a mirror to other types of systems: human social systems, the diverse ecological systems of the natural world, complex electronic communication systems, and the way the human body is a system of organs, cells, tissues, nerves, and parts all moving together, sometimes in harmony, sometimes creating dissonant tones and clashing with noise.
By the mid-seventies the first commercial microcomputers had become available to the average consumer. They were called "micro" at the time to differentiate them from their mainframe predecessors, which took up entire rooms in the halls of industry and the academy. This availability meant that anyone willing to fork over the $250 one of these machines cost could have their own computer. Free from the oversight of the folks in charge of the institutional mainframes, enthusiasts were able to dabble. These microcomputers were integrated into the circuit of California's music scene.
Jim Horton was an early adopter, and he was quick to get his hands on one of these computers. It was 1976 and the contraption was the KIM-1, a single-board device whose name stood for how it worked: Keyboard Input Monitor. Jim's love of the KIM soon spread like a virus around the community, and many other people started saving up their dollars to get these machines. The KIM-1 itself consisted of just a single printed circuit board. All the components were on one side, and it had a whopping 1K of RAM. The unit had a hexadecimal keypad used for programming, and the programs themselves were saved to audiocassette. An add-on keyboard could be attached and up to 4000 characters displayed on a television or monitor. As more people bought the machines, they started to share the programs they had written for them and helped each other troubleshoot the persnickety device, and so a community of devotees grew up around it.

The KIM-1 wasn't Horton's first experience working with new technology. As a musician he was trained as a flutist, but he had also gotten in on the game of analog synthesis. He had gained a reputation for building very large modular patches that had the ability to self-modify. He would get his friends to bring along their synths and he would connect his synth to theirs, building networks of synthesizers. After building a huge and complex patch he would let the system play itself in long eight-hour concerts that lasted all night. These concerts were similar to the all-night concerts Terry Riley gave, and a precursor to the sleep concerts later given by electronic musician Robert Rich. Jim Horton was the quintessential starving artist, and he did his work for the glory, not the gold. He saved his meager welfare checks and, instead of buying food, literally starved himself for a synthesizer. He sacrificed to acquire the equipment necessary for realizing his soundworld. Forgoing creature comforts for greater achievement, he was known for plugging straight into whatever work was at hand and just getting on with things. One of his bandmates, Tim Perkis, recalls that meeting Jim was a liberating experience: "Horton would show up at a gig with his tangle of loose wires and electronic components in a dresser drawer he would temporarily press into service. With my head full of hesitations born of half-digested conventional wisdom about audio circuitry, it was mind-blowing to see someone just go directly to the heart of the matter, twisting bare wires together, connecting anything to anything, and doing the deeply conceptual musical work which drove him without waiting for the right equipment to appear. He lived in a poverty that never seemed like a limitation to him, and worked with whatever means he had at hand."

In 1977 it was Jim Horton who first proposed the idea of forming a microcomputer network band. It happened in an organic way. There was already a group getting together on a regular basis to share the music they were making on their KIM computers, some of it also made with analog circuits and other instruments. At one of these gatherings Horton shared his idea of banding together to create a "silicon orchestra". He had already demonstrated that synthesizers could be networked together into self-generative, ever-shifting systems of musical patches. It was a natural next step to network the computers and other circuits they were building into their own system and listen to the experimental results.
Later in the year, at Mills College, Horton worked with Rich Gold, one of the founding members of the LAMC. The pair put on a concert where the two of them linked their KIMs together. For the performance Horton ran an algorithmic music program based on the harmonic theories of the eighteenth-century mathematician Leonhard Euler, while Rich Gold ran an artificial language program he had written; the two programs interacted with each other for the show. Jim was also working with another future band member, John Bischoff, at the time, and one of the things they had figured out was a piece where tones from John's KIM would make Jim's KIM transpose its melodic activity according to a set key note. Then in 1978 John, Jim and Rich joined together as a trio to give a performance at an artist space in Berkeley. Next they were joined by composer David Behrman, who had come to California to co-direct the CCM with Robert Ashley, his friend and fellow member of the Sonic Arts Union. Rich Gold and Jim Horton were studying with Behrman at the CCM. It was around this time that Behrman recorded his landmark album On the Other Ocean. The album is equally at home in the related but differing milieus of New Music, Ambient, and Minimalism, resting comfortably on sustained harmonies between electronic and acoustic sounds that slowly dance and revolve around each other until the difference between them blurs. The two pieces on the album feature the KIM-1 microcomputer with flute and bassoon on the title piece, and with cello on the flip side, Figure in a Clearing. In these pieces the KIM-1 "listens" to the live performers and accompanies them, marking points when particular pitches are played. When Behrman joined the LAMC this principle became a recurring theme in their music.
Behrman talks of his time at Mills College: "Some of the students began bringing computers to the Mills Center for Contemporary Music; on the advice of a wise Bay Area artist, Jim Horton, Paul DeMarinis and I bought KIM-1 microcomputers. KIM-1 weighed about 10 ounces and cost around 200 dollars. Around that time I'd been building switching circuits that were placed between primitive pitch-sensors and homemade synthesizers consisting mostly of triangle-wave generators. The switching circuits took a long time to solder together and could only do one thing. It seemed that this new device called the microcomputer could simulate one of these switching networks for a while and then change, whenever you wanted, to some other one. It was fun connecting its port lines to homemade synthesizers, and also to sensors, and writing very simple software to link sensor activity with synthesizer sounds. There was something fascinating about the design of software, even though on the KIM-1 it had to be done in machine language, by pressing keys on a little hexadecimal pad. This was the dawn of 'interactivity' in California, the moment when Jobs and Wozniak were introducing the Apple computer. There was a Bay Area composers group of that era, the Microcomputer Network Band, which liked to do concerts in which the participants would wire together a group of computers on a table, turn them all on, and stand back and watch to see what would happen."
In November of 1978, now a quartet, the League of Automatic Music Composers gave its first performance using that name. Two years later Rich Gold and David Behrman left the group to work on other projects. That's when Tim Perkis swooped in to fill the open spots. Tim was interested in music made with alternate tuning systems from various parts of the globe, even playing in a local gamelan group. He was also a Just Intonation fanatic who happened to be skilled with electronics, having a graduate degree in video from the California College of Arts and Crafts. If building your own homebrewed electronic instruments is a new kind of folk craft, then Perkis excelled at this craft work, programming his circuits to play in the various tuning systems he collected in his research.
Now in trio form, with a cadre of Bay Area musicians and improvisers joining the festivities on occasion, they played together for four more years in this configuration. They had a habit of getting together on alternate Sundays to play at the Finnish Hall in Berkeley, and people were welcome to come in and take in the scene.
Perkis writes, "Audience members could come and go as they wished, ask questions, or just sit and listen. This was a community event of sorts as other composers would show up and play or share electronic circuits they had designed and built. An interest in electronic instrument building of all kinds seemed to be 'in the air.' The Finnish Hall events made for quite a Berkeley scene as computer-generated sonic landscapes mixed with the sounds of folk dancing troupes rehearsing upstairs and the occasional Communist Party meeting in the back room of the venerable old building."

During their time the LAMC distilled the spirit of the Bay Area and infused its essence into their playful work practice and the music that came out of their curious explorations. Part band and part collective, they blended the communal zeitgeist of the day with the fermenting intellectual and cultural atmosphere at work in such staples as the Whole Earth Catalog, which promoted the use of personal computers alongside solar cells and sprout-growing kits as part of the wave of interest in self-sufficiency and appropriate technology prevalent during a decade when the realities of hard limits were entering people's consciousness. The members of the League had taken megadoses of the do-it-yourself ethos with regard to technical innovation. Everything they used was homebrewed or built from kits and modular components. All of it was on the table and subject to being taken apart, tinkered with, put to use in experiments. Then they would put it all back together again to see how it worked in a variety of combinations. The League created networks of microcomputers and circuits with an ear towards making one large interactive musical instrument out of the members' individual computers and components. One came from many. The members of the collective were all interested in computers and programming them to make music. They learned that when they networked their machines together and sent instructions to each other, the amassed circuits of silicon and solder were capable of eliciting what they called new "musical artificial intelligences." The sound of the League's music is like a noisy arcade that has been rewired and rerouted in an ad hoc fashion. Amidst the distortion, the randomly generated tones, and the disorienting arpeggios produced by the circuits and programs, something beautiful occasionally emerges, but the sounds are always interesting and stimulating to the intellect. It's often messy and unpredictable, but what comes out of the apparent chaos has the feel of sentience and is full of life.
Without the same kind of tools being used by Max Mathews and Laurie Spiegel and others at the big institutions, it should come as no surprise that the sounds the League conjured up had more in common with 8-bit gaming soundtracks, albeit highly dosed and on a recombinant and aleatory West Coast trip, than with the kind of sounds the bigger mainframe computers were making. It was done by a group of individuals dedicated to the notion that computers and people could create their own independent networks, built at home from the circuit board up. Their music has as much in common with the lo-fi aesthetics of garage rock as it does with the pristine waveforms built from code at Bell Labs. The limitations in computer memory, the limits of space on the circuit board, and the haphazard way it all got connected to other components gave their music the flavor of strong homebrewed hooch. The sounds get the job done, and in their miasmic chaos, what comes out of the mess of wires is sublime.
The LAMC embraced their role as musical bricoleurs. According to Perkis, “We felt our work was more akin to that of our mentors and friends building gamelans (Lou Harrison and Bill Colvig), mechanical or electro-mechanical musical instruments (Tom Nunn, Chris Brown), or incorporating hacked versions of electrical and new electronic musical toys into their work (Paul DeMarinis, Laetitia Sonami), than to the contemporary institutional computer music. There was always the sense that the music arose out of the material situation, out of idiosyncratic individual players and the anarchic, ad-hoc arrangements they made.” Theirs was a mechanical musical conversation that ranged from noisy arguments to anarchic harmonies.
Their music was also steeped in the traditions of free improvisation that had developed on the West Coast. When they set up their systems, at Finnish Hall or in the living room of a bandmate, they didn’t set out to practice a certain song or a pre-composed piece of music; it was rather the ever-evolving, continual music of the patch in progress, the program in process, the new circuit being added to the mix, or the old circuit being mixed in a new way. Each member had a station of their own equipment, running their own programs, making their own sounds and contributing them to the spontaneous mix. The stations were set up in such a way that the microcomputers could send and receive information from each other, which is what made them a network band. The novel interactions of each new setup became the piece. It was composed, but it was spontaneous. With each new system setup the result was automatic.
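It is easier to picture how such a setup behaves with a small sketch. The Python script below is not a reconstruction of the League’s homebuilt rigs or their interconnects; it is only a loose modern analogue, with the port number, message format, and pitch logic all invented for illustration. Run several copies on one local network and each “station” broadcasts its note choices while bending its own behavior toward whatever it hears from the others.

```python
# A loose modern analogue of a "network band" station: each copy of this
# script broadcasts its own note choices over UDP and nudges its behavior
# toward whatever it hears from the other stations. The port number,
# message format, and pitch logic are all hypothetical illustrations.
import json
import random
import socket
import time

PORT = 9152                    # arbitrary, purely hypothetical
PENTATONIC = [0, 2, 4, 7, 9]   # scale degrees this station favors

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
if hasattr(socket, "SO_REUSEPORT"):
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEPORT, 1)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
sock.bind(("", PORT))
sock.setblocking(False)

station_id = random.randrange(10_000)
base_octave = random.choice([3, 4, 5])
heard = []  # recent pitches received from other stations

while True:
    # Listen for notes from the rest of the "band".
    try:
        while True:
            data, _ = sock.recvfrom(1024)
            msg = json.loads(data)
            if msg.get("station") != station_id:
                heard.append(msg["pitch"])
                heard = heard[-8:]          # keep only a short memory
    except BlockingIOError:
        pass

    # Choose a pitch: sometimes imitate a neighbor, sometimes wander off.
    if heard and random.random() < 0.5:
        pitch = random.choice(heard) + random.choice([-12, 0, 12])
    else:
        pitch = 12 * base_octave + random.choice(PENTATONIC)

    print(f"station {station_id} plays MIDI note {pitch}")
    sock.sendto(json.dumps({"station": station_id, "pitch": pitch}).encode(),
                ("<broadcast>", PORT))
    time.sleep(random.uniform(0.2, 1.0))
```

Even at this toy scale the point survives: no single station composes the result; the music is a property of the network.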
As with David Tudor and Pauline Oliveros, the main activity of the musician was listening. Making adjustments, tinkering with the system, then listening to what happened; then listening again, making new adjustments, tinkering some more, and listening once more in an endless cycle of discovery and surprise. When they noticed a setup that elicited sounds of beauty, or of a sublime alien strangeness, they took notes so they could try to realize that same musical state again. It was true experimental music, made in a laboratory they put together themselves.
In 1983 all the tinkering and gear hauling was beginning to take a toll on Jim Horton. He had been suffering from rheumatoid arthritis for some time and, in his way, endured the pain with stoic fortitude, pushing it to one side to continue living his Spartan artistic lifestyle. But it became too much. Eventually the human power supply running the operation had to be unplugged. The LAMC slowed down and then decided to disband. Yet the end of the LAMC wasn’t the end of what Jim and the others had started, but rather a new beginning. Tim Perkis and John Bischoff went on to try to bring a touch of order to the chaotic mess of wires, gadgets, and connections that had become their musical practice. They envisioned a standard interface with which they could more easily network their computers together. They achieved it, and it became the seed for their next project, The Hub.
.:. .:. .:.
Mumma’s early encounters with John Cage and David Tudor, and his work with them at the ONCE Festival and in other situations, primed him for his eventual work with the Merce Cunningham Dance Company. Merce Cunningham was one of the great American dance artists of the 20th century. Cunningham was born in Centralia, Washington in 1919. He started off learning tap dancing from a local teacher, honing his ear for rhythm and sense of timing from an early age. He later attended the Cornish School in Seattle from 1937 to study acting and mime, but didn’t take to it; dance was another matter. He loved the way dance could be ambiguous while also allowing for full expression of movement. Martha Graham saw him dance during this period and invited him to join her company. It was through Graham that Cunningham’s life intersected with Cage’s in something of a chance operation. Graham had needed a musical accompanist for her dancers. One of her pupils, Bonnie Bird, recommended the composer Lou Harrison, who declined but suggested in his place the young Cage. Cunningham and Cage met in 1938 and later became romantically involved, remaining life partners until Cage’s death in 1992. Cunningham sometimes played in Cage’s percussion group at the time, and they became quick friends. Over the subsequent years Merce loved to talk to John about ideas. As each of their personal situations evolved in art and life, Merce finally took the step of establishing his own dance company in 1953, and Cage came along for the ride as the company’s music director.
Cunningham’s company had many opportunities as it grew over the years. Cage’s own career brought more and more commitments of his own in the late 1960s and throughout the 1970s. As each pursued his vision, other musicians needed to step into the role of music director when Cage wasn’t available. Mumma and Behrman, among others, were natural choices, due to their friendship and affinity. Mumma states it was never very clear how he ended up working with the Cunningham Dance Company; it was something he just drifted into through these associations. In the 1960s and 1970s Cunningham’s troupe made increasing use of electronics, and this was an area where Mumma’s expertise could shine. He was a perfect fit, primed by his dedicated work as a creative composer, a cunning electronics technician, and someone for whom the collaborative mode was second nature. In his work with Cunningham’s troupe he got the chance to use all of these aspects of his character and put them to the test on tours that tried the endurance and dedication of everyone involved. The programs often involved collaborative music making and separate choreography, the latter determined by chance operations. The musicians were free to draw from their personal repertoires and combine them with original material.
Mesa
The first major piece Mumma wrote for Cunningham’s company was Mesa in 1966, for the dance Place. He was already working on something with David Tudor, who worked regularly in the company, when this came about. Instead of starting over he decided to alter the work in progress to accommodate the commission. Tudor had gotten into playing the Bandoneon, a relative of the accordion and squeezebox that had become popular in Argentina. It became the perfect instrument for Mesa because of its wide frequency and dynamic range.
The Bandoneon can also produce long sustained drones, just what Mumma needed for the monolith that was taking shape. Like the geological feature after which it is named, Mesa is a tectonic slab of music sustained at one level of thrust with occasional interruptions. He had thought of using tape for the piece, but the dynamic range he wanted couldn’t be contained on tape. That was one concept for the piece. The other was his desire to use “an inharmonic frequency spectrum with extremes of sound density.” In the performance space the placement of different portions of the sound in different loudspeakers creates a spatial diffusion; the final mixing of the sound happens in the ears of the listener.
To further extend the dynamic range of Tudor’s instrument and create the timbres he imagined, Mumma needed to design a circuit. The piece represented a creative problem and a technical challenge. His electronics needed to translate frequencies and equalize the sound, and required logic circuitry in tricky configurations to control musical continuity. It’s another composition where the circuit diagram and instructions are more the score than notated music. Mumma developed voltage-controlled attenuators (VCAs) in collaboration with Dr. William Ribbens in Ann Arbor; these extended the range while also including envelope controls. Ribbens is a Professor Emeritus of Electrical Engineering and Computer Science, and of Aerospace Engineering, at the University of Michigan.
In performance six microphones are attached to the Bandoneon, three on each side, each sensitive to a different frequency band. As a way of “thickening the plot,” and for other reasons, Mumma fed one microphone from each side into the other side of the circuit. Six channels of sound from one instrumental source are processed to create this massive piece. Using a logic circuit Mumma was able to route control signals and program signals to different channels during performance. A frequency shifter with equalization processed parts of the sound, determined either by internal control signals or by Tudor’s playing of the Bandoneon; the logic circuit itself determined the source and nature of the control signal. Mumma used a multiplier to take portions of the spectrum and transform them by whole integers to further equalize the sounds. Phase and amplitude modulators also work on portions of the sound, gating parts of the spectrum transfer with the output from the multiplier. Further gates, formant modulators, passband filters, and other baroque electrical wizardry were also built into the circuit score of Mesa. In creating the piece he was setting up a cybersonic system. The VCAs also included delays that further shaped the envelope of the program signal. Mumma wanted very specific delays that were not possible either through electronic manipulation or from a mechanical source such as a tape delay. Mumma writes, “the solution to this problem is inherent in the concept of MESA itself, since at this point in the system it is the envelope of the otherwise sustained sounds which is to be shaped. This is achieved by subjecting the VCA control signals to frequency-sensitive thermal-delay circuitry. The wide dynamic range of the VCA is due to special bias procedures.” Every control signal for sound modification first comes from the Bandoneon.
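The core of that idea is easier to see in a toy digital form. The sketch below is not Mumma’s circuit; it is only a rough software analogue of one strand of it, in which a control signal derived from the sound (an envelope follower) is fed back to rescale that same sound, the way his VCAs took their control voltages from the Bandoneon itself. The sample rate, time constants, and test tone are all invented for illustration.

```python
# A rough software analogue of one strand of Mesa's "cybersonic" idea:
# a control signal (an envelope follower) is derived from the sound
# itself and then used to rescale that same sound, here as a crude
# dynamic-range expander. All parameters are invented for illustration;
# this is not a reconstruction of Mumma's circuitry.
import math

SAMPLE_RATE = 48_000

def envelope_follower(signal, attack=0.01, release=0.2):
    """Track the amplitude envelope of the signal with a one-pole follower."""
    a_coef = math.exp(-1.0 / (attack * SAMPLE_RATE))
    r_coef = math.exp(-1.0 / (release * SAMPLE_RATE))
    env, out = 0.0, []
    for x in signal:
        level = abs(x)
        coef = a_coef if level > env else r_coef
        env = coef * env + (1.0 - coef) * level
        out.append(env)
    return out

def cybersonic_gain(signal):
    """Rescale the signal by a control voltage derived from the signal itself."""
    control = envelope_follower(signal)
    # Multiplying the sound by its own envelope pushes quiet passages
    # quieter and loud passages louder - a toy stand-in for the
    # wide-range voltage-controlled attenuators described above.
    return [x * c for x, c in zip(signal, control)]

if __name__ == "__main__":
    # A stand-in for the sustained Bandoneon tone: a 220 Hz sine that
    # swells from soft to loud over two seconds.
    n = int(2.0 * SAMPLE_RATE)
    tone = [math.sin(2 * math.pi * 220 * i / SAMPLE_RATE) *
            (0.1 + 0.9 * i / n) for i in range(n)]
    shaped = cybersonic_gain(tone)
    window = int(0.2 * SAMPLE_RATE)
    for name, sig in (("input", tone), ("output", shaped)):
        soft = max(abs(x) for x in sig[:window])
        loud = max(abs(x) for x in sig[-window:])
        print(f"{name:6s} soft/loud ratio: {loud / soft:.1f}")
```

Running it roughly squares the soft-to-loud ratio of the swelling test tone, a crude echo of the widened dynamic range Mumma was after.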
As Mumma put it, “Because the control signals are automatically derived from the sound materials themselves, I call the process, and the music, ‘cybersonic.’” What Mumma created in Mesa is a situation where the Bandoneonist can play a duet with a piece of electronic circuitry. In performance a third person, most often Mumma, tweaks the circuit live, overriding parts of the internal logic with an artist’s intuition.
Telepos
One of the pieces by Mumma that Cunningham used in a variety of settings, including TV Rerun, was Telepos (1972). For this he made belts to be worn by the dancers that contained small accelerometers, devices that measure vibration and acceleration in motion. The belts were also equipped with voltage-controlled oscillators and a miniature UHF transmitter. Inspired by telemetry, the transmission of device data read remotely at a different point of reception, the dancers made music with their movements “in a process similar to that encountered in space travel, undersea, or biomedical research.”
REUNION
Mumma worked with the group for seven full seasons and also collaborated on works with individuals from the circle. He also continued to work with Cage. One such instance was the creation of a soundtrack to an electronic game of chess. Reunion was a big piece conceived by John Cage as a chess game to be played between himself and Marcel Duchamp, with a second match against Teeny Duchamp. It had a collaborative musical element performed by Gordon Mumma, David Behrman, and David Tudor on an electronic chessboard designed by Lowell Cross. The chessboard controlled certain aspects of the live electronic music.
Cage had first met Marcel in the early 1940s when they were both in New York, but the meeting had been awkward, due to a blowup between Cage and Peggy Guggenheim, who had first introduced them. At that time Cage and his then-wife Xenia were being put up by Peggy after they had moved from Chicago. Cage took a gig at the Museum of Modern Art while he also had an engagement at Peggy’s new art gallery; she felt snubbed, thinking the MoMA show stole the spotlight from her own presentation of his music in the city. At the time Cage was so in awe of Duchamp that he didn’t want to disturb him, but simply enjoyed being in his presence. In the winter of 1965-66 Cage’s circle and Duchamp’s overlapped again and they found themselves at the same parties. Cage had long been an admirer of Duchamp and they shared a number of sensibilities, one appreciating readymade objects, the other the readymade music of sounds occurring everywhere in life. He wanted to reacquaint himself with Duchamp but wasn’t sure how to go about it, until he asked Teeny if she thought Duchamp would tutor him in chess. She said to ask the man himself, and when he got up the gumption to do so, Duchamp said yes. Cage started to meet with Duchamp once a week to learn the game, and other social visits followed, including vacationing with the couple in Spain. Though he had used chess as a ruse to get to know the artist he admired, Cage was fascinated with the game and became a serious player. More often than not he lost to his teacher, who had played chess for decades. In 1968 the idea for Reunion was hatched. According to Mumma it “descended upon us at the same time,” and its exact source was obscured amongst the collaborators. At the time Cage was very interested in expanding the circle of people with whom he collaborated beyond the musicians and electronic pioneers who had clustered around him and Cunningham.
Lowell Cross was one of the people Cage was interested in working with. At the time Cross was writing a thesis exploring the history of electronic music and electronic music studios between 1948 and 1953, and Cage played a large role in it. Cross was studying media and society under Marshall McLuhan at the University of Toronto, along with ethnomusicology with Mieczysław Koliński and electronic music with Gustav Ciamaga and Myron Schaeffer. Cage had been interested in Lowell’s work as an instrument builder, and knew about his device called the Stirrer, a panning system for moving up to four sounds in space which Cross had created between 1963 and 1965. Cage called him in February of 1968 and asked if he could build an electronic chessboard capable of selecting and diffusing sounds around an audience in a concert hall as a game unfolded. Cross at first declined, politely, because he was swamped with his work at school. Cage then made his move and said, “Perhaps you will change your mind if I tell you who my chess partner will be.” When Duchamp’s name was dropped it was enough to persuade the assiduous student to get even busier and build what would become the 16-input, 8-output chessboard used in the subsequent performance. The chessboard had sensors that triggered the electronic music being produced by the musicians according to the way the pieces were moved; the musical outcome was beyond the control of the performers, who each had their own systems and setups feeding into the mix. The board was also equipped with contact microphones that picked up the movement of the pieces (a toy sketch of this kind of gated routing follows at the end of this section). At the performance on March 5th, which kicked off the “Sightsoundsystems” performance series organized by composer Udo Kasemets, the chess players sat and smoked cigarettes and drank wine while the musicians made electronic sounds. The performance lasted for four hours and was a celebration of everyday life as a form of art. Marshall McLuhan was noted to have been in the audience.
It was these kinds of collaborative group situations that Mumma found himself drawn to, and a part of, over and over again. Mumma’s talent as a composer, player, electronics specialist, and creative thinker made him an invaluable asset to all the groups and milieus he circulated within and between.
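Cross’s schematics aren’t reproduced here, and the sketch below is only a guess at the general idea behind such a board: a routing matrix in which the occupancy of squares gates which of the sixteen input channels reach which of the eight outputs. The square-to-patch mapping, the board state, and every name in the code are hypothetical.

```python
# A hypothetical sketch of the general idea behind a sound-routing
# chessboard: square occupancy gates which of 16 input channels reach
# which of 8 loudspeaker outputs. The mapping and the board state are
# invented; this is not a reconstruction of Lowell Cross's design.
import random

NUM_INPUTS = 16    # one per musician's sound source, as described above
NUM_OUTPUTS = 8    # loudspeakers around the hall

# Assign each of the 64 squares an (input, output) pair, so that moving
# a piece onto or off a square opens or closes one patch point.
random.seed(1968)
PATCH_MAP = {
    (file, rank): (random.randrange(NUM_INPUTS), random.randrange(NUM_OUTPUTS))
    for file in range(8) for rank in range(8)
}

def routing_matrix(occupied_squares):
    """Return a 16x8 matrix of open (True) or closed (False) patch points."""
    matrix = [[False] * NUM_OUTPUTS for _ in range(NUM_INPUTS)]
    for square in occupied_squares:
        inp, out = PATCH_MAP[square]
        matrix[inp][out] = True
    return matrix

if __name__ == "__main__":
    # Opening position: both back ranks and both pawn ranks occupied.
    occupied = {(f, r) for f in range(8) for r in (0, 1, 6, 7)}
    open_points = sum(cell for row in routing_matrix(occupied) for cell in row)
    print(f"{open_points} of {NUM_INPUTS * NUM_OUTPUTS} patch points open")

    # A single move (a pawn from e2 to e4) changes the routing.
    occupied.discard((4, 1))
    occupied.add((4, 3))
    open_points = sum(cell for row in routing_matrix(occupied) for cell in row)
    print(f"after one move: {open_points} patch points open")
```

Whatever the real wiring was, the point of the passage above comes through: each move reconfigures the routing, so the musicians’ sounds reach the hall in ways none of the performers fully control.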
.:. .:. .:.
Read the rest of the Radiophonic Laboratory: Telecommunications, Electronic Music, and the Voice of the Ether.
Justin Patrick Moore is the author of The Radio Phonics Laboratory: Telecommunications, Speech Synthesis, and the Birth of Electronic Music.