Stockhausen picked up his interest in information theory by way of Werner Meyer-Eppler during his student days at the University of Bonn. Meyer-Eppler was something of a scientific polymath, having studied mathematics, chemistry, and physics at the University of Cologne in the late 1930s before moving to Bonn, where he became a scientific assistant in the physics department and then a lecturer in experimental physics. After WWII ended, his attention turned with laser-beam focus to phonetics and speech synthesis. In 1947 Paul Menzerath brought him onto the faculty of the Phonetic Institute of the University of Bonn. It was in this period that Meyer-Eppler began publishing essays on the Voder, the Vocoder, and the Visible Speech Machine. One of his contributions to the field still in use today is his work on the development of the electrolarynx.
Information theory made contributions to many fields. Linguistics was one of them: the theory proved influential in studying how frequently words are used, how long they are, and how quickly they can be read. Shannon tested the information-theoretic principle of redundancy, the amount of "wasted" space used to transmit a message, by having his wife predict the repeating letters in a crime novel pulled at random off his bookshelf. Sometimes redundancy is better when it comes to getting a message across: redundancy is added while communicating over noisy channels as a method of error correction.
Shannon had the insight that this was a baked-in purpose behind the repetition of letters in English. He also showed that he could use stochastic processes to build something resembling the English language from scratch.
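That stochastic approach is easy to illustrate in a few lines of Python: build letter-pair statistics from a small sample text, then generate pseudo-English by a random walk over those statistics. This is a minimal first-order sketch with a made-up sample text, not Shannon's actual procedure (he worked his approximations out by hand from printed books).

```python
import random
from collections import defaultdict

# Illustrative sample text (an assumption, not Shannon's corpus).
sample = (
    "the quick brown fox jumps over the lazy dog and the "
    "communication of information depends on the statistics "
    "of the language in which the message is written"
)

# Record which letters follow each letter in the sample.
followers = defaultdict(list)
for a, b in zip(sample, sample[1:]):
    followers[a].append(b)

def generate(length, seed="t"):
    """Generate pseudo-English by a first-order Markov walk."""
    out = [seed]
    for _ in range(length - 1):
        prev = out[-1]
        # Fall back to a space if a letter never appeared as a prefix.
        out.append(random.choice(followers.get(prev, [" "])))
    return "".join(out)

print(generate(60))
```

The output is gibberish, but gibberish with the letter-to-letter texture of English, which was exactly Shannon's point.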
Meyer-Eppler had been following these developments in information theory with special attention to their applications in linguistics and speech. Later, in the 1950s, he became concerned with how statistics and probability, core tools of information theory, might be applied to creating electronic music, a question explored in his book Statistic and Psychologic Problems of Sound. In this work Meyer-Eppler introduced the word "aleatoric" into the musical lexicon. According to his definition, "a process is said to be aleatoric ... if its course is determined in general but depends on chance in detail".
Aleatoric music is made when some element of the composition is left to chance or when a significant portion of how the composition is realized is left up to the performer or performers. Aleatoric composition has a precedent in the dice games of the 18th century. The word itself comes from the Latin alea, meaning dice.
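Those 18th-century dice games let chance fill a fixed form: a table offered several pre-composed variants for each measure of a waltz, and dice rolls chose among them, so the course was "determined in general" but "depended on chance in detail." A toy sketch, with hypothetical placeholder measure labels rather than any historical table:

```python
import random

# A toy musical dice game. The labels ("m1v3" = measure 1, variant 3)
# are invented placeholders, not an actual 18th-century table.
NUM_MEASURES = 16
VARIANTS_PER_MEASURE = 11   # two dice give sums 2..12: 11 outcomes

def roll_two_dice():
    return random.randint(1, 6) + random.randint(1, 6)

def compose_waltz():
    """Pick one variant for each measure slot by rolling two dice."""
    piece = []
    for measure in range(NUM_MEASURES):
        variant = roll_two_dice() - 2          # map sums 2..12 -> 0..10
        piece.append(f"m{measure + 1}v{variant}")
    return piece

print(compose_waltz())
```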
There are many methods for applying aleatoric processes to music. One way Stockhausen tackled it was through polyvalent structure: writing a piece open to a number of different interpretations. His piano piece Klavierstück XI is an example.
The piece is made up of 19 fragments printed on a single very large sheet of paper; there is no turning of the sheet music. The pianist may start with any fragment they wish and from there continue on to any other fragment they wish to play. It is polyvalent because each performance can begin and end in new places. There is no set musical narrative; it is more like reading a choose-your-own-adventure story, or wandering through a maze or labyrinth which the pianist enters, circumnavigates, and then exits. Each time, the pianist may enter the labyrinth through a new entrance and, likewise, reemerge in a different place.
The pianist shares responsibility with the composer for the eventual shape of any given performance. The possible permutations are vast, yet across different interpretations it may still be heard as the same piece of music, its essential characteristics remaining the same no matter the order in which the fragments are played.
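A rough feel for that vastness can be sketched in Python. One commonly cited performance rule, assumed here, is that the piece ends once any fragment has been arrived at a third time; and even the strict orderings of 19 distinct fragments played once each already number 19!, about 1.2 × 10^17.

```python
import random
import math

# A sketch of the polyvalent form: 19 fragments, any starting point,
# any continuation. The ending rule (third arrival at any fragment)
# is an assumption drawn from common descriptions of the piece.
FRAGMENTS = list(range(1, 20))   # fragments 1..19

def perform():
    """Simulate one random walk through the fragments."""
    plays = {f: 0 for f in FRAGMENTS}
    sequence = []
    while True:
        fragment = random.choice(FRAGMENTS)
        plays[fragment] += 1
        sequence.append(fragment)
        if plays[fragment] == 3:   # third arrival ends the piece
            return sequence

# Orderings of 19 distinct fragments played once each: 19!
print(math.factorial(19))
print(perform())
```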
Commenting on the piece the composer said, "Piano Piece XI is nothing but a sound in which certain partials, components, are behaving statistically... If I make a whole piece similar to the ways in which (a complex noise) is organized, then naturally the individual components of this piece could also be exchanged, permutated, without changing its basic quality."
Considered as a whole, Piece XI will sound the same even though every performance will sound different. It is a system unto itself, and as a system, even when the order of its component parts is rearranged it is still the same system and will still sound like itself. Listened to statistically, the musical values remain the same.
Stockhausen would go on to use polyvalent form again and again. In his 1959 percussion piece Zyklus (Cycle) the score is printed as a spiral, and the performer may start anywhere within the spiral they choose. Furthermore, they may play the piece from left to right or from right to left. The piece is finished when the player reaches the original starting point. In the performance space the cycle is mirrored visually, with the percussion instruments laid out in a circle and the performer moving around them in the manner determined by the chosen starting point.
Zyklus also shows the remarkable diversity of possible interpretations demonstrated before in Piece XI. The interpretation of the score is, however, somewhat more closed. On one side of the spiral the music becomes increasingly aleatoric, giving the player more freedom in how it is interpreted; on the other side the composition is exactly fixed and predetermined. Played one way it moves from fixed to open, and in the other direction from open to fixed.
Stockhausen was obsessed with cycles, specifically cycles of time. His mid-seventies composition Tierkreis (Zodiac) consists of twelve melodies, one for each of the twelve zodiac signs. Originally written for custom-made music boxes, Tierkreis can be played on any melody instrument and performed in a number of different ways. For the purposes of this discussion, a complete performance begins with the melody for the zodiac sign of the day on which the performance is held. For instance, if the performance were held on August 22 the performers would begin with the Leo melody and proceed through Virgo, Libra, and the rest until they return to the starting melody of Leo. Each melody is played at least three times and may be improvised upon, giving considerable variation to individual performances. Further variations are specified by the composer.
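That date-to-melody scheme is easy to sketch. The zodiac boundary dates below are approximate (sources differ by a day or so), and the code is only an illustration of the cycle, not a performance score:

```python
ZODIAC = ["Capricorn", "Aquarius", "Pisces", "Aries", "Taurus", "Gemini",
          "Cancer", "Leo", "Virgo", "Libra", "Scorpio", "Sagittarius"]

# Approximate last day of each sign (assumed boundaries; they shift
# by a day or so depending on the source and year).
SIGN_ENDS = [(1, 19, "Capricorn"), (2, 18, "Aquarius"), (3, 20, "Pisces"),
             (4, 19, "Aries"), (5, 20, "Taurus"), (6, 20, "Gemini"),
             (7, 22, "Cancer"), (8, 22, "Leo"), (9, 22, "Virgo"),
             (10, 22, "Libra"), (11, 21, "Scorpio"), (12, 21, "Sagittarius"),
             (12, 31, "Capricorn")]

def sign_for(month, day):
    """Return the zodiac sign for a given month and day."""
    for m, d, sign in SIGN_ENDS:
        if (month, day) <= (m, d):
            return sign

def performance_order(month, day):
    """Start at the day's sign, cycle through all twelve melodies,
    and return to the starting melody."""
    start = ZODIAC.index(sign_for(month, day))
    cycle = ZODIAC[start:] + ZODIAC[:start]
    return cycle + [cycle[0]]

print(performance_order(8, 22))   # an August 22 performance
```

For August 22 this begins with Leo, proceeds through Virgo and the rest, and closes on Leo again, matching the example above.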
In his chamber opera Sirius written a few years later the Tierkreis melodies are employed again in a section of the piece called The Wheel. Here the music may be heard in four different ways, depending on the season it is performed. If played in the Winter the section starts with the melody for Capricorn, if in the Spring with Aries, Summer starts with Cancer, and Autumn with Libra.
In all of these cyclical works an echo of the tape loop may be heard. Stockhausen had worked with tape loops extensively in his piece Kontakte, using them to show relationships between pitch, timbre, and the way musical events can be perceived in time and space through the process of speeding things up or slowing them down. I wonder whether, beyond Stockhausen's strong grounding in religion, philosophy, and science, the eternal return and recurrence of the tape loop at all framed his cosmic conception of the vast cycles of time.
The cycles continued in his magnum opus LICHT (Light): Die sieben Tage der Woche (The Seven Days of the Week). Written between 1977 and 2003, it is a cycle of seven operas, one for each day of the week. Stockhausen described the work as an "eternal spiral," considering there to be "neither end nor beginning to the week." Clocking in at a total duration of 29 hours, the piece holds deft intricacies on both micro and macro scales, and many volumes have already been and will continue to be written about it. Within the broad palette afforded by an opera cycle longer than Wagner's Ring, Stockhausen was able to play the role of a Magister Ludi, or master of the Glass Bead Game. LICHT is a system, and within that system Stockhausen playfully and masterfully displayed, with pyrotechnic virtuosity, a comprehensive knowledge of the combinatorial and permutative arts as applied to music.
These arts of combination were a central component of the Glass Bead Game as played in the novel.
To show how all of these interlocking parts fit together the basic structure of the opera must be examined. And to understand LICHT as a system a slight change of lanes onto the parallel track of Norbert Wiener and his theory of cybernetics is in order.
Read the rest of the Radiophonic Laboratory Series.
From the ice-cold farms and fields of Michigan to the halls of MIT and then onward to Bell Labs at Murray Hill, Claude Shannon was a mathematical maverick and inveterate tinkerer. In the 1920s, in those places where the phone company had not deigned to bring its network, around three million farmers built their own by connecting telegraph keys to the barbed-wire fences that stretched between properties. As a young boy Shannon rigged up one of these "farm networks" so he and a friend who lived half a mile away could talk to each other at night in Morse code. He was also the local kid the people in town would bring their radios to when they needed repair, and he got them to work. He had the knack.
He also had an aptitude for the more abstract side of math, and his mind could handle complex equations with ease. At seventeen he was already in college at the University of Michigan and had published his first work in an academic journal: a solution to a math problem presented in the pages of the American Mathematical Monthly. He double-majored and graduated with degrees in electrical engineering and mathematics, then headed off to MIT for his master's.
While there he came under the wing of Vannevar Bush. Bush had followed in the footsteps of Lord Kelvin, who had created one of the world's first analog computers, the harmonic analyzer, used to measure the ebb and flow of the tides. Bush's differential analyzer was a huge electromechanical computer the size of a room. It solved differential equations by integration, using wheel-and-disc mechanisms to perform the integration.
At school he was also introduced to the work of the mathematician George Boole, whose 1854 book on algebraic logic, The Laws of Thought, laid down some of the essential foundations for the creation of computers. Boole had in turn taken up the system of logic developed by Gottfried Wilhelm Leibniz. Might Boole have also been familiar with Leibniz's book De Arte Combinatoria? In it Leibniz proposed an alphabet of human thought, and was himself inspired by the Ars Magna of Ramon Lull. Leibniz wanted to take the Ars Magna, or "ultimate general art," developed by Lull as a debating tool that helped speakers combine ideas through a compilation of lists, bring it closer to mathematics, and turn it into a kind of calculus. Shannon became the inheritor of these strands of thought through their development in the mathematics and formal logic that became Boolean algebra.
Between working with Bush's differential analyzer and his study of Boolean algebra, Shannon was able to design switching circuits. This became the subject of his 1937 master's thesis, A Symbolic Analysis of Relay and Switching Circuits.
Shannon was able to prove that his switching circuits could be used to simplify the complex and baroque system of electromechanical relays used in AT&T's routing switches. He then expanded his concept and showed that his circuits could solve any Boolean algebra problem. He finalized the work with a series of circuit diagrams.
In writing his paper Shannon took George Boole’s algebraic insights and made them practical. Electrical switches could now implement logic. It was a watershed moment that established the integral concept behind all electronic digital computers. Digital circuit design was born.
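The correspondence can be sketched in a few lines: treating a closed contact as 1 and an open one as 0, contacts in series behave like AND and contacts in parallel like OR. (Shannon's thesis actually used the dual "hindrance" convention; this is the modern reading.) The two-way light switch below is an illustrative circuit of my own choosing, not one from the thesis.

```python
# Relay contacts as Boolean values: 1 = closed, 0 = open.

def series(a, b):
    """Current flows only if both contacts are closed: AND."""
    return a & b

def parallel(a, b):
    """Current flows if either contact is closed: OR."""
    return a | b

def two_way_switch(x, y):
    """A hallway light on two switches: on when exactly one is
    thrown (XOR), built purely from series/parallel contacts."""
    # x AND (NOT y), in parallel with (NOT x) AND y
    return parallel(series(x, 1 - y), series(1 - x, y))

for x in (0, 1):
    for y in (0, 1):
        print(x, y, "->", two_way_switch(x, y))
```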
Next he had to get his PhD. It took him three more years, and his subject matter showed the first signs of the multidisciplinary inclination that would later become a dominant feature of information theory. Vannevar Bush compelled him to go to Cold Spring Harbor Laboratory to work on a dissertation in the field of genetics. For Bush the logic was that if Shannon's algebra could work on electrical relays, it might also prove valuable in the study of Mendelian heredity. His research in this area resulted in An Algebra for Theoretical Genetics, for which he received his PhD in 1940.
The work proved too abstract to be useful, and during his time at Cold Spring Harbor he was often distracted. In a letter to his mentor he wrote, "I've been working on three different ideas simultaneously, and strangely enough it seems a more productive method than sticking to one problem… Off and on I have been working on an analysis of some of the fundamental properties of general systems for the transmission of intelligence, including telephony, radio, television, telegraphy, etc…"
With a doctorate under his belt, Shannon went on to the Institute for Advanced Study in Princeton, New Jersey, where his mind was able to wander across disciplines and where he rubbed elbows with other great minds, including, on occasion, Albert Einstein and Kurt Gödel. He discussed science, math, and engineering with Hermann Weyl and John von Neumann. All of these encounters fed his mind.
It wasn't long before Shannon moved elsewhere in New Jersey, to Bell Labs. There he got to work alongside other great minds such as Thornton Fry and Alan Turing. His prodigious talents were also put to work for the war effort.
It started with a study of noise. During WWII Shannon had worked on the SIGSALY system that was used for encrypting voice conversations between Franklin D. Roosevelt and Winston Churchill. It worked by sampling the voice signal fifty times a second, digitizing it, and then masking it with a random key that sounded like the circuit noise so familiar to electrical engineers.
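Schematically, SIGSALY-style masking works like a one-time pad on quantized samples: add a random key value to each sample, modulo the number of quantization levels, and the result is indistinguishable from noise without the key. The sketch below is a simplified illustration, not the real system; the six-level, modulo-6 arithmetic is an assumption drawn from common accounts of SIGSALY.

```python
import random

LEVELS = 6   # assumed six-level quantization, as in accounts of SIGSALY

def mask(samples, key):
    """Add the key modulo LEVELS: the transmitted 'noise'."""
    return [(s + k) % LEVELS for s, k in zip(samples, key)]

def unmask(masked, key):
    """Subtract the same key modulo LEVELS to recover the samples."""
    return [(m - k) % LEVELS for m, k in zip(masked, key)]

voice = [3, 5, 0, 2, 4, 1, 1, 5]                  # quantized voice samples
key = [random.randrange(LEVELS) for _ in voice]   # shared random key

ciphertext = mask(voice, key)
assert unmask(ciphertext, key) == voice           # round trip succeeds
print(ciphertext)
```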
Shannon hadn’t designed the system, but he had been tasked with trying to break it, like a hacker, to see what its weak spots were, to find out if it was an impenetrable fortress that could withstand the attempts of an enemy assault.
Alan Turing was also working at Bell Labs on SIGSALY. The British had sent him over to make sure the system was secure; if Churchill was going to communicate over it, it needed to be uncrackable. During the war effort Turing got to know Shannon. The two weren't allowed to talk about their top-secret projects, cryptography, or anything related to their efforts against the Axis powers, but they had plenty of other things to talk about, and they explored their shared passions: math, and the idea that machines might one day be able to learn and think.
Are all numbers computable? This was a question Turing asked in his famous 1937 paper On Computable Numbers, which he had shown to Shannon. In it Turing defined calculation as a mechanical procedure, or algorithm.
This paper got the pistons in Shannon’s mind firing. Alan had said, “It is always possible to use sequences of symbols in the place of single symbols.” Shannon was already thinking of the way information gets transmitted from one place to the next. Turing used statistical analysis as part of his arsenal when breaking the Enigma ciphers. Information theory in turn ended up being based on statistics and probability theory.
The meeting of these two preeminent minds was just one catalyst for the creation of the large field and sandbox of information theory. Important legwork had already been done by other investigators who had made brief excursions into the territory later mapped out by Shannon.
Telecommunications already contained within it many ideas that would later become part of the theory's core. Starting with telegraphy and Morse code in the 1830s, common letters were expressed with the shortest codes, as in E, a single dot, while letters used less often got longer expressions, such as B, a dash and three dots. The whole idea of lossless data compression is embedded as a seed pattern within this system of encoding information.
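That compression idea hiding in Morse code can be made concrete: weight each letter's code length by how often the letter occurs, and the variable-length scheme beats a fixed-length one on average. The letter frequencies below are illustrative approximations, and only a handful of letters are included.

```python
# Morse codes for a few letters: frequent letters get short codes.
MORSE = {"e": ".", "t": "-", "a": ".-", "b": "-...", "q": "--.-", "z": "--.."}

# Approximate relative frequencies in English text (illustrative).
FREQ = {"e": 0.127, "t": 0.091, "a": 0.082, "b": 0.015, "q": 0.001, "z": 0.001}

# Expected dots-and-dashes per letter under the variable-length code,
# versus a fixed-length code spending four symbols on every letter.
total = sum(FREQ.values())
avg_morse = sum(FREQ[c] / total * len(MORSE[c]) for c in MORSE)
fixed = 4

print(f"average Morse symbols per letter: {avg_morse:.2f}")
print(f"fixed-length symbols per letter:  {fixed}")
```

Because E and T dominate the frequency table, the weighted average lands well under the fixed-length cost, the same principle later formalized in Huffman coding.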
In 1924 Harry Nyquist published Certain Factors Affecting Telegraph Speed in the Bell System Technical Journal. Nyquist's research was focused on increasing the speed of a telegraph circuit. One of the first things an engineer runs into when working on this problem is how to transmit the maximum amount of intelligence over a given range of frequencies without causing interference in the circuit or in others it might be connected to. In other words, how do you increase the speed and amount of intelligence without adding distortion or noise, or creating spurious signals?
In 1928 Ralph Hartley, also at Bell Labs, wrote his paper Transmission of Information. He made it explicit that information was a measurable quantity: information could only reflect the ability of the receiver to distinguish that one sequence of symbols had been intended by the sender rather than any other, that the letter A means A and not E.
Jump forward another decade to the invention of the vocoder. It was designed to use less bandwidth, compressing the voice of the speaker into less space. That same technology is now used in cellphones as codecs that compress the voice so more lines of communication can fit on the phone companies' allocated frequencies.
WWII had a way of producing scientific side effects, discoveries that would break through to affect civilian life after the war. While Shannon worked on SIGSALY and other cryptographic projects he continued to tinker on ideas of his own. His landmark paper on communication was one of the things he tinkered at, and it had profound side effects. Twenty years after Hartley addressed the way information is transmitted, Shannon stated it this way: "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
In addition to the idea of clear communication across a channel, information theory also brought the following ideas into play:
- The bit, or binary digit. One bit is the information entropy of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known.
- The Shannon limit: a formula for channel capacity. This is the speed limit for a given communication channel.
- Within that limit there must always be techniques for error correction that can overcome the noise level on a given channel. A transmitter may have to send more bits to a receiver at a slower rate, but eventually the message will get there.
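The "speed limit" above has a famous closed form, the Shannon-Hartley theorem: C = B log2(1 + S/N), where B is the bandwidth in hertz and S/N the signal-to-noise power ratio. Below capacity, error-correcting codes can in principle drive the error rate arbitrarily low; above it, they cannot. A quick sketch with illustrative telephone-line numbers (assumed, not a spec):

```python
import math

def capacity_bits_per_sec(bandwidth_hz, snr_ratio):
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_ratio)

# A telephone-like channel: ~3 kHz of bandwidth, 30 dB SNR
# (a power ratio of 1000) -- illustrative numbers only.
snr = 10 ** (30 / 10)
print(f"{capacity_bits_per_sec(3000, snr):.0f} bits per second")
```

With these numbers the limit comes out near 30,000 bits per second, which is roughly where late dial-up modems topped out.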
His theory was a strange attractor in a chaotic system of noisy information. Noise itself tends to bring diverse disciplinary approaches together, interfering in their constitution and their dynamics. Information theory, in transmitting its own intelligence, has in its own way, interfered with other circuits of knowledge it has come in contact with.
A few years later the psychologist and computer scientist J. C. R. Licklider said, "It is probably dangerous to use this theory of information in fields for which it was not designed, but I think the danger will not keep people from using it."
Information theory encompasses every other field it can get its hands on. It’s like a black hole, and everything in its gravitational path gets sucked in. Formed at the spoked crossroads of cryptography, mathematics, statistics, computer science, thermal physics, neurobiology, information engineering, and electrical engineering it has been applied to even more fields of study and practice: statistical inference, natural language processing, the evolution and function of molecular codes (bioinformatics), model selection in statistics, quantum computing, linguistics, plagiarism detection. It is the source code behind pattern recognition and anomaly detection, two human skills in great demand in the 21st century.
I wonder if Shannon knew when he wrote 'A Mathematical Theory of Communication' for the 1948 issue of the Bell System Technical Journal that his theory would go on to unify, fragment, and spin off into multiple disciplines and fields of human endeavor, music just one among a plethora.
Yet music is a form of information; it is always in formation. And information can be sonified and used to make music: raw data becomes audio dada. Music is communication, and one way of listening to it is as a transmission of information. The principles Shannon elucidated are a form of noise in the systems of world knowledge, highlighting one way of connecting different fields of study. As information theory exploded it was quickly picked up as a tool by the more adventurous music composers.
Information theory could be at the heart of making Hermann Hesse's fictional Glass Bead Game a reality. Hesse dropped several hints and clues in his work connecting it with the same thinkers whose work served as a link to Boolean algebra, namely Athanasius Kircher, Lull, and Leibniz, all practitioners and advocates of the mnemonic and combinatorial arts. Like those predecessors, information theory is well suited to connecting the spaces between different fields. In Hesse's masterpiece the game was created by a musician as a way of "represent[ing] with beads musical quotations or invented themes, could alter, transpose, and develop them, change them and set them in counterpoint to one another." After some time passed the game was taken up by mathematicians. "…the Game was so far developed it was capable of expressing mathematical processes by special symbols and abbreviations. The players, mutually elaborating these processes, threw these abstract formulas at one another, displaying the sequences and possibilities of their science."
Hesse goes on to explain, “At various times the Game was taken up and imitated by nearly all the scientific and scholarly disciplines, that is, adapted to the special fields. There is documented evidence for its application to the fields of classical philology and logic. The analytical study had led to the reduction of musical events to physical and mathematical formulas. Soon after philology borrowed this method and began to measure linguistic configurations as physics measured processes in nature. The visual arts soon followed suit, architecture having already led the way in establishing the links between visual art and mathematics. Thereafter more and more new relations, analogies, and correspondences were discovered among the abstract formulas obtained this way.”
In the next sections I will explore the way information theory was used and applied in the music of Karlheinz Stockhausen.
A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni and Rob Goodman, Simon & Schuster, 2018
The Information: a history, a theory, a flood by James Gleick, Pantheon, 2011
The Glass Bead Game by Hermann Hesse, translated by Clara and Richard Winston, Holt, Rinehart and Winston, 1990
Information Theory and Music by Joel Cohen, Behavioral Science, 7:2
Information Theory and the Digital Age by Aftab, Cheung, Kim, Thakkar, Yeddanapudi
Logic and the art of memory: the quest for a universal language, by Paolo Rossi, The Athlone Press, University of Chicago, 2000.
Justin Patrick Moore
Husband. Father/Grandfather. Writer. Green wizard. Ham radio operator (KE8COY). Electronic musician. Library cataloger.