History of AI
4. Generative Modelling
4.1 Artificial Life
In the mid-1980s a transformative shift marked the end of AI's pioneering era. Cybernetics, once a dominant science, gradually faded, making room for a new epoch characterised by advances in computational power, early graphical applications and synthetic sound production. Amid this changing landscape, interest in neural networks resurged, propelled by their capacity to handle increasingly intricate configurations and building on the foundations laid by Rosenblatt's perceptron.
Simultaneously, rule-based methodologies gained attention, driven by the development of expert systems designed to encapsulate human expertise within logical frameworks. Inspired by cybernetics, complexity theory and the broader domain of AI, a nascent field emerged: artificial life (ALife). ALife sought to emulate vital processes computationally, employing a bottom-up approach that built intricate systems from elemental components. It frequently simulated evolutionary and adaptive processes, which were inherently difficult to capture with top-down approaches that deconstruct systems into constituent parts for analysis. The field drew inspiration from John Conway's seminal Game of Life, a cellular automaton introduced in Scientific American in 1970 that laid the groundwork for evolutionary algorithms.
Video: "Epic Conway's Game of Life".
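The automaton's rules fit in a few lines of code. Below is a minimal Python sketch (grid size, seed pattern and all names are our own, purely illustrative): each cell lives or dies depending on how many of its eight neighbours are alive.

```python
# Minimal sketch of Conway's Game of Life on a fixed-size grid.
# Rules: a live cell survives with 2 or 3 live neighbours;
# a dead cell becomes alive with exactly 3 live neighbours.

def step(grid):
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the eight neighbours (cells beyond the edge count as dead).
            n = sum(
                grid[r + dr][c + dc]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols
            )
            new[r][c] = 1 if (grid[r][c] and n in (2, 3)) or (not grid[r][c] and n == 3) else 0
    return new

# Seed a "glider", a pattern that travels diagonally across the grid.
grid = [[0] * 8 for _ in range(8)]
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r][c] = 1
for _ in range(4):
    grid = step(grid)
    print("\n".join("".join("#" if x else "." for x in row) for row in grid), "\n")
```

Run repeatedly, the glider moves across the grid: a small example of complex behaviour emerging bottom-up from elemental rules.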
4.2 Evolutionary Algorithms
One influential example was the biomorph, conceived by the ethologist Richard Dawkins: recursively generated branching figures reminiscent of biological structures. In 1986 William Latham fused Dawkins's evolutionary engine with novel geometric shapes and 3D graphics, collaborating with Stephen Todd to develop the Mutator program. Mutator aimed to construct a virtual ecosystem in which forms could evolve and mutate over time through the manipulation of parameters and growth rules. The works it produced often exhibited organic, abstract structures reminiscent of natural and biological forms, offering insights into the nexus between nature, technology and human creativity. Latham's endeavours were a watershed in digital art, showcasing the creative potential of evolutionary algorithms in graphic design and inspiring artists, designers and architects from the early 1990s onwards.
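To make the idea concrete, here is a hedged sketch in the spirit of Dawkins's biomorphs, not a reconstruction of his program: a small "genome" of numeric genes controls a recursively branching figure, and each generation mutates one gene. All names and values are illustrative.

```python
# Biomorph-style sketch: a genome drives a recursive branching figure,
# and mutation perturbs one gene per generation. Rendering is reduced
# to a list of line segments; a real program would draw them.
import math
import random

def branch(x, y, angle, depth, genes, segments):
    if depth == 0:
        return
    length = genes["length"] * depth          # branches shorten towards the tips
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))
    # Two children, splayed by the genome's branching angle.
    branch(x2, y2, angle - genes["spread"], depth - 1, genes, segments)
    branch(x2, y2, angle + genes["spread"], depth - 1, genes, segments)

def mutate(genes):
    child = dict(genes)
    gene = random.choice(list(child))
    child[gene] *= random.uniform(0.8, 1.2)   # small random perturbation
    return child

genes = {"length": 10.0, "spread": 0.5}
for generation in range(5):
    segments = []
    branch(0.0, 0.0, math.pi / 2, depth=6, genes=genes, segments=segments)
    print(f"generation {generation}: {len(segments)} segments, genes={genes}")
    genes = mutate(genes)   # in Dawkins's program, a human picked the offspring to breed
```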
During the same years the first recurrent neural networks (RNNs) were developed theoretically, notably through the work of John Hopfield, who in 1982 proposed a recurrent structure for data processing (the Hopfield network). Their practical application and widespread use in music generation and natural language processing (NLP), however, would only come in the subsequent decades.
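A minimal sketch of the 1982 model, assuming NumPy (the patterns, sizes and names here are illustrative): patterns are stored with a Hebbian outer-product rule, and a corrupted probe is recovered through repeated asynchronous updates.

```python
# Hopfield network sketch: store +1/-1 patterns, then recall one
# from a noisy probe via asynchronous sign updates.
import numpy as np

patterns = np.array([
    [1, -1, 1, -1, 1, -1],
    [1, 1, 1, -1, -1, -1],
])
n = patterns.shape[1]

# Hebbian storage: sum of outer products, with a zeroed diagonal.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

def recall(state, steps=20, rng=np.random.default_rng(0)):
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(n)                       # pick one unit at random
        state[i] = 1 if W[i] @ state >= 0 else -1  # flip it towards lower energy
    return state

probe = patterns[0].copy()
probe[0] *= -1                                    # corrupt one bit
print(recall(probe))                              # settles back onto patterns[0]
```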
4.3 Second AI Winter
By the late 1980s, however, the resurgence of interest and funding in artificial intelligence and artificial life ran into headwinds: the algorithms introduced proved limited in real-world applicability, precipitating what became known as the "second AI winter."
4.4 Experiments in Musical Intelligence
In the 1980s, the composer and researcher David Cope, with his Experiments in Musical Intelligence (EMI), strongly advocated that computer-assisted composition could embody a deeper understanding of music from three distinct perspectives:
- analysis and segmentation into parts,
- identification of common elements and patterns that define what is perceived as style, and
- recombination of musical elements to create new works.
His work was based on recombining elements from earlier compositions to create new musical pieces. Many of the greatest composers have explored this concept, consciously or not, reshaping existing ideas and styles in their work; the ReComposed series by Deutsche Grammophon is a modern example. With EMI, Cope aimed to replicate this process with computers and their computing power, and his work laid the groundwork for many of the AI models on the market today. First, music and its attributes are encoded into databases; recombinant segments are then extracted using specific identifiers and pattern-matching systems. From there, musical segments are categorised and reconstructed in a logical and musical order using augmented transition networks until new musical output is produced. This type of 'regenerative' construction of music is reminiscent of many of today's neural networks, such as OpenAI's MuseNet, which uses a transformer-based architecture to generate multi-instrument compositions in a variety of styles.
David Cope's "Chorale (after Bach)," from the album Bach by Design (1994).
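To convey the recombinant pipeline, the toy sketch below substitutes a simple boundary-matching chain for Cope's augmented transition networks; the corpus, segment size and every name are invented for illustration. It follows the three steps listed above: segmentation, pattern indexing and recombination.

```python
# Toy illustration of recombinant composition (not Cope's algorithm).
# Pitches are MIDI note numbers; the "works" are an invented corpus.
import random

corpus = [
    [60, 62, 64, 65, 67, 65, 64, 62, 60],
    [60, 64, 67, 72, 67, 64, 60, 62, 64],
]

def segment(melody, size=3):
    # Step 1: analysis -- cut each source melody into short segments.
    return [melody[i:i + size] for i in range(0, len(melody) - size + 1, size)]

# Step 2: pattern identification -- index segments by their opening pitch.
index = {}
for melody in corpus:
    for seg in segment(melody):
        index.setdefault(seg[0], []).append(seg)

# Step 3: recombination -- chain segments whose boundary pitches agree.
random.seed(1)
line = random.choice(index[60])
for _ in range(3):
    candidates = index.get(line[-1])
    if not candidates:
        break                                 # no segment starts on this pitch
    line = line[:-1] + random.choice(candidates)
print(line)
```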
Other developments in this period continued to explore the boundaries of computational creativity. Composer Robert Rowe devised a system enabling a machine to deduce metre, tempo and note durations as someone plays freely on a keyboard. Furthermore, in 1995 Stephen Thaler's company, Imagination Engines Inc., utilised reinforcement learning to train a neural network with popular melodies, resulting in the creation of over 10,000 new musical choruses. This method involves rewarding or penalising the model based on its decisions, aiming to achieve predefined goals.
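As a hedged illustration of that reward/penalty principle (not Thaler's actual system), the sketch below keeps a preference score for each candidate melodic interval and nudges each score towards the reward a toy "critic" assigns, so rewarded choices become more frequent over time.

```python
# Bandit-style sketch of learning from reward and penalty.
# The "critic" is a toy rule that rewards consonant intervals.
import random

intervals = [1, 2, 3, 4, 5, 7]          # semitone steps the model may choose
prefs = {i: 0.0 for i in intervals}     # learned preference per interval
consonant = {3, 4, 5, 7}                # toy notion of a "good" move

random.seed(0)
for _ in range(500):
    # Epsilon-greedy choice: mostly exploit the best-scoring interval.
    if random.random() < 0.1:
        choice = random.choice(intervals)
    else:
        choice = max(prefs, key=prefs.get)
    reward = 1.0 if choice in consonant else -1.0
    prefs[choice] += 0.1 * (reward - prefs[choice])   # move score towards reward

print(sorted(prefs.items(), key=lambda kv: -kv[1]))   # consonant intervals rise to the top
```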
The shift towards AI systems that autonomously develop their own understanding of musical elements forms the cornerstone of today's advanced musical intelligence. In the mid-1980s the first AI research endeavours also commenced at IRCAM, the research and artistic production institute founded in Paris by Pierre Boulez about a decade earlier. Their focus lay primarily on rule-based systems for sound synthesis (Formes) and environments based on LISP for the symbolic manipulation of musical structures (PatchWork, OpenMusic). Research there has continued to the present day, addressing the creation of instrumental sample databases and assisted orchestration (Orchidea).