Wednesday, April 29, 2009

More on Grammatical Subdivision

Here's a rough diagram of the process with the entities shown on the left and the process shown on the right.

Grammatical Subdivision

I had a small breakthrough several minutes ago. Grammatical percussion is really all about subdividing time. It's about taking a given chunk of time and doing something with it. Unlike the melodic grammatical system, the percussive system should have "words" that correspond to different ways to subdivide a set amount of time. This way, one does not have to deal with overlapping words and other problems that arise when the length of a word isn't absolutely defined.

The process of subdivision is recursive: it might start at the level of a measure and ask, "ok, how do I want to subdivide this measure?" The response may be, "I want two equal subdivisions." The process will then make another decision: "Do I want to go ahead and replace this chunk of time (half a measure) with an 8-beat word? Or do I want to subdivide it further?" Let's say the computer decides to split the first chunk of time into two quarter notes (4 beats each, each 1/4 of a measure). It decides to replace the second chunk with a word.

So the system really operates around two entities and one process: time "chunks," words, and subdivision, respectively.
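The recursive process described above can be sketched in a few lines. This is a minimal toy, not mGen's actual code: the word vocabulary, chunk lengths (in 16th-note beats), and the 50/50 replace-or-split choice are all invented for illustration.

```python
import random

# Hypothetical word vocabulary: each "word" is a hit pattern (1 = hit,
# 0 = rest) that exactly fills a chunk of the given length in 16th beats.
WORDS = {
    4: [[1, 0, 0, 0], [1, 0, 1, 0], [1, 1, 0, 1]],
    8: [[1, 0, 0, 0, 1, 0, 1, 0], [1, 0, 1, 0, 1, 1, 0, 1]],
}

def subdivide(length):
    """Recursively turn a chunk of `length` beats into a hit pattern.

    At each step, either replace the chunk with a word of matching
    length, or split it into two equal halves and recurse on each.
    """
    can_replace = length in WORDS
    can_split = length % 2 == 0 and length > 1
    if can_replace and (not can_split or random.random() < 0.5):
        return random.choice(WORDS[length])
    if can_split:
        half = length // 2
        return subdivide(half) + subdivide(half)
    return [1] + [0] * (length - 1)  # fallback: a single hit

pattern = subdivide(16)  # one 4/4 measure in 16th notes
```

Because every word and every split exactly fills its chunk, the result is always a well-formed measure with no overlapping words.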

Conceptually this is a big step forward for grammatical drumming. With any luck it will make the process a lot less painful.

Tuesday, April 28, 2009

Representing Drums With Grammar

I'm still trying to adapt the grammatical method to work with drums and other percussive instruments. The problem is that, while melodic instruments only require notes (and details such as velocity, panning, etc.), drums require notes that correspond to different drums. Thus, the notes are not related in the same way that melodic notes are related. Drums have no "key" and do not play contoured riffs.

Representing drum patterns within a grammar will require a more flexible grammar that allows for multiple notes on the same beat. I'm thinking of using a very low-level grammar to describe individual hit patterns on the different drums, then combining these in higher-level grammars that describe riffs, fills, and grooves that finally combine into the highest-level grammar that describes an entire style of playing.
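The layering idea could look something like the sketch below (names and patterns are invented for illustration, not mGen's real structures): a low-level grammar of per-drum hit patterns, and a higher-level rule that stacks them so multiple drums can sound on the same beat.

```python
# Low-level grammar: one bar of 8th notes per drum (1 = hit, 0 = rest)
HIT_PATTERNS = {
    "kick":  [1, 0, 0, 0, 1, 0, 0, 0],
    "snare": [0, 0, 1, 0, 0, 0, 1, 0],
    "hihat": [1, 1, 1, 1, 1, 1, 1, 1],
}

def groove(drums):
    """Higher-level rule: stack per-drum patterns into beat-indexed events."""
    bar = [[] for _ in range(8)]
    for drum in drums:
        for beat, hit in enumerate(HIT_PATTERNS[drum]):
            if hit:
                bar[beat].append(drum)  # several drums may share a beat
    return bar

bar = groove(["kick", "snare", "hihat"])
print(bar[0])  # → ['kick', 'hihat']
```

A still-higher grammar could then choose which grooves, fills, and riffs to chain together to define a playing style.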

Monday, April 27, 2009

Module Instructions

After the recent data structure overhaul, adding new features to mGen feels like a breeze, which is a nice treat. Today I wrote a basic module instruction mechanism that allows the structure module to work with the generative modules to coordinate the composition at a higher level. This is, of course, essential to the coherence of the composition and is a feature that will require a lot of refining if I hope to get good material out of mGen.

Basically, the structure module can now coordinate, for instance, when the piano should come in, when the drums should make an entrance, when things should get softer, and when things should get heavier. There's now an overlaying set of general module instructions to help the generative modules achieve a greater coherence.

Also, the data structure now allows other modules to effectively "see" each other even before they have generated any output. This was necessary at first because the structure module needed to be able to see the generative modules before it could start giving part instructions...you can't rely on a nonexistent pianist to start a song, nor a nonexistent drummer to get fancy with a solo! As a consequence, modules can also now see each other. Conceivably, this could be used for dynamic interactivity between them. Although there is no data structure in place yet to allow modules to communicate with each other, that may be a feature in the future. This could allow, for example, the drum and the bass modules to "establish a groove" before they start generating the composition. Communication is essential in a real band, so it should be essential in mGen as well.
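To make the coordination idea concrete, here is an illustrative sketch (the field names and layout are invented, not mGen's real data format) of how a structure module might hand per-section instructions to the generative modules before any notes exist.

```python
# Hypothetical module-instruction plan emitted by the structure module
instructions = {
    "sections": ["intro", "verse1", "chorus", "verse2", "outro"],
    "modules": {
        "piano": {"enter": "intro",  "exit": "outro", "dynamics": "soft"},
        "drums": {"enter": "verse1", "exit": "outro", "dynamics": "medium"},
    },
}

def active_in(module, section, plan):
    """True if `module` should be playing during `section` under this plan."""
    secs = plan["sections"]
    m = plan["modules"][module]
    return secs.index(m["enter"]) <= secs.index(section) <= secs.index(m["exit"])

print(active_in("drums", "intro", instructions))  # → False
```

Each generative module can then query the plan when it generates its part, so entrances, exits, and dynamics stay coherent across the whole composition.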

Lots of progress has been made these past few days. mGen's compositions are taking less and less of my intervention to sound good. I usually just plop in a synth doing some rhythm work or harmony and then lay down a drum groove. Throw in some nice mixer effects and it all sounds pretty darn impressive, or at least I think so. Soon enough mGen will be doing all of that autonomously. It's a scary thought. But it's the future of music.

Thursday, April 23, 2009

Complete Internal Overhaul

Tonight was a long night for mGen. At the beginning of the night, the poor thing was told it had an obsolete internal data structure that lacked much structure to speak of. Furthermore, mGen was accused of inefficient data handling that was making other modules work harder than necessary.

After an immense surgery that lasted about four hours, mGen is now smiling with a brand-new, sparkly internal data structure. The structure now conforms to the data structures used by other modules and makes it much easier for all other components of the program to access information about the composition on the fly. The surgery has, however, rendered mGen incompatible with most of the previous plugins I wrote to accompany it. They too must schedule an appointment for surgery to become capable of taking advantage of mGen's new data structure.

In short, I rebuilt the internals of the mGen framework from scratch. It's an investment in the future, where data handling will be done much more efficiently by the program. It's really quite a huge improvement, as reflected by the roughly five-hundred-line increase in code length.

Wednesday, April 22, 2009

Tweaking the GUI

I made the GUI look a little prettier today. I am still withholding screenshots from the blog, however, because I do not want it to be seen until it is ready.

I am also working on filling up the main panels of the program, since the border regions are populated with functional tools, but the center area is completely blank. I'm not sure what to put there.

Monday, April 20, 2009

Grammar: Success?

Well, after coming back from a long break, I have some good progress to report.

I tried implementing a grammatical system for algorithmic composition over the break and had a good deal of success in my endeavors. Although I created only a rudimentary composition language and a very basic phrase generator, the results seem more natural and interesting than any other method explored thus far, which means progress!

After a great deal of thinking on the subject, I've decided that a grammatical system might be the key I've been looking for to a successful path to my goals. The trick is that I can use a grammatical system as the underlying paradigm for other methods. In other words, I could have an evolutionary model that uses an underlying higher-level grammar to generate phrases. In theory, this idea is really no different than using an evolutionary model to generate an equally-abstract number that corresponds to a certain pitch (a MIDI note event). I could do the same thing with Markov chains and state-transition matrices.
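One way to picture "grammar as the underlying paradigm": let the states of a Markov chain be abstract grammar symbols rather than raw MIDI pitches. The symbols and transition probabilities below are invented purely for illustration.

```python
import random

# Toy transition table over grammar symbols ("words" of a phrase
# language) instead of pitches. Probabilities in each row sum to 1.
TRANSITIONS = {
    "motif":     {"motif": 0.3, "variation": 0.5, "cadence": 0.2},
    "variation": {"motif": 0.4, "variation": 0.3, "cadence": 0.3},
    "cadence":   {"motif": 1.0},
}

def generate_phrase(length, start="motif"):
    """Walk the chain to produce a sequence of abstract phrase symbols."""
    symbol, phrase = start, [start]
    for _ in range(length - 1):
        row = TRANSITIONS[symbol]
        symbol = random.choices(list(row), weights=list(row.values()))[0]
        phrase.append(symbol)
    return phrase  # later expanded into actual notes by the grammar

phrase = generate_phrase(6)
print(phrase)
```

The same substitution works for an evolutionary model: the genome encodes grammar symbols, and the grammar expands them into notes, exactly as an abstract number would otherwise map to a pitch.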

There are a lot of places to go with grammatical algorithmic composition. I have a feeling I'm just scraping the surface of something big. Let's hope I'm not let down.

Thursday, April 9, 2009

Tuesday, April 7, 2009

Rhythm and Meter

As the foundations upon which music is built, rhythm and meter will play an obvious and pivotal role in my program. Unfortunately, I have read very little on the topics, as Music, the Brain, and Ecstasy devoted only a single chapter to the subject in general. I need to delve deeper into the topic. To do so, I'll need some good sources.

Here are some books I'm looking at:
The first one looks extremely comprehensive and helpful.

Monday, April 6, 2009

The Musicality of Language

A very interesting article that touches on some rhythmic similarities between language and music.

Sunday, April 5, 2009

Sweet Success.

As dramatic as it may seem, I was almost moved to tears earlier tonight when mGen generated something completely unexpected. I can't really say that what I heard was entirely original or breathtaking, but I wasn't expecting it. I had pretty much given up on the Conscious Model plugin and was getting ready to move back to ChillZone and try something else. I had also thrown together a new progression plugin based on hard-coded progressions since I was tired of hearing crappy progressions. And then I hit generate with a ChillZone pianist plugin, a Conscious Model plugin, and a Contour Arp plugin, just for old time's sake.

What I heard was a gentle ambient pad laying down two simple seventh chords. A very pleasing background. A low arpeggiated drone created an eerie feeling. But the best part was up top...the Conscious Model plugin had generated an indescribably simple but beautiful high melody. The most unexpected part, however, was how the melody remained coherent throughout the whole piece - returning to certain motifs and elaborating on them - while changing and undergoing subtle variations. The melody was perfectly predictable and perfectly unpredictable. It made sense but I couldn't say for sure what would come next.

I'll post the new sample clip in a day or two. It's probably nothing amazing to anyone else. But I think it's the first time I've really been taken aback by the creative ability of the program. I didn't know I had programmed a plugin capable of anything this coherent. And yet there it was; here it is.

I feel accomplished. If mGen fails in terms of everyone else's standards, if the generated music makes the ears of others bleed, if people say I failed and computers will never know anything about music, at least I have this. I could listen to this kind of output for hours on end. I could sleep to this, I could dream to this, I could do homework to this. It doesn't even matter anymore. I succeeded. I would listen to this music. That's all I cared about in the first place.

I succeeded.

Saturday, April 4, 2009

Computer Models of Musical Creativity

Computer Models of Musical Creativity
Chapter 4: Recombinance

  • Western tonal music generally follows simple principles that drive melody, harmony, voice leading, and hierarchical form
  • One can create music by programming such principles into a computer
  • Such an approach often creates stale music
  • Recombinance is a method of using existing music and recombining it logically to create new music
  • Cope uses destination pitches and beat-size groupings to split chorales into smaller groups called lexicons that can be recombined using the pitch and beat data
  • Such syntactic networking actually preserves a great deal of the music's integrity while generating new output
  • To further extend the abilities of recombinance, Cope had his program analyze the source piece's "distance to cadence, position of groupings in relation to meter, and other context-sensitive features"
  • Artists often use musical signatures, patterns of notes that recur in many works of a composer
  • Recombinance can be described in terms of Markov chains
  • Recombinance can work both vertically and horizontally
  • Generation of music must start with an abstract hierarchy and move towards specifics (this is exactly what I foresaw and intended when I made the structure module the foundation upon which mGen works! Cope agrees!)
  • Rule acquisition from music models the musical training of humans
  • Machine renditions of music are often crude and dead...successful algorithmic composition requires dynamics
  • An improviser basically has a repertory and an idea of how he or she wants an improvised idea to flow into the next
  • "Recombinance, or rules acquisition, provides more logical and successful approaches to composing in tonal music styles"
  • "Every work of music, I feel, contains a set of instructions for creating different but highly related replications of itself"
  • "The secret of successful creativity lies not in the invention of new alphabet letters or musical pitches, but in the elegance of the combination and recombination of existing letters and pitches"
  • "In recombination, rules are not necessary, since the destination notes provide all of the requisite information"
  • "While recombinance of this type ensures beat-to-beat logic in new compositions, it does not guarantee the same logic at higher levels"
  • "The initial and final groupings of a phrase are most pivotal"
  • "Experiments in Musical Intelligence protects signatures from being fragmented into smaller groupings, thus ensuring that these signatures will survive the recombination process"
  • "A Markovian description of recombinant processes does not allow for the broader control of larger-scale structure"
  • "In music, what happens in measure 5 may directly influence what happens in measure 55, without necessarily affecting any of the intervening measures"
  • "The top-down approach is necessary because choosing new beat-to-beat groupings must be informed by hierarchy, and not the reverse. No new grouping of a work-in-progress can be selected until its implications for the entire structure of the work are determined"
  • "Acquired rules are often more accurate since, by default, they originate from the music itself and not from generalizations about the music"
  • "Having a program first derive rules and then apply these rules during composition, though a simple notion, is critically important to the basic thrust of my modeling creativity"
  • "I continue to maintain that computer-composed music in any style is as real as human-composed music in any style"
  • "I see no reason why computer-created music cannot move us to tears, find roots in our cultures, and reveal or obscure its internal implications as much as any music composed in more traditional ways"
  • "Improvisation consists of either generating music associatively to maintain continuity, or interruptively striking out in apparently new directions"
  • "Improvisers associate rhythmic patterns, melodic contours, and harmony"
  • "Improvisation tends to function as a series of gestures that themselves have a sense of beat and that, when performed one after another, make musical, rhythmic, and metric sense"
These constitute the notes I took on my reading today.
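The recombinance idea from the notes above can be sketched as a toy Markov-style chain. This is my reading of the chapter, not Cope's actual code: each beat-size grouping stores the "destination" pitch it moved to in its source, and new music chains groupings whose first pitch matches the previous grouping's destination, preserving beat-to-beat logic. All pitches here are invented.

```python
import random

# Each lexicon entry is (pitches_in_grouping, destination_pitch), i.e. a
# beat-size chunk plus the pitch it led to in its source chorale.
LEXICON = [
    ((60, 64), 62), ((62, 65), 64), ((64, 67), 60),
    ((62, 65), 60), ((60, 64), 65), ((65, 69), 62),
]

def recombine(length, start=((60, 64), 62)):
    """Chain groupings: the next grouping must begin on the previous
    grouping's destination pitch, as in beat-to-beat recombinance."""
    out = [start]
    for _ in range(length - 1):
        dest = out[-1][1]
        candidates = [g for g in LEXICON if g[0][0] == dest]
        if not candidates:
            break  # no grouping continues from this destination
        out.append(random.choice(candidates))
    return out

piece = recombine(5)
print(piece)
```

As the notes warn, this only guarantees beat-to-beat logic; larger-scale structure still has to come from a top-down hierarchy.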

Thursday, April 2, 2009

Coordination Module

As I have continued to work on generative plugins and attempted to reverse-engineer familiar music, I have found something lacking in my Core Modules group. I need more than just a structure and progression module - I need a meter/rhythmic module. Such a module would generate information concerning where the strongest beats of each measure lie, how melody should interact with harmony - with room for syncopation and counterpoint - and more for each segment of a composition.

Although such a module wouldn't make much difference in the case of a standard 4/4 beat where beats 1, 5, 9, and 13 are heavily accented, it would shine through when unusual rhythmic and metric patterns occurred. Almost any metric pattern can sound good if used consistently and coherently. This new module would allow the drum plugin to make sure it hits the accented beats harder - and choose its base beat accordingly - as well as enable the melody to follow a predictable metric pattern.
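A hypothetical sketch of what such a module might emit (names and weight values are invented): a per-measure accent map, one weight per 16th note, which the drum and melody plugins could both read so their stresses line up. It reproduces the standard 4/4 case where beats 1, 5, 9, and 13 are heavily accented.

```python
def accent_map(beats=4, strong=1.0, medium=0.6, weak=0.3):
    """Accent weights for one measure of `beats` quarter notes,
    one entry per 16th-note step."""
    weights = []
    for step in range(beats * 4):
        if step % 4 == 0:       # 16th positions 1, 5, 9, 13 (1-indexed)
            weights.append(strong)
        elif step % 2 == 0:     # off-beat 8ths
            weights.append(medium)
        else:                   # remaining 16ths
            weights.append(weak)
    return weights

weights = accent_map()
print(weights[:5])  # → [1.0, 0.3, 0.6, 0.3, 1.0]
```

An unusual meter would just swap in a different weight pattern, and every plugin that consults the map would stay consistent with it automatically.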

In short, I need a coordination module.

Now the real question: what rules govern the metric and rhythmic patterns of compositions?

cgMusic

I just have one question - how is cgMusic able to create coherency? I need to dig deeper into the scripts to find out how it works. I should learn from my predecessors.

Wednesday, April 1, 2009

ChillZone Pianist

One of the first plugins I ever wrote for mGen was the ChillZone Pianist, designed to be a piano module for down tempo applications like Buddha Bar, Hotel Costes, and Cafe del Mar. The ChillZone Pianist uses an interesting data structure for generation: hands. The pianist essentially has two "hands," each consisting of five fingers. The fingers can move around different keys, and the plugin generates output by invoking functions built into the hand data structures that 'play' the fingers in a desired order and with a desired time map. Using this method, I hope to recreate more realistic patterns that a real pianist would play.

After dropping the ChillZone plugins for quite a while in favor of certain others, I have come back to the pianist. I experimented with other models for generation such as my consciousness model theory, but ultimately came back to the original thought that mimicking a pianist could produce solid results. Thus, I am now back at work on the ChillZone Pianist and hope to complete all the features, such as rhythmic chords and abstract motifs.
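The "hands" idea could be sketched roughly like this. The class and method names are my own invention for illustration, not the plugin's real code: each hand holds five fingers on keys, and output comes from playing fingers in a chosen order against a time map.

```python
class Hand:
    """A pianist's hand: five fingers, each resting on one MIDI key."""

    def __init__(self, keys):
        assert len(keys) == 5        # thumb through pinky
        self.fingers = list(keys)    # MIDI note number per finger

    def play(self, order, time_map):
        """Return (time, note) events by striking fingers (by index)
        at the times given in the time map."""
        return [(t, self.fingers[f]) for f, t in zip(order, time_map)]

left = Hand([48, 52, 55, 59, 62])    # a low seventh-chord shape
right = Hand([64, 67, 71, 74, 76])

# Arpeggiate the left hand thumb-to-pinky on successive 8th notes
events = left.play(order=[0, 1, 2, 3, 4],
                   time_map=[0.0, 0.5, 1.0, 1.5, 2.0])
print(events[0])  # → (0.0, 48)
```

Constraining output to what ten fingers on real keys can physically do is what should make the patterns feel like an actual pianist rather than arbitrary note streams.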