or "Fretboard Sound Indexer", or "6-string Secret Commodity Fetish"
by Steev Hise
I've been playing guitar for about 12 years now. However, in the last 6 years or so I have been playing less and less "real guitar" as I got more interested in other things: first free improvisation, then "noise" music, then sample-based music. I began playing the guitar in non-standard ways, using alternate tunings, objects attached to the strings ("prepared guitar" à la Fred Frith), and a bigger and bigger array of electronic processors. I became more and more bored with the standard timbral possibilities of the guitar, and more tired of its clichéd social significance ("guitar gods", rock histrionics, macho phallic symbolism, etc.). My pseudo-farewell to the guitar and its connection to pop music tropes was realized in a performance piece I presented in 1995 called Requiem for (Un)dead Popstars.
Still, the guitar held a certain appeal for me, mainly because it was the "real" instrument I had the most "chops" with. I had played saxophone in high school, and I had some skill at keyboards, as anyone involved with electronic music eventually develops, but the guitar was really what I took to and spent the most time mastering back in my blues/metal/punk/grunge days. So I led a dual life: playing live improvisational prepared guitar, and composing intricate sample collages with computers and midi gear. For years I wondered how to combine the two.
About a year ago, in June 1998, the opportunity presented itself, in the form of a friend giving me a Roland GS-70 guitar midi interface (I think that's the model number; I don't generally care much about gear names and numbers). He was tired of it, and it was basically obsolete technology compared with the newer models. Of course I had thought about using a midi interface before, and the thought of abandoning the midi keyboard as a controller was immensely appealing, but such a thing had been out of my financial range until then.
Most of the uses of the midi guitar I had seen or heard were relatively pedestrian, bordering on banal. Most guitarists, infatuated with the guitar sound they've been conditioned to love, simply use a midi interface to "sweeten" their "tone", adding some guitaroid patch that makes them sound more like Eddie Van Halen or whoever. Nick Didkovsky's performances featuring interaction with HMSL were actually the most interesting use of guitar midi I had seen, and they were the reason I had first become interested in HMSL, back in 1992.
No, that's not HTML, stupid; I'm talking about Hierarchical Music Specification Language, a music programming environment created by Phil Burk, Larry Polansky, and David Rosenboom at Mills College in the late 80s. HMSL was developed as a powerful and fully customizable realtime algorithmic control language. Based on Forth (which, incidentally, was a language used by astronomers to control telescopes, and an artificial intelligence language, sort of the alternative to LISP in certain computer science circles), HMSL is object-oriented, making it a neat way to compose complex behaviors and aleatoric processes. Objects like "players" and "instruments" talk to other things like "actions" and "shapes". (In recent years, with the growing popularity and power of MAX by Opcode, HMSL usage has declined. It's a command-line environment, and most composers took to MAX's intuitive graphic interface much more readily. But I still think HMSL is pretty useful.) At the University of Michigan I tried to find a professor who would teach me Forth, so I could learn HMSL, but no one cared. "Who told you Forth was worth anything?" was the general consensus. So, I put HMSL on the back burner.
I finally started teaching myself HMSL in 1994 and played my first gig with it at the University of South Carolina's Computer Music Festival that March. Then, when I was in grad school at CalArts 2 years later, I took an advanced HMSL course with David Rosenboom, and I coded some routines for sending system exclusive commands to my Roland S-550 sampler. You can do a lot with system exclusive commands, especially on the S-550, though it took me a while to figure out how to send which codes. At first my program consisted of an onscreen interface, a simple mouse-controlled cursor on a 2D grid. One axis of the grid was the starting point of a sample; the other axis was the sample length. So while a sound was playing, I could drag the cursor around and radically alter what section of the sound was heard. I experimented with other parameters too, like controlling resonance and filter frequency.
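For the curious, the grid idea can be sketched out in a few lines (in Python here, rather than HMSL/Forth, just to show the logic). The model ID and parameter address below are placeholders, not the S-550's actual memory map; only the general shape of a Roland "data set" sysex message and its checksum convention (the low 7 bits of address plus data must sum to zero) are real.

```python
# Sketch: map a 2D cursor position to a sample start point and length,
# then wrap a value in a Roland-style sysex "data set" message.
# NOTE: model ID (0x00) and the address bytes are HYPOTHETICAL placeholders.

def roland_checksum(payload):
    """Roland sysex checksum: low 7 bits of address+data must sum to 0."""
    return (128 - sum(payload) % 128) % 128

def grid_to_loop(x, y, max_words=0x3FFF):
    """One axis (x, 0.0-1.0) picks the start point; the other (y) picks
    the length of the playable region after that start."""
    start = int(x * max_words)
    length = max(1, int(y * (max_words - start)))
    return start, length

def sysex_set(address, value):
    """Build a Roland-style data-set message: F0 41 dev mdl 12 addr data sum F7.
    The 14-bit value is split into two 7-bit bytes, MSB first."""
    data = [(value >> 7) & 0x7F, value & 0x7F]
    payload = list(address) + data
    return [0xF0, 0x41, 0x00, 0x00, 0x12] + payload + [roland_checksum(payload), 0xF7]

start, length = grid_to_loop(0.25, 0.5)
msg = sysex_set((0x01, 0x00, 0x00), start)   # placeholder address
```

Dragging the mouse just meant recomputing `grid_to_loop` and firing off a fresh sysex message while the sound played.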
That was pretty neat, but mousing around a screen isn't very interesting for live performance. So when I procured the guitar interface, I put 2 and 2 together and decided it would make a beautiful controller for these system exclusive routines. I created a program that read in note data from the guitar interface and turned it into loop point information. How? Well, the S-550 has 4 sample banks of 14 seconds each (at 15kHz); so I assigned a 14-second sample to each of 4 strings on the guitar, and where I fret determines the start point of the loop. So instead of playing boring old pitches, I'm actually playing places in a sound. A fretboard sample indexer! Velocity is picked up by the interface and still controls the volume of the samples, and the guitar interface has a few knobs on it too, which I mapped to things like sample length, filter cutoff, and resonance, so I could really fine-tune the sounds as I played. (see source code)
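The fret-to-start-point mapping boils down to simple arithmetic, sketched here in Python under a few assumptions not spelled out above: that the interface sends each string on its own MIDI channel (typical for guitar-to-midi converters), and that the open-string notes and fret count below are the usual ones (they're illustrative, not taken from my actual setup).

```python
SAMPLE_RATE = 15000   # the S-550 rate mentioned in the text
BANK_SECONDS = 14     # one 14-second bank per string
FRETS = 22            # assumed fretboard span

# Assumed open-string MIDI notes for the four mapped strings (E A D G),
# keyed by MIDI channel -- one channel per string.
OPEN_NOTES = {0: 40, 1: 45, 2: 50, 3: 55}

def note_to_start(channel, note):
    """Map a fretted note on a given string/channel to a loop start point
    (in sample words) within that string's 14-second bank."""
    fret = max(0, min(FRETS, note - OPEN_NOTES[channel]))  # clamp to the neck
    total = SAMPLE_RATE * BANK_SECONDS                     # 210,000 words
    return (fret * total) // FRETS
```

The open string lands at the start of the sound, the top fret near its end, and every position in between indexes proportionally into the 14 seconds, which is why fretting higher "scrubs" later into the sample.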
It turned out to be a pretty versatile instrument, providing me with a decent amount of realtime control, but also just enough elements of chance to make things interesting. Playing it involved learning the geography of each sample loaded into each string, where different features of the sound "occurred" on the fretboard, a totally different experience from just laying out a keyboard patch that assigns different samples to different keys. I enjoyed the randomness of this; it kept me on my toes and open to happy accidents, which to me have always been part of the core of free improvisation.
I first played the Sampletar for an audience at Omnimedia 4.0 and since then have used it almost exclusively for live performance. It results in music-making that is totally different in process, and quite a bit different in product, from what I do in the studio, though it is rooted in the same aesthetic sense that always drives me: the exploration of appropriated sound, and the processing and juxtaposing of various source materials to create something new and meaningful.