Computer Music – Parerga und Paralipomena http://www.michelepasin.org/blog "At the core of all well-founded belief lies belief that is unfounded" - Wittgenstein

Extempore functions explorer updated to latest release (v0.8.7) http://www.michelepasin.org/blog/2021/02/01/extempore-functions-explorer-updated-to-latest-release-v0-8-7/ Mon, 01 Feb 2021 13:07:55 +0000 http://www.michelepasin.org/blog/?p=3435 The Extempore functions explorer has been updated with the latest version of the Extempore programming language: v0.8.7.

Try it out at: http://hacks2019.michelepasin.org/extempore/

The Extempore functions explorer is a little webapp I built a while ago in order to make it easier to browse (and learn about) the Extempore music-programming language.

For more background, see also the project homepage.

[Animated screenshot: xtm-explorer.gif]

 

 

 

A new livecoding project: ‘The Musical Code’ http://www.michelepasin.org/blog/2020/11/23/a-new-livecoding-project-the-musical-code/ Mon, 23 Nov 2020 05:58:29 +0000 http://www.michelepasin.org/blog/?p=3446 I’ve started a new livecoding project on GitHub called The Musical Code, where I’ll be adding experimental musical code/algorithms created with the Extempore programming language (as well as its precursor, Impromptu).

I have accumulated so many musical-code ideas that I’ve finally resolved to clean them up, reorganise them and publish them somewhere. GitHub seemed the best option, these days.

It turns out that the code by itself won’t do, though, especially considering that the environments I use to ‘run’ it (and to make sounds) can rapidly disappear (become obsolete, or fall out of fashion!).

Hence there’s a YouTube channel as well, where one can find a screencast recording of each of the ‘musical codes’.

Hope someone will find something of interest in there. Comments and feedback totally welcome!

Impromptu language documentation back online! http://www.michelepasin.org/blog/2015/05/06/impromptu-language-documentation-back-online/ Wed, 06 May 2015 22:19:14 +0000 http://www.michelepasin.org/blog/?p=2625 For all Impromptu aficionados: a little online application that can be used to search and browse the language documentation: http://hacks.michelepasin.org/impromptu/

Impromptu is a Scheme-based OS X programming language and environment for composers, sound artists, VJs and graphic artists with an interest in live or interactive programming.

As part of its original website, Impromptu included a wiki containing documentation and examples. Unfortunately the wiki isn’t available online anymore, which is a real shame, so the other day I decided to replace it with a simple searchable index of the programming language’s functions. This is based on the documentation file that comes with the latest Impromptu release (2.5), so, in theory, it shouldn’t be too different from the original wiki site.

[Screenshot of the searchable documentation index]

For those of you who are new to all of this, it’s worth mentioning that Impromptu is now being superseded by the Extempore programming language.

Extempore is much more solid and feature-rich; it is also less dependent on the OS X platform APIs. Nonetheless, many of the original Impromptu Scheme functions are still available, so this documentation could turn out to be useful too.

Enjoy!

 

New livecoding screencast: Ziggurat 51 http://www.michelepasin.org/blog/2013/11/27/new-livecoding-screencast-ziggurat-51/ Wed, 27 Nov 2013 09:29:43 +0000 http://www.michelepasin.org/blog/?p=2414 So hard to find time to do something creative these days! I thought I’d post a screencast of a livecoded piece I’m still working on: Ziggurat 51. Hope you’ll find it interesting!

In the video I’m using the mixer UI I’ve previously talked about here. I quite like it: as you can see, it’s so much easier to focus on composition and performance when you don’t have to worry about volumes in the code!

Also, since Impromptu’s video recording functionality is broken on the latest versions of OS X, I’ve been testing out a new piece of software called Screenflick, which is actually pretty good (apart from the logo, which you can’t get rid of unless you buy the software).

Enjoy!

Building a master volumes UI with Impromptu http://www.michelepasin.org/blog/2013/09/15/building-a-master-volumes-ui-in-impromptu/ Sun, 15 Sep 2013 11:53:33 +0000 http://www.michelepasin.org/blog/?p=2401 Based on one of the examples packaged with Impromptu, I wrote a simple function that uses the objc bridge to create a bare-bones user interface for adjusting your audio instruments’ master volumes.

The script assumes that your audio graph includes a mixer object called *mixer*. The UI controllers are tied to the gain values of that mixer’s input buses.

The objc bridge commands are based on the silly-synth example that comes with the default Impromptu package.
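To give an idea of the volume-setting side of it, here is a minimal sketch (not the original script): it assumes an au:set-param call following the (time node param scope element value) pattern used in the Impromptu AU examples, with parameter 0 on scope 1 (input) standing in for the mixer’s gain; the exact parameter and scope IDs may differ on your setup.

;; minimal sketch (assumed API details, not the original script):
;; set the gain of one input bus of *mixer*. au:set-param is assumed to
;; take (time node param scope element value); param 0 on scope 1 (input)
;; stands in for the mixer's volume parameter.
(define set-bus-gain
   (lambda (bus gain)
      (au:set-param (now) *mixer* 0 1 bus gain)))

;; each UI slider would then call something like:
;; (set-bus-gain 2 0.5)   ;; halve the volume of the instrument on input bus 2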

[Screenshot: the volume slider UI]

Being able to control volumes manually rather than programmatically has made a great difference for me, both in live coding situations and while experimenting on my own: it really speeds up the music creation process and makes it easier to work with multiple melodic lines.

The next step would be to add a MIDI bridge that lets you control the UI using an external device, in such a way that the two controllers are kept in sync too. Enjoy!

P.S.: this is included in the ImpromptuLibs repository: https://github.com/lambdamusic/ImpromptuLibs

 

A metronome object for Impromptu http://www.michelepasin.org/blog/2013/02/20/a-metronome-object-for-impromptu/ Wed, 20 Feb 2013 01:34:52 +0000 http://www.michelepasin.org/blog/?p=2319 Metronome: a device used by musicians that marks time at a selected rate by giving a regular tick. If you ever felt you were missing a metronome in Impromptu, here is a little Scheme object that can do the job for you.

The make-metroclick function returns a closure that can be called with a specific time in beats, so that it plays a sound for each beat and marks the downbeat using a different sound.

Possibly useful in order to keep track of the downbeats while you compose, or just to experiment a little with some rhythmic figures before composing a more complex drum kit section.

Here’s a short example of how to use it:
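Roughly, usage looks something like this (a sketch rather than the original snippet: the beats-per-bar argument and the temporal-recursion wrapper are assumptions on my part):

(define metroclick (make-metroclick 4))   ;; 4 beats per bar (assumed signature)

;; schedule the click with a standard temporal recursion
(define click-loop
   (lambda (beat)
      (metroclick beat)                    ;; play a tick, accenting the downbeat
      (callback (*metro* (+ beat 1/2)) 'click-loop (+ beat 1))))

(click-loop (*metro* 'get-beat 1))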

The make-metroclick function relies on the standard libraries that come with Impromptu, in particular make-metro, which is described in this tutorial and in this video. Essentially, it requires you to define a metro object first, e.g. (define *metro* (make-metro 120)).

Here’s the source code:
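As an illustration of the idea (a sketch, not the actual source), a closure along these lines would do the job, assuming the standard play-note call and an instrument bound to *metro-inst* (an illustrative name; pitches and volumes are arbitrary):

;; sketch of a metronome closure (illustrative, not the original source)
;; *metro-inst* is assumed to be any instrument loaded in your audio graph
(define make-metroclick
   (lambda (beats-per-bar)
      (let ((count 0))
         (lambda (beat)
            (if (= (modulo count beats-per-bar) 0)
                (play-note (*metro* beat) *metro-inst* 84 80 (*metro* 'dur 1/2))   ;; downbeat: higher pitch
                (play-note (*metro* beat) *metro-inst* 72 60 (*metro* 'dur 1/2)))  ;; other beats
            (set! count (+ count 1))))))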

 

Composing at the metalevel http://www.michelepasin.org/blog/2012/03/09/composing-at-the-metalevel/ Fri, 09 Mar 2012 12:20:17 +0000 http://www.michelepasin.org/blog/?p=1109 I’ve started reading “Notes from the Metalevel: An Introduction to Computer Composition”, by Heinrich Taube, and realised I should have done that a long time ago!

Notes From the Metalevel is a practical introduction to computer composition. It is primarily intended for student composers interested in learning how computation can provide them with a new paradigm for musical composition.

I happened to have a PDF version of the book, but the good news is that there’s an HTML version of it too, which also includes all the MIDI files for the numerous examples in the book. So make sure you check that out, if you’re interested in computer-based composition. You might also be interested in this review in Computer Music Journal, and these course materials from Taube’s class at Illinois.

The preface to the first chapter contains this evocative excerpt from Leonard Shlain’s book, The Alphabet Versus the Goddess, which Taube (pages 19-20) uses as a metaphor for what algorithmic composition (i.e., metalevel composition) is:

The one crowded space in Father Perry’s house was his bookshelves. I gradually came to understand that the marks on the pages were trapped words. Anyone could learn to decipher the symbols and turn the trapped words loose again into speech. The ink of the print trapped the thoughts; they could no more get away than a doomboo could get out of a pit. When the full realization of what this meant flooded over me, I experienced the same thrill and amazement as when I had my first glimpse of the bright lights of Konakry. I shivered with the intensity of my desire to learn to do this wondrous thing myself.
(spoken by Prince Modupe, a west African prince who learned to read as an adult)

It is impossible to know exactly how Prince Modupe felt when he discovered a process by which his very thoughts could be trapped and released at will again into speech. But I think his epiphany must be close to what I experienced when, as a young composer, I was first shown how I could use a computer to represent my musical ideas and then “release them” into musical compositions.
At that instant it became clear to me that there was an entire level of notation above the scores that I had been writing in my regular composition classes, a level I knew nothing about! But I could see that in this level it was possible to notate my compositional ideas in a precise manner and work with them in an almost physical way, as “trapped words” that could be unleashed into musical sound whenever I wanted.

So what does it mean to compose at the metalevel?

Given the existence of the acoustic and score representations one might ask if there is yet another representation that constitutes a level of abstraction above the performance score? The answer, of course, is yes; it is what this book terms the metalevel. If the score represents the composition then the metalevel represents the composition of the composition. A metalevel representation of music is concerned with representing the activity, or process, of musical composition as opposed to its artifact, or score.

This book is about using the computer to instantiate this level: to define, model and represent the compositional processes, formalism and structures that are articulated in a musical score and acoustic performance but are not literally represented there. By using a computer the composer can work with an explicit metalevel notation, or language, that makes the metalevel as tangible as the performance and acoustic levels.

 

Special issue of CMJ DVD on livecoding http://www.michelepasin.org/blog/2012/01/13/special-issue-of-cmj-dvd-on-livecoding/ Fri, 13 Jan 2012 10:02:47 +0000 http://www.michelepasin.org/blog/?p=1103 The latest issue of the Computer Music Journal is now available, and it includes a DVD full of livecoding bonanza.

Because this is the Winter issue, it includes the annual CMJ DVD, whose program notes appear near the end of the issue. The curators for the compositions on this year’s DVD are specialists in live coding, the practice of onstage computer programming whose real-time output is an improvised and often collaborative musical performance. As always, the DVD also includes sound and video examples to accompany recent articles, as well as related files on the DVD-ROM portion of the disc.

A full description of the contents of the DVD is available here (and here if you’re not benefitting from an academic subscription), and I’m very proud to say that it also includes one of my livecoding pieces, Untitled 12, performed live at the Anatomy Museum livecoding event in 2010.

Livecoding @ Anatomy Theatre from Michele Pasin on Vimeo.

CMJ dvd - front

CMJ dvd - back

 

Article: Thought and Performance, Live Coding Music, Explained to Anyone http://www.michelepasin.org/blog/2011/12/26/article-thought-and-performance-live-coding-music-explained-to-anyone/ Mon, 26 Dec 2011 11:15:09 +0000 http://www.michelepasin.org/blog/?p=1093 I bookmarked this article on createdigitalmusic.com a while ago (it’s from Jul 2010) and ran into it again today. “Thought and Performance, Live Coding Music, Explained to Anyone – Really” by Peter Kirn contains several simple but thought-provoking ideas about livecoding and its relevance in the (traditional) music world.

Is livecoding an elitist activity?

Secrets such as why the programming language Lisp inspires religious devotion, or how someone in their right mind would ever consider programming onstage as a form of musical performance, represent the sort of geekery that would seem to be the domain of an elite.

Commenting on Ramsay’s video (Algorithms are Thoughts, Chainsaws are Tools):

I doubt very seriously that live coding is the right performance medium for all computer musicians. [..] But Ramsay reveals what live coding music is. It’s compositional improvisation, and code simply lays bare the workings of the compositional mind as that process unfolds. Not everyone will understand the precise meaning of what they see, but there’s an intuitive intimacy to the odd sight of watching someone type code. It’s honest; there’s no curtain between you and the wizard.

An interesting comment from a reader puts forward what I’d call the ‘livecoding as programming virtuosity’ view:

The live coding thing is clearly an amazing talent. I admire anyone who can do that, but it does seem pretty much a sophisticated parlor trick unless the music resulting can stand on its own.
The question becomes, were you to hear the piece without observing the live coding performance, would it stand up, or is the quality of the piece augmented by the way in which it was composed?
Is a decent painting painted by someone who paints blindfolded something I would rather see than an excellent painting by someone who paints in a conventional fashion?
Cause unless the live coder can spin something up that I would enjoy listening to on my portable media player, I feel like music takes a back seat to the musician, which is a truly peculiar something.
[…]
This is not to say live coding is something to be ignored, but where from ever in history have we asked this question? Does the musician matter more than the music?

And another, even more critical comment:

It is not about letting the audience in at all. It’s about cultivating an stage presence of virtuosic technical wizardry. No one in the audience understands the code and that’s why everyone marvels at the “magic”. Worse still it’s Lisp, a particularly archaic and obfuscated computer language.

So what?

I think this is all very useful to read, as it shows what non-specialists may think of livecoding. I’ve asked myself similar questions many times, but never really reached a clear conclusion. Is livecoding a music-making activity, or is it just programming wizardry?

I personally got into livecoding as a musician, first, and only afterwards as a programmer.
As a result I tend to see it as some sort of advanced music-making tool. However, interestingly enough, in order to make that tool match my musical taste and composition style I had to become an expert at programming the livecoding environment. While doing that, I sort of lost the closeness to the ‘instrument’, which is something you have all the time if you play a piano or a guitar. Without that closeness, you end up in the role of ‘music programmer’, worrying about mathematical structures and time recursions rather than notes and feelings.

It’s a cyclical process, actually. You gain competency with some programming pattern that lets you express your musical ideas quickly and efficiently. Then you think of different ideas, but you can’t put them into code easily, so you’ve got to step back, abandon the musical dimension temporarily, and hack some new programming structures.

Which makes me think: maybe that’s what’s so cool about it. Livecoding environments are malleable meta-instruments that let you create (software) music instruments.

So the music – the end result – is definitely part of it. But the process, the how of music creation, is also in focus here. In fact this process is itself eminently creative (and here lies the difference with many other digital music ‘creation’ tools) and, maybe most importantly, it is so abstracted and codified that it feels as if it represented some sort of essence of creativity.

 

Using Impromptu to visualize RSS feeds http://www.michelepasin.org/blog/2011/12/21/using-impromptu-to-visualize-rss-feeds/ Wed, 21 Dec 2011 13:22:30 +0000 http://www.michelepasin.org/blog/?p=1073 Some time ago I experimented with processing and displaying RSS feeds within Impromptu, and as a result I built a small app that retrieves the news feed from The Guardian online and displays it on a canvas. I’ve had a bit of free time these days, so last night I thought it was time to polish it a little and make it available on this blog (who knows, maybe someone else will use it as a starting point for another project).

Visualizing rss feeds with Impromptu

There are a thousand improvements that could still be made, but the core of the application is there: I packaged it as a standalone app that you can download here (use the ‘Show Package Contents’ Finder command to see the source code).

The application relies on a bunch of XML processing functions that I found in Impromptu’s ‘examples’ folder (specifically, the example named 35_objc_xml_lib). I pruned it a bit to fit my purposes and renamed it xml_lib.scm.

Using that, I created a function that extracts title and URL info from the Guardian feed:

(load "xml_lib.scm")
(define feedurl "http://feeds.guardian.co.uk/theguardian/world/rss")

;;
;; loads the feed and extracts title and url
;;

(define get-articles-online
     (lambda ()
        (let* ((out '())
               (feed (xml:load-url feedurl))
               (titles (objc:nsarray->list (xml:xpath (xml:get-root-node feed)
                                                "channel/item/title/text()")))
               (urls (objc:nsarray->list (xml:xpath (xml:get-root-node feed)
                                                "channel/item/link/text()"))))                                                 
           (for-each (lambda (x y)
                        (let ((xx (objc:nsstring->string x))
                              (yy (objc:nsstring->string y)))
                           (set! out (append out (list (list xx yy))))))
                titles urls)
           out)))

Some feed titles are a bit longish, so I added a utility function formattext that wraps the titles’ text if they exceed a predefined length.

(define formattext 
   (lambda (maxlength txt posx posy)
      (let ((l (string-length txt)))      
         (if (> l maxlength)
             ;; scan backwards from index maxlength looking for a space where
             ;; the title can be wrapped onto a new line
             (let loop ((i 0)
                        (j maxlength)          ;; comparison value: decreases at each recursion (except the first one)
                        (topvalue maxlength))  ;; reference value: must equal j at the beginning
                (if (equal? (- topvalue i) j)  ;; the first time through
                    (loop (+ i 1) j topvalue)
                    (if (string=? (substring txt (- topvalue i) j) " ")
                        (string-append (substring txt 0 (- topvalue i))
                                       "\n"
                                       (substring txt (- topvalue i) (string-length txt)))
                        (if (< i topvalue)     ;; avoid negative indexes in substring
                            (loop (+ i 1) (- j 1) topvalue)
                            txt))))            ;; no space found: return the title as is
             txt))))

And here’s the main loop: it goes through all the feed items at a predefined speed and displays them on the canvas, using a cosine oscillator to vary the colours a bit. At the end of it I also update three global variables that are used by the mouse-click-capturing routine.

(define displayloop
   (lambda (beat feeds) 
      (let* ((dur 5)
             (posx  (random 0 (- *canvas_max_x* 350)))
             (posy  (random 10 (- *canvas_max_y* 150)))
             (txt (formattext 40 (car (car feeds)) posx posy))
             (dim (if (= (length feeds) 29)
                      60  ;; if it's the first element of the feed list make it bigger
                      (random 25 50)))
             (fill (if (= (length feeds) 29)
                       (list 1 0 (random) 1)  ;; if it's the first element of the feed list make it reddish
                       (list (random) 1 (random) 1)))
             (style (gfx:make-text-style "Arial" dim fill)))
         (gfx:clear-canvas (*metro* beat) *canvas* (list (cosr .5 .6 .001) 0 (cosr .5 .6 .001) .5))
         (gfx:draw-text (*metro* beat) *canvas* txt style (list posx posy))
         ;; store position and url globally for the mouse-click handler
         (set! *pos_x* posx)
         (set! *pos_y* posy)
         (set! *current_url* (cadr (car feeds)))
         (callback (*metro* (+ beat (* 1/2 dur))) 'displayloop (+ beat dur)
                   (if-cdr-notnull feeds 
                                   (get-articles-online))))))
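To kick things off, one would define the globals and start the temporal recursion with something like this (a sketch; the bundled app presumably does the equivalent somewhere in its startup files):

;; globals used by the mouse-click handler below (updated on every redraw)
(define *pos_x* 0)
(define *pos_y* 0)
(define *current_url* "")

;; assuming *canvas*, *canvas_max_x*, *canvas_max_y* and *metro* are already
;; defined, start the temporal recursion on the next whole beat
(displayloop (*metro* 'get-beat 1) (get-articles-online))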

In order to capture the clicks on the feed titles I simply create a rectangle path based on the x,y coordinates randomly assigned when displaying the title on the canvas. These coordinates are stored in global variables so that they can be updated constantly.

(io:register-mouse-events *canvas*)
(define io:mouse-down
   (lambda (x y)
      (print x y)
      (when (gfx:point-in-path? (gfx:make-rectangle *pos_x* *pos_y* 200 200) x y )
            (util:open-url *current_url*))))

Finally, the util:open-url function opens up a URL string in your browser (I’ve already talked about it here).

You can see all of this code in action by downloading the app and taking a look at its contents (all the files are under Contents/Resources/app).

Visualizing rss feeds with Impromptu

If I had the time…

Some other things it’d be nice to do:

  • Creating a routine that makes the transitions among feed items less abrupt, maybe by using canvas layers.
  • Refining the click-event handling: at the moment you can only click on the most recent title; moreover, the click handler is updated too quickly, so unless you click on a title as soon as it appears you won’t be able to trigger the open-url action.
  • Refining the xml-tree parsing function, which is currently very minimal. We could extract the news entries’ descriptions and other data that would make the app more informative.
  • Adding some background music to it.
  • Any other ideas?

     
