Controlling Improviz Using Midi via OSC

In 2020 I did quite a number of workshops on using the Improviz visuals live coding environment. Improviz can be thought of as a fork of Livecodelab, especially as its developer, Guy John, is one of the developers of Livecodelab. However, it has some key differences that make it stand out as its own unique software:

  • It works on the desktop, and I think it is faster because of it
  • The language is more fully documented
  • You can load your own textures, gifs, 3D models, and shaders

Being able to load your own textures might in itself be a reason for many people to switch from Livecodelab to Improviz. Things can be just that bit more personalised when you’re using your own images and objects rather than only colours, gradients and basic geometric shapes. Another potentially useful difference is that Improviz lets you interface with the software using Open Sound Control (OSC). This opens up the possibility of controlling it with external software or hardware devices. In this blog post I’ll take you through how you can connect a midi controller to Improviz via OSC and Pure Data.

To get started you first need to define a variable in Improviz that you want to be changed by OSC/midi. The name of this variable can be anything as long as it’s not a name already used as a function or variable in Improviz. Check the reference page for a list of reserved names. In my example I’ve used the variable name size.

size = ext(:size, 1)

Next, we need to connect to it via OSC so that we can change its value.

When you launch Improviz via the terminal, one of the messages you’ll see printed is the port it is listening on for OSC messages.

2021-03-25 20:53:.732595  INFO: Running at 640 by 480
2021-03-25 20:53:.732733  INFO: Framebuffer 640 by 480
2021-03-25 20:53:.390032  INFO: Loaded 3 texture files
2021-03-25 20:53:.437047  INFO: Loaded 8 material files
2021-03-25 20:53:.441641  INFO: Loaded 5 geometry files
2021-03-25 20:53:.441718  INFO: *****************************
2021-03-25 20:53:.441766  INFO: Creating Improviz Environment
2021-03-25 20:53:.466755  INFO: Loading ./stdlib/variables.pz
2021-03-25 20:53:.466846  INFO: Loading ./stdlib/transformations.pz
2021-03-25 20:53:.466890  INFO: Loading ./stdlib/shapes.pz
2021-03-25 20:53:.466930  INFO: Loading ./stdlib/style.pz
2021-03-25 20:53:.466968  INFO: Loading ./stdlib/textures.pz
2021-03-25 20:53:.467004  INFO: Loading ./stdlib/screen.pz
2021-03-25 20:53:.467039  INFO: Loading ./usercode/grid.pz
2021-03-25 20:53:.467078  INFO: Loading ./usercode/seq.pz
2021-03-25 20:53:.467116  INFO: Improviz OSC server listening on port 5510
2021-03-25 20:53:.467297  INFO: Improviz HTTP server listening on port 3000
2021-03-25 20:53:.467405  INFO: Improviz resolution: 640 by 480

Of course you can, at this stage, use any software that can send data over OSC, but for this blog post/tutorial I’ll be using Pure Data. Alternatives exist but I like Pure Data because it’s lightweight, stable and cross-platform.

To send OSC messages use the [udpsend] object to connect to the IP address of the machine running Improviz (usually 127.0.0.1 if it’s the same machine) and the same port (5510). [udpsend] will output a 1 from its only outlet to show a successful connection. With the connection established I can now send values from a number box to the variable via OSC!
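If you want to sanity-check the connection without Pure Data first, here’s a minimal sketch in Python using the python-osc library (pip install python-osc). I’m assuming the /vars/<name> address scheme for setting external variables here; check the Improviz documentation if that has changed:

from pythonosc.udp_client import SimpleUDPClient

# Connect to the Improviz OSC server (the port from the terminal log above)
client = SimpleUDPClient("127.0.0.1", 5510)

# Set the "size" variable defined earlier
client.send_message("/vars/size", 2.5)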

Right now I’m using a number box whose values are set by me manually clicking and dragging. I could have the numbers generated randomly by using the [random] object, or even add some level of audio reactivity by using the [adc~] object. If that’s your thing then do it! Keeping to this blog post’s title, I’ll be using a midi controller to change these values. For this next stage you should know that I’m using Ubuntu (20.10) as my operating system. This means that the instructions, especially those concerning connecting a midi controller, may be different for your operating system. Sadly I can’t help with that.
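Going back to the random numbers idea for a moment: it’s simple enough to sketch in Python as well. Here’s a minimal version using the same python-osc setup as above, again assuming the /vars/<name> address scheme:

import random
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5510)

# Send a new random size to Improviz four times a second
while True:
    client.send_message("/vars/size", random.uniform(0, 4))
    time.sleep(0.25)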

Connecting a midi controller to Pure Data is quite easy. I’m using an Akai MPK Mini MKII, but the instructions are pretty much the same for any midi controller. First make sure that Pure Data is exposing at least one midi port. Change your midi backend to ALSA-MIDI in Media > ALSA-MIDI. Then go to Media > MIDI Settings… and make sure you have at least one midi input.

Then, open QjackCtl, click on the Connect button and, under the ALSA tab, connect the MPK Mini MKII output port to the input port of Pure Data.

In Pure Data you can now read the Control Change (CC) values of one of the knobs or pads using the [ctlin] object. On my MPK the first dial (K1) is [ctlin 1]. It outputs values from 0 – 127 (128 values). I want it to change the size of a cube over a range of 0 – 4, so I need to map between the two ranges. I found this very handy mapping abstraction so I’ll be using that. With the ranges mapped I can use the knob on my controller to change the size!
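For the curious, the mapping the abstraction performs is just linear scaling. Here’s a minimal sketch of the same maths in Python:

def scale(value, in_lo, in_hi, out_lo, out_hi):
    # Map value from the range [in_lo, in_hi] to the range [out_lo, out_hi]
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

print(scale(0, 0, 127, 0, 4))    # knob fully anticlockwise -> 0.0
print(scale(64, 0, 127, 0, 4))   # knob around the middle   -> roughly 2.0
print(scale(127, 0, 127, 0, 4))  # knob fully clockwise     -> 4.0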


Pure Data patch and Improviz code are here: pd_improviz_4.zip

For my next trick I want one octave, C5 to B5, to alter the shades of grey of the cube. The [notein] object will tell me the midi note number of the key being pressed. From that I can work out that C5 to B5 is midi notes 48 – 59. Using the [maxlib/scale] object again I can map that range to 0 – 255 and send those values over OSC to a variable in Improviz that will be used by the fill function.
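The same note-to-grey mapping can also be sketched in Python, this time reading the midi input directly with the mido library (pip install mido python-rtmidi). The variable name fillcolour is just something I’ve made up for illustration:

import mido
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5510)

# Listen on the first available midi input and forward key presses
with mido.open_input() as port:
    for msg in port:
        # Some controllers send note_on with velocity 0 instead of note_off
        if msg.type == "note_on" and msg.velocity > 0 and 48 <= msg.note <= 59:
            grey = (msg.note - 48) * 255 / 11  # map notes 48-59 to 0-255
            client.send_message("/vars/fillcolour", grey)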


Pure Data patch and Improviz code are here: pd_improviz_5.zip

For my final form I’ll use one of the pads on the midi controller to toggle a random colour generator.
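Sketched in Python, the toggle might look like the following. The pad’s note number (36 here) and the variable name randomcolour are assumptions for the sake of the example:

import mido
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5510)
toggle = 0

with mido.open_input() as port:
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0 and msg.note == 36:
            toggle = 1 - toggle  # flip between 0 and 1 on each pad hit
            client.send_message("/vars/randomcolour", toggle)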


Pure Data patch and Improviz code are here: pd_improviz_6.zip

One of the advantages of using a midi controller to control visuals in this way is that you can control the audio and visuals simultaneously, rather than one being triggered in reaction to the other. In my experience of doing live visuals it has been quite normal for visuals to move or, as is quite often the case, pulsate in reaction to the amplitude of the music. In fact I did this many years ago for a video for My Panda Shall Fly.

What I’ve sometimes noticed is that there’s latency, and the reactive visuals often feel like they’re coming in too late, after the beat/instrument has hit. Of course the latency can be reduced by adjusting the sensitivity of the audio input device (microphone or line in), but then it’s a fine balancing act of both the musician and visualist adjusting levels. Achievable but a pain!

By having one device/controller triggering both you can, in theory, have both happen simultaneously. Here’s a demonstration of this from October 2020:

As you can see the midi controller is controlling both the visuals and the audio. When I eventually get back to performing live gigs this is definitely something I’m going to explore further. Until then, have fun mixing live coding with midi controllers!

Improviz gifs

Earlier this year fellow visualist and live coder Rumblesan commissioned me to make some gifs for his new live coding software, Improviz. In July he unleashed it into the world!

Looking at the above videos you could easily be forgiven for thinking that it looks a bit like LiveCodeLab. He is, after all, one of the developers of LiveCodeLab. However, Improviz differs in a few ways. As Rumblesan himself explains in the Toplap chat:

the language in Improviz has a lot in common with live code lab, and the basic functionality for shapes, styles, transformations and loops is all pretty much the same. but in terms of implementation and usage they’re very different

lcl is using three.js as an intermediary, whilst improviz is entirely haskell and uses opengl directly (which I think long term is going to cause me grief but we’ll see haha)

the major difference is that improviz lets you use images and gifs as textures, which is something I’d like to back port to lcl, but wouldn’t be a small task unfortunately

That’s right, you can load textures! As mentioned before, Rumblesan commissioned me to make a set of gifs to go along with the initial public release. They’re all released under a Creative Commons Attribution licence so you’re free to use them as you wish as long as you attribute me.

As an added bonus I’m also releasing the .blend file that was used to make each one.

Click here to download the Blender files.

These were made using a beta version of Blender 2.80. I’ve tested them in the stable release and they appear to work fine but they definitely will not work in 2.79 or earlier versions. I’m providing these for you to explore and won’t be doing a writeup/tutorial on how they work. If you remix them please share what you make 🙂

Definitely give Improviz a try! Thanks to Rumblesan for commissioning me to make the gifs 🙂

Livecode Festival #2 – Visualists Meetup – 1st September 2018

On Saturday 1st September I’m organising a meetup for visualists as part of Livecode Festival #2 at Access Space in Sheffield:

A session for live coding visualists (at any level) led by Antonio Roberts (aka hellocatfood), to talk about their tools and how they perform, with a focus on Algorave visuals.

A core part of the session will be discussion around key questions for live code visualists; how do you pace yourself in a performance? Should we aim to build up slowly or go straight in with loud visuals? How much can you truly respond to the music? Is it important to show the code, and how does it fit with the musician’s projection?

The session will run from 11:00 – 16:00 and will include workshops in Pure Data/GEM (led by me), Hydra (led by Will Humphries) and Livecodelab (led by Guy John).

Get your tickets here! And whilst you’re in the area get a ticket for the Algorave on the same night at 20:00 😉

Thoughts on live coding visuals in Pure Data

I took part in an Algorave in Gateshead on 26th April. Apart from being incredibly awesome, it was my first time live coding – or rather live patching – visuals in Pure Data from scratch. I emphasise from scratch because nearly all of my performances involve me modifying existing patches, but never starting with a completely blank canvas. I also occasionally used the HSS3jb as a texture for objects, but never on its own. It’s also great for when crashes occur, which is/was often ;-). Here are a few samples of my visuals. Videos by Mariam Rezaei:

I learnt a few things about Pure Data that night, and my general opinion is that it isn’t that great as a live coding visuals tool.

One of the first issues is encapsulation of objects. This can be done quite easily but it’s a manual process which involves cutting all the cords and reconstructing the patch. That is, you have to cut the selection of objects, paste them into a subpatch and then reattach it. By way of comparison, Max/MSP already has this as a feature, whereas for Pure Data this wasn’t mentioned at all on the bug tracker (update: a feature request is now on the bug tracker). Not being able to auto-encapsulate objects makes reuse a bit more difficult and cumbersome, which resulted in some really messy patches from me on the night.

Algorave patches

This also relates to another issue: object insertion. When I was building my patches I would often have to preempt what I would need. I nearly always started with [gemhead]-[translateXYZ]-[rotateXYZ]-[repeat 10]-[rotateXYZ]-[translateXYZ]-[color]-[cube]. Inserting any additional objects required me to cut the cord and therefore the screen output. This would be solved if there were, for example, a method whereby, if two objects were selected, the next object created was inserted in between them. This is obviously an over-simplified specific use case which would need more thought behind it. Again, there was no mention of this on the bug tracker (update: a feature request is now on the bug tracker).

There were other thoughts I had on the night, such as the inconsistencies and clumsiness of using the [repeat] object, the lack of a snap-to-grid option for aligning objects, the tiny size of inlets and outlets – even when the objects themselves may be huge – which is only exaggerated when using a 13″ 1080p screen, and the lack of a toolbar (yes, I am aware of GUI plugins). But the two above are the ones which I felt would’ve helped me most.

Has much else been written about the use of Pure Data for live coding visuals?

Visuals for Com Truise

On June 13th I provided visuals for Com Truise’s gig at the Bull’s Head in Moseley, Birmingham. Aside from a few technical errors (I am a glitch artist after all 😉 ) the gig went really great! Here’s a render of some of the visuals I did, set to Terminal and Air Cal from his debut album, Galactic Melt.


(Sorry about some of the freezes in the video. It seems VJing and recording the video at the same time causes Pure Data to slow down a bit)

It was a great night and thanks to This Is Tmrw for having me do visuals for such a great musician!

BiLE at SOUNDkitchen, 17th March 2011

On Thursday 17th March from 8pm I’ll be providing visuals for BiLE (Birmingham Laptop Ensemble) at SOUNDkitchen Balkan Fusion at the Hare and Hounds.

BiLE are a collaborative group of composers (instrumental and electroacoustic), performers and programmers who are all active members of BEAST (Birmingham ElectroAcoustic Sound Theatre). The core group of six artists were brought together through their shared interest in live performance and desire to perform and improvise together in an interactive ensemble.

This will be my first foray into the land of live visuals and certainly not my last. I’m really excited about this and, as long as my computer doesn’t crash, it’ll be an awesome set! Get your tickets here.