Algomech Algorave

At AlgoMech 2017 we aim to take Algorave to the next level, bringing together some of the best algorithmic (and mechanical) dance music producers and VJs, playing over Sheffield’s fiendish DangerNoise soundsystem, with immersive projections covering the walls of Millennium Gallery. As with the rest of the festival we’ll be mixing mechanisms with the algorithms, showcasing repetitive dance music made from handmade robots as well as live code.

Five Days of Pure Data – Live Coding

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the last few days I have been releasing a few of these patches and techniques that I implement when programming in Pure Data.

Live Coding

The last tutorial isn't so much about actually programming with Pure Data as it is about performing with it. More specifically, live coding. Live coding takes programming to a performative level. There's been a lot of writing recently about live coding so take a look around. I've been live coding visuals in Pure Data since 2014 (a lot happened that year). It's not without its problems, some of which I've written about in the past.

One problem I've found is displaying your patch alongside your visuals. For live coding musicians this isn't much of a problem, as the output of the live patching is sent to the sound card, not necessarily the screen. For people live coding visuals in Pure Data, the projector is needed both for the visual output and for the patch itself. If only there were a way to overlay the patches onto the visuals (like in Fluxus, Cyril and Livecodelab)!

The most reliable cross-platform method to do this is to use Open Broadcaster Software (OBS). OBS is a great piece of software used by many gamers for livestreaming. One of its great features is the ability to combine multiple media sources into a single output, which can then be streamed, sent to a projector or saved to a video file.

To overlay your patch onto your visuals, first create a GEM window with some graphics in it. In OBS, under Sources click the + button and choose Window Capture. Give the new source a name and then, under Window, choose your GEM output window.

Next, create a new Window Capture source and give it a name (perhaps Pure Data patch). You may have noticed that your patch completely covers your GEM source. No worries! Right-click on your Pure Data patch source and click on Filters. In the next window click on the + button and choose Chroma Key. Give the new key a name (White).

In the next window, under Key Color Type choose Custom and then select the colour. You should choose white (#FFFFFF). You may need to change the settings under Similarity and Smoothness to get the right look.
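If it helps to picture what the Chroma Key filter is doing, here's a rough sketch of the idea in Python using numpy. This is just an illustration of the keying concept, not how OBS implements it: pixels close enough to the key colour are treated as transparent, so the visuals underneath show through.

    # Sketch of chroma keying: composite the patch capture over the visuals,
    # treating anything close to the key colour (white here) as transparent.
    import numpy as np

    def chroma_key_overlay(patch_capture, visuals, key=(255, 255, 255), similarity=80):
        # patch_capture and visuals are (height, width, 3) uint8 RGB arrays of the same size.
        # similarity: how close (0-441) a pixel must be to the key colour to be keyed out.
        distance = np.linalg.norm(patch_capture.astype(float) - np.array(key, dtype=float), axis=-1)
        keep = (distance > similarity)[..., np.newaxis]  # True where the patch stays opaque
        return np.where(keep, patch_capture, visuals).astype(np.uint8)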

If you've just tried this in Pure Data vanilla you'll notice that your objects are also transparent. I'm using Purr Data (which you should use too). In this version of Pure Data all of the boxes are slightly grey, so they aren't keyed out when you apply the Chroma Key.

Imagine a scenario where you're collaborating on an Algorave set with a musician and there's only one projector. You both want to project your code. What to do?! Luckily a combination of OBS and desktop sharing has you covered.

TidalCycles code from yaxu

If the musician can share their desktop over a network then you can add that as a source in OBS and do the usual thing of applying a chroma key filter. However, this requires you both to be on the same network, which isn't always possible or practical. If one of you has a recent Android phone you can get over this hurdle and still have high quality desktop sharing.

On your phone create a wireless hotspot. You don’t need to use mobile data for this so feel free to turn it off.

Connect both machines to this newly created network.

On the remote (musician's) laptop, open Desktop Sharing options and enable the option "Allow other users to view your desktop". You can choose to require a password and to have a user confirm each new access request.

On the host (visuals) machine open Remmina Remote Desktop Client.

In this new window click New. In the next window change the Protocol to VNC – Virtual Network Computing. In the Basic tab below click on the … button.

You should see the name of the remote computer listed. Click on it.

Finally, press Connect. When you try to connect to the remote machine you may get a prompt asking the user to confirm that the connection request is permitted.

Once the connection is approved you should see the remote machine on your computer!!! 🙂 And now you can go back to OBS and add that window as a source, apply a Chroma Key effect and then overlay it onto your visuals.

Sadly you can't then use this source as a texture on your objects in Pure Data, but it's a great start to merging visuals and music live code. I first tried combining my own code and visuals at the Chemical Algorave in Newcastle and it went really well. I also made use of OBS's "Apply LUT" filter to change the colour of my whole video output.


Chemical Algorave

I then tried incorporating a musician's code at the Algorave at BUMP in Kortrijk.


I’m hoping that this will help lead to more collaborations between Algorave musicians and visual artists 🙂

Five Days of Pure Data – Stop Motion

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Stop Motion

It was at the Co-Position meeting of the Libre Graphics Research Unit in 2012 that I first encountered Toonloop. Its creator, Alexandre Quessy, gave a live performance using lights, Lego pieces and other objects. I was really quite in awe of how stop motion was used to create such an enjoyable performance.

Of course my first instinct was to try to recreate this, but in Pure Data. I wasn't the first to try, and in fact I have some memory of seeing Toonloop's creator himself trying to write something similar in Pure Data, although I can't find any sources…

My first approach to recreating Toonloop was to use [pix_write] to write a series of images and then play those back using [pix_image]. The problem there is that there's no easy way to read an arbitrary set of images from a directory.

In the end I learnt about [pix_buffer_write], which allowed me to store an image frame in a buffer which I can then recall using [pix_buffer_read]. So that's the basic functionality sorted! In 2013 Axel Debeul helped to improve the patch a lot. The improvements allowed me to save a video of each animation. You can find the most recent version of the patch below.

That’s where the problems start to arise, some of which I haven’t solved yet. The videos are created via [pix_record]. When a frame is captured it is sent to [pix_record] and then recording is paused. However, when you look at the saved video it has a really weird frame rate and doesn’t play smoothly. Even when the frame rate is set explicitly it somehow doesn’t work.

Don’t ask me about the cat ears

Perhaps making a video out of the animations is something to be done in post processing rather than in Pure Data. [pix_record] has always been a very complicated object to work with so perhaps I need to investigate further and try to find the right configuration of all of its many options.
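If the post-processing route is the way to go then assembling the video doesn't need to happen in Pure Data at all. Here's a rough sketch in Python using OpenCV, assuming the frames were saved out as numbered PNGs (e.g. via [pix_write]); the point is that the frame rate can be set explicitly rather than left to [pix_record]:

    # Sketch of the post-processing approach: read numbered frames saved from
    # Pure Data and assemble them into a video at an explicit, fixed frame rate.
    # Assumes frames named frames/frame0000.png, frames/frame0001.png, ...
    import glob
    import cv2

    frames = sorted(glob.glob("frames/frame*.png"))
    height, width = cv2.imread(frames[0]).shape[:2]

    writer = cv2.VideoWriter(
        "animation.mp4",
        cv2.VideoWriter_fourcc(*"mp4v"),
        12,  # explicit frame rate, a typical choice for stop motion
        (width, height),
    )

    for path in frames:
        writer.write(cv2.imread(path))

    writer.release()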

Five Days of Pure Data – Infinite Scrolling

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Infinite Scrolling

For a performance at ChipFest 10 in 2014 I wanted to recreate the look of a scrolling video game but within Pure Data. Sure, I could have just used a console emulator or recorded a clip and used that but doing it all within Pure Data allows me to have some more flexibility with how I perform. Here’s what I made:

In the end I didn't use it, but I did use a few of the techniques I learnt in later performances.

The key to those visuals, and to things like 2D side-scrolling computer games, is having an infinite scrolling background. That is, a single image which tiles and repeats seamlessly as it moves across the screen. Not sure what I mean? Check out these examples from TV history:

Borrowing a bit from this answer on the Game Development Stack Exchange, first you create a single tile of size, say, 512×512. You scroll it across the screen horizontally, and once its position gets to 512px (or -512px) it jumps back to its original position.

If the original tile is repeated a few times and then offset so that the borders of the tile can’t be seen then suddenly you have an infinite scrolling background! To repeat an object in Pure Data we use the [repeat] object. This object repeats any object and its translations. So, if a [cube] has a translation of 1 on the X axis and is [repeat]ed 4 times, then the first [cube] is offset by 1, the second [cube] is offset by 1 in relation to the first [cube] (or offset by 2 in total). The third cube is offset by 1 in relation to the second cube (or 3 in total), and so on.
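Outside of Pure Data the same wrap-and-repeat logic looks something like this quick Python sketch (the numbers are made up, apart from the 512px tile from above):

    # A 512px-wide tile scrolls horizontally and snaps back once it has moved a full
    # tile width; repeated copies, each offset by one tile width, hide the seam.
    TILE_WIDTH = 512
    SPEED = 4      # pixels per frame (arbitrary)
    REPEATS = 4    # how many copies of the tile are drawn, like [repeat] in GEM

    position = 0

    def update():
        # Advance the scroll position, wrapping after one full tile width.
        global position
        position -= SPEED
        if position <= -TILE_WIDTH:
            position += TILE_WIDTH  # the jump back is invisible because the tile repeats

    def tile_offsets():
        # X offset of each repeated tile, each one tile width further along.
        return [position + i * TILE_WIDTH for i in range(REPEATS)]

    for frame in range(3):
        update()
        print(tile_offsets())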

In action this looks like this:


You can also do the same with the Y and Z axes. For the Z axis you just need to decide where the vanishing point is. Or you could always use fog 😉

The patch for this is really quite simple but I’ve made it available for download below:

Five Days of Pure Data – Randomise Text

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Randomise Text

In the early 2010s I had quite an interest in zines. I had co-organised the Birmingham Zine Festival and was quite regularly reading this. As a result, in 2011 I started collaborating with a friend, Rebecca Evans, on a collaborative zine. The concept was that we would interpret each other's way of working using our regularly used tools. Rebecca specialises in textiles and is quite skilled at crocheting, amongst other things. I, on the other hand, can barely operate a sewing machine and feel much more at home using a soldering iron or computer.

For Rebecca this meant trying to create some sense of noise and randomness but using sewing. The results, even if only tests, looked quite awesome!

For my own take on this I wanted to continue the text based work that I was creating at the time. Rebecca gave me two poems to work with: Lovesong by Ted Hughes and Lady Lazarus by Sylvia Plath.

I wanted to remix the works by jumbling the words and/or sentences. Looking at my recent(ish) work you can see that this would become a bit of a theme in my work and references. spɛl ænd spik, Silver Screen Changeable, and my reinterpretation of Variations on a Theme by Casey & Finch by Erik Bünger each look at cutting up and rearranging words.

Cutting up words and rearranging them can be achieved in many languages using just a few lines of code. One method is to put each word or sentence into an array and then print those indexes in the array randomly. In Pure Data it isn’t as easy.
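As a point of comparison, here's roughly what those few lines look like in Python. It's a throwaway sketch and nothing to do with the Pure Data patch (the filename is made up):

    # Split a poem into words or sentences (lines), then output them in a random order.
    import random

    with open("lovesong.txt") as f:  # hypothetical filename
        text = f.read()

    words = text.split()        # delimiter: any whitespace
    lines = text.splitlines()   # delimiter: carriage return / newline

    random.shuffle(words)
    random.shuffle(lines)

    print(" ".join(words))      # jumbled words: unlikely to make sense
    print("\n".join(lines))     # jumbled lines: can still read as if intended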

For most cases Pure Data has the [tabread] and [tabwrite] objects. To use [tabwrite] you specify the index and then write data to it. You then move onto the next index number and write more data to it. To read the data back you do the inverse but with [tabread]. This works great with numbers but not with text.

Y’see, Pure Data has several data types. Just like php, python and others have strings, numbers, text, etc, Pure Data has symbols, floats, integers etc. The character “6”, for example could be a number (default) or a symbol – using a combination of a number box, [floattosymbol] and then a symbol box. As a number it can be used in arithmetic operations, as a symbol it’s good for when you don’t want it to be treated as a number. The unfortunate thing for me/us is that [tabread/write] doesn’t accept symbols, only numbers. Damn! Enter [m_symbolarray] from rjlib.

Using this abstraction you can put symbols into an array in the same way as numbers. With that problem solved I could now decide whether I wanted to jumble words or sentences. In the former case it would be unlikely to return anything that makes sense, which is ok. In the latter case it could still produce interesting results that could fool some people into thinking the text was as it should appear. So it was important to have the option to do both.

Using [textfile] you can specify what the delimiter is. A delimiter specifies where one list item stops. By default [textfile] uses a space as a delimiter. By sending the message [read textfile.txt cr( to [textfile] I can tell it to use a carriage return (the Enter key) as the delimiter instead. Unfortunately there isn't great documentation on what the other options are!

Using the [until] object and a carefully ordered series of operations I built a patch to do the following: decide on a delimiter, read the text file, output each list item to the [m_symbolarray] object, and cycle through the text file until the end is reached.

With the table now populated here’s where the interactive part comes in. A user could decide to either dump the contents of the array into a text file or spit out each item to, say, a [text] object for displaying on a screen.

You can download the finished patch below:

Below are my remixes of the two poems for you to download:

It's a shame that the zine was never finished, but it was still a great learning experience.

I’m sure that by now the relatively new [text] object can solve many of the problems I had using [textfile] to read the file, so perhaps at some point in the future there will be an update.

Five Days of Pure Data – Image to Signal

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Image to Signal

If you're into experimental ways of creating visuals and happened to be on the internet around 2013 you may have been sent information about software called PixiVisor. The software, developed by Alexander Zolotov, allows one to transmit an image via an audio signal to another device, which decodes it and displays it as an image. In this way you can then apply audio effects to the signal and make some pretty cool visuals. I never tried it myself but was quite impressed by what I saw.

Being the open source kinda guy that I am, and a devout user of Linux, I was interested in knowing if there was a way of reproducing this using open source software.

sloev got in touch with me some time later in 2013 to demonstrate a Pure Data patch he made that quite faithfully reproduced PixiVisor!

Like PixiVisor this worked by having two devices, one to transmit and another to receive. I quite quickly and roughly rewrote the patches so that this process was done all in one patch. Its first outing was the 2013 Pecha Kucha event in Coventry where I provided visuals for Ashley James Brown/Arctic Sunrise.

The patch works by using the [pix2sig] and [sig2pix] objects. As their names suggest, they convert audio signals to pixels and vice versa. The key to how the PixiVisor remix patch works is setting the block size. This tells your computer how many audio samples to process per frame (block) and therefore how many pixels.

By default in Pure Data this is set to 64, which would only let us work with an 8 x 8px image, which is tiny. By increasing the block size to 4096 we can begin to work with images of size 64 x 64 (because 64 x 64 is 4096).

You might now be thinking that you can keep increasing the block size and work with higher resolution images. Well, that's not really possible unless you want to sacrifice frame rate. If we take a commonly used video resolution, 640 x 480, you would need a block size of 307200. Try doing this and come back when your computer has finished crashing 😉
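To put some numbers on that trade-off, here's a quick back-of-the-envelope sketch in Python, assuming a 44.1kHz sample rate: [pix2sig]/[sig2pix] need one audio sample per pixel, so the block size has to equal width × height, and the bigger the block the fewer frames you get per second.

    # Block size vs resolution vs frame rate: one audio sample per pixel,
    # so block size = width * height and frame rate = sample rate / block size.
    SAMPLE_RATE = 44100  # a common default

    for width, height in [(8, 8), (64, 64), (640, 480)]:
        block_size = width * height
        fps = SAMPLE_RATE / block_size
        print(f"{width}x{height}px needs a block size of {block_size} (~{fps:.1f} frames per second)")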

With sloev’s blessing I am now releasing this modified patch.

The patch is set to use the webcam as an input but it can accept any media input that Pure Data can handle. If you don't want to resize everything before bringing it into Pure Data you can always use [pix_resize] to, um, resize your input live. This has the added downside of putting extra strain on your computer.

Perhaps it’s a dream that will never come true, but it would be good to one day be able to have a completely open source, multiplatform solution for video synthesis. I like the look of Lumen but unfortunately it’s Mac only 🙁 Until then, hope you enjoy the patch!

Algorave – Bluedot Festival

Bluedot with OVO Energy is an award-winning festival of discovery at the grounds of a deep space observatory. Set against a backdrop of the iconic Lovell Telescope at Jodrell Bank, bluedot combines a truly stellar line-up of music with a ground-breaking programme of live science experiments, expert talks and immersive artworks.

An Algorave is a party where electronic music is generated live from algorithms. The word was coined around 2012, initially as a kind of joke, but has since taken hold with Algoraves taking place in over 40 cities around the world.

At an Algorave, the creation of algorithms is brought into the experience of the music itself. This process is opened up by projecting the code on screens in the venue, so audience members can see how the music they hear is being made. This is often complemented by algorithmically generated visuals projected alongside the code.

Blood Sport – Live at Cafe Oto video

On 5th May Blood Sport released their latest LP, Live at Cafe Oto which, as the name suggests, is a live recording of a 40 minute set they did as part of their residency at Cafe Oto.

To coincide with its release Blood Sport asked me to create a one-take video. The video below shows track two from the LP, Melts Into.

The full 40 minute video will be made available at a later date. In the meantime you should buy their LP. They will be performing alongside Heavy Lifting at Supersonic Festival on June 16th.

Blood Sport - Live at Cafe Oto

Pure Data for Beginners workshop at #ArtistsCompute2016, 10th September

On 10th September I’ll be delivering a Pure Data for Beginners workshop as part of #ArtistsCompute2016 in Coventry.

Pure Data Patching Circle

Having recognised that computers have changed society beyond measure, #ArtistsCompute2016 is dedicated to mapping, presenting, probing and escalating this impact.

The festival, which is built around a large group exhibition of the same name, features many educational, participatory and enjoyable events including workshops, talks, screenings, performances and parties.

A full, detailed timetable is available here.

The workshop will be a short, intense session focusing on the basics of using Pure Data and GEM. It will take place from 11:00 – 12:30 at Fab Lab Coventry. They will have computers there, but if you are bringing a laptop please ensure it has Pure Data installed and that GEM is working. For details on booking and for information about all the other events please check in with the Office for Art, Design and Technology.

One more thing…
Last Day is surprisingly still installed and is included as part of the festivities. Check it out if you haven’t seen it already before it’s gone!

Coding Camo Workshop – 15th – 17th August

The Office for Art, Design and Technology (led by Ryan Hughes) is delivering its Camouflage Season as a part of Leamington Spa Art Gallery and Museum’s exhibition, Concealment and Deception and as a part of Leamington Camouflage Festival. For its Coding Camo Workshop on the 17th August I’ll be delivering a Pure Data workshop at the headquarters of Office for Art, Design and Technology at Eaton House, Coventry.


This workshop will teach basic computer coding using Processing, Pure Data and Live Code Lab and will result in a collaborative performance. The workshop will be led by Ryan Hughes and participants will benefit from 2 days working with established visiting practitioners, Antonio Roberts and Ashley James Brown. The final day of the workshop will focus on collaboratively composing a new work for performance which explores what camouflage might look like in the age of big data, ubiquitous wi-fi and so-called smart devices.

Tickets for this four day workshop and performance are £40 (£30 concessions/under 20s) and can be obtained by sending Ryan Hughes a ~200 word statement explaining how the workshop would benefit your practice. Deadline is 9th August.