Livecode Festival #2 – Visualists Meetup – 1st September 2018

On Saturday 1st September I’m organising a meetup for visualists as part of Livecode Festival #2 at Access Space in Sheffield:

A session for live coding visualists (at any level) led by Antonio Roberts (aka hellocatfood), to talk about their tools and how they perform, with a focus on Algorave visuals.

A core part of the session will be discussion around key questions for live code visualists: how do you pace yourself in a performance? Should we aim to build up slowly or go straight in with loud visuals? How much can you truly respond to the music? Is it important to show the code, and how does it fit with the musician's projection?

The session will run from 11:00 – 16:00 and will include workshops in Pure Data/GEM (led by me), Hydra (led by Will Humphries) and Livecodelab (led by Guy John).

Get your tickets here! And whilst you’re in the area get a ticket for the Algorave on the same night at 20:00 😉

Libre Graphics Meeting – Live Performance with Visual Programming Languages

Tools for Visual Jockeying (VJing) have historically been built to manipulate pre-existing video to create live visuals. Over time more experimental approaches have emerged that focus on using programming languages to create visuals in a live performance environment. Using my experiences of VJing at Algoraves and other live performances as a focal point, I will present a critical insight into some of these tools, consider the benefits of a visual approach to coding, and discuss how typical programming functions can be used to create visuals in a live environment.

Radiophonic

Edited arts presents: Radiophonic
A tribute to the BBC Radiophonic Workshop on its 60th anniversary

A day and night event – a mini-festival – at Down Lane Studios, a brand new multi-functional 400-capacity ex-industrial space in Tottenham.

Live musicians, DJs, talks, workshops, film screenings and multi-disciplinary artworks.

Algorave – Bluedot Festival

Bluedot with OVO Energy is an award-winning festival of discovery at the grounds of a deep space observatory. Set against a backdrop of the iconic Lovell Telescope at Jodrell Bank, bluedot combines a truly stellar line-up of music with a ground-breaking programme of live science experiments, expert talks and immersive artworks.

An Algorave is a party where electronic music is generated live from algorithms. The word was coined around 2012, initially as a kind of joke, but has since taken hold with Algoraves taking place in over 40 cities around the world.

At an Algorave, the creation of algorithms is brought into the experience of the music itself. This process is opened up by projecting the code on screens in the venue, so audience members can see how the music they hear is being made. This is often complemented by algorithmically generated visuals projected alongside the code.

Thinking Out Loud

‘Thinking Out Loud’ is the fifth Data as Culture art exhibition at the Open Data Institute. The exhibition is built around the practice of the 2016 ODI Sound Artist in Residence, Alex McLean, with a group of artists, designers, makers and musicians that he has collaborated with. Openness and processes of making – where any end results are left partly undone – are at the heart of many of the projects on display. The exhibition draws connections between the ways in which humans have captured, encoded and distributed data, and made it meaningful through pattern throughout history. From pre-Columbian Quipu and the ancient art of weaving to computer software environments, it introduces us to creative notions of code, and the ways in which it can carry both language and thought.

The exhibition features artists and makers who are driven by radical intentions to expose the inner workings of the systemic structures we live with. We are encouraged to engage with these ourselves through art, software, folk songs, glitch aesthetics, chance encounters and knitted jumpers.

Artists: Felicity Ford, David Griffiths and Julian Rohrhuber, Ellen Harlizius-Klück, Dan Hett, David Littler, Alex McLean, Antonio Roberts, Sam Meech, Amy Twigger-Holroyd

Curated by Alex McLean and Hannah Redler

Algomech Algorave

At AlgoMech 2017 we aim to take Algorave to the next level, bringing together some of the best algorithmic (and mechanical) dance music producers and VJs, playing over Sheffield’s fiendish DangerNoise soundsystem, with immersive projections covering the walls of Millennium Gallery. As with the rest of the festival we’ll be mixing mechanisms with the algorithms, showcasing repetitive dance music made from handmade robots as well as live code.

Five Days of Pure Data – Live Coding

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the last few days I have been releasing a few of these patches and techniques that I implement when programming in Pure Data.

Live Coding

The last tutorial isn't so much about programming with Pure Data as it is about performing with it; more specifically, live coding. Live coding takes programming to a performative level. There's been a lot of writing recently about Live Coding so take a look around. I've been programming live visuals in Pure Data since 2014 (a lot happened that year). It's not without its problems, some of which I've written about in the past.

One problem I've found is displaying your patch alongside your visuals. For live coding musicians this isn't much of a problem, as the output of the live patching is sent to the sound card, not necessarily the screen. For people live coding visuals in Pure Data, the projector is needed for both the visual output and the patch itself. If only there were a way to overlay the patches onto the visuals (like in Fluxus, Cyril and Livecodelab)!

The most reliable cross-platform way to do this is to use Open Broadcaster Software (OBS). OBS is a popular piece of software used by many gamers for livestreaming. One of its best features is the ability to combine multiple media sources into a single output, which can then be streamed, sent to a projector or recorded to a video file.

To overlay your patch onto your visuals, first create a GEM window with some graphics in it. In OBS, under Sources click the + button and choose Window Capture. Give the new source a name and then, under Window, choose your GEM output window.

Next, create another Window Capture source, give it a name (perhaps Pure Data patch) and point it at your patch window. You may have noticed that your patch now completely covers your GEM source. No worries! Right-click on your Pure Data patch source and click on Filters. In the next window click on the + button and choose Chroma Key. Give the new filter a name (White).

In the next window, under Key Color Type choose Custom and select white (#FFFFFF) as the colour. You may need to adjust the Similarity and Smoothness settings to get the right look.

If you've just tried this in Pure Data vanilla you'll notice that your object boxes have also become transparent, since they too are white. I'm using Purr Data (which you should try too). In this version of Pure Data all of the boxes are slightly grey, so they stay visible when you apply the Chroma Key.

Imagine a scenario where you're collaborating on an Algorave set with a musician: you both want to project your code, but there's only one projector. What to do?! Luckily a combination of OBS and desktop sharing has you covered.

TidalCycles code from yaxu

If the musician can share their desktop over a network then you can add that as a source in OBS and apply a chroma key filter as before. However, this requires you both to be on the same network, which isn't always possible or practical. If one of you has a recent Android phone, though, you can get over this hurdle and still have high-quality desktop sharing.

On your phone create a wireless hotspot. You don’t need to use mobile data for this so feel free to turn it off.

Connect both machines to this newly created network.

On the remote (musician's) laptop, open Desktop Sharing options and enable the option “Allow other users to view your desktop”. You can choose to require a password and to have a user confirm each new access request.

On the host (visuals) machine, open Remmina Remote Desktop Client.

In this new window click New. In the next window change the Protocol to VNC – Virtual Network Computing. In the Basic tab below click on the … button.

You should see the name of the remote computer listed. Click on it.

Finally, press Connect. Depending on the options chosen earlier, the person at the remote machine may get a prompt asking them to confirm that the connection request is permitted.

Once the connection is approved you should see the remote machine on your computer!!! 🙂 And now you can go back to OBS and add that window as a source, apply a Chroma Key effect and then overlay it onto your visuals.

Sadly you can't then use this source as a texture on your objects in Pure Data, but it's a great start towards merging live-coded visuals and music. I first tried combining my own code and visuals at the Chemical Algorave in Newcastle and it went really well. I also made use of OBS's “Apply LUT” filter to change the colour of my whole video output.


Chemical Algorave

I then tried incorporating a musician's code at the Algorave at BUMP in Kortrijk.


I’m hoping that this will help lead to more collaborations between Algorave musicians and visual artists 🙂

Five Days of Pure Data – Stop Motion

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Stop Motion

It was at the Co-Position meeting of the Libre Graphics Research Unit in 2012 that I first encountered Toonloop. Its creator, Alexandre Quessy, gave a live performance using lights, Lego pieces and other objects. I was really in awe of how stop-motion was used to create such an enjoyable performance.

Of course my first instinct was to try to recreate this in Pure Data. I wasn't the first to try, and in fact I have some memory of seeing Toonloop's creator himself trying to write something similar in Pure Data, although I can't find any sources…

My first attempt used [pix_write] to write out a series of images and then play them back using [pix_image]. The problem there is that there's no easy way to read an arbitrary set of images from a directory.

In the end I learnt about [pix_buffer_write], which allowed me to store an image frame into a buffer that I can then call back using [pix_buffer_read]. So that's the basic functionality sorted! When I went to Databit.me in 2013 Axel Debeul helped to improve the patch a lot. The improvements allowed me to save a video from each animation. You can find the most recent version of the patch below.

That’s where the problems start to arise, some of which I haven’t solved yet. The videos are created via [pix_record]. When a frame is captured it is sent to [pix_record] and then recording is paused. However, when you look at the saved video it has a really weird frame rate and doesn’t play smoothly. Even when the frame rate is set explicitly it somehow doesn’t work.

Don’t ask me about the cat ears

Perhaps making a video out of the animations is something to be done in post processing rather than in Pure Data. [pix_record] has always been a very complicated object to work with so perhaps I need to investigate further and try to find the right configuration of all of its many options.
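For what it's worth, here's a rough sketch of what that post-processing route could look like. It assumes the frames have already been saved as numbered PNGs (the file names and frame rate here are placeholders, not what the patch actually uses) and that ffmpeg is installed:

```python
import subprocess

FRAME_RATE = 12                   # explicit playback frame rate (example value)
FRAME_PATTERN = "frame_%04d.png"  # hypothetical naming for the saved frames
OUTPUT = "animation.mp4"

# Stitch the numbered frames into a video with a sane, explicit frame rate,
# keeping Pure Data out of the video-encoding step entirely.
subprocess.run([
    "ffmpeg",
    "-framerate", str(FRAME_RATE),  # frame rate of the input image sequence
    "-i", FRAME_PATTERN,            # the numbered image sequence
    "-c:v", "libx264",              # encode as H.264
    "-pix_fmt", "yuv420p",          # pixel format most players can handle
    OUTPUT,
], check=True)
```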

Five Days of Pure Data – Infinite Scrolling

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Infinite Scrolling

For a performance at ChipFest 10 in 2014 I wanted to recreate the look of a scrolling video game, but within Pure Data. Sure, I could have just used a console emulator or recorded a clip and used that, but doing it all within Pure Data gives me more flexibility in how I perform. Here's what I made:

In the end I didn't use it, but a few of the techniques I learnt found their way into later performances.

The key to those visuals, and to things like 2D side-scrolling computer games, is an infinitely scrolling background: a single image which tiles and repeats seamlessly as it moves across the screen. Not sure what I mean? Check out these examples from TV history:

Borrowing a bit from this answer on the Game Development Stack Exchange, first you create a single tile of size, say, 512×512. You scroll it across the screen horizontally, and once its position gets to 512px (or -512px) it jumps back to its original position.

If the original tile is repeated a few times and then offset so that the borders of the tile can't be seen, then suddenly you have an infinite scrolling background! To repeat an object in Pure Data we use the [repeat] object. This object repeats any object and its translations. So, if a [cube] has a translation of 1 on the X axis and is [repeat]ed 4 times, then the first [cube] is offset by 1, the second is offset by 1 relative to the first (2 in total), the third by 1 relative to the second (3 in total), and so on.

In action this looks like this:

Success!
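If the patching is hard to picture, the same logic can be written out in a few lines of Python. This is just a sketch with made-up values for the tile size, speed and number of copies: the wrap-around in step() is what makes the scrolling seamless, and tile_positions() mirrors the way [repeat] stacks up one translation per copy.

```python
TILE_W = 512   # width of one tile in pixels (example value)
SPEED = 4      # pixels moved per frame (example value)
COPIES = 4     # how many copies of the tile, like [repeat 4]

scroll = 0

def step():
    """Move the background left by one frame, wrapping seamlessly."""
    global scroll
    scroll -= SPEED
    if scroll <= -TILE_W:   # once a full tile width has scrolled past...
        scroll += TILE_W    # ...snap back; the repeated tiles hide the jump

def tile_positions():
    """X position of each copy, each offset by one tile width from the last."""
    return [scroll + i * TILE_W for i in range(COPIES)]

# Run a few frames to see the copies march along and wrap around
for _ in range(3):
    step()
    print(tile_positions())
```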

You can also do the same with the Y and Z axes. For the Z axis you just need to decide where the vanishing point is. Or you could always use fog 😉

The patch for this is really quite simple but I’ve made it available for download below:

Five Days of Pure Data – Randomise Text

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Randomise Text

In the early 2010s I had quite an interest in zines. I had co-organised the Birmingham Zine Festival and was quite regularly reading this. As a result, in 2011 I started working with a friend, Rebecca Evans, on a collaborative zine. The concept was that we would interpret each other's way of working using our regularly used tools. Rebecca specialises in textiles and has quite a skill at crocheting, amongst other things. I, on the other hand, can barely operate a sewing machine and feel much more at home using a soldering iron or computer.

For Rebecca this meant trying to create some sense of noise and randomness but using sewing. The results, even if only tests, looked quite awesome!

For my own take on this I wanted to continue the text-based work that I was creating at the time. Rebecca gave me two poems to work with: Lovesong by Ted Hughes and Lady Lazarus by Sylvia Plath.

I wanted to remix the works by jumbling the words and/or sentences. Looking at recent(ish) work you can see that this would become a bit of a theme in my work and references. spɛl ænd spik, Silver Screen Changeable, and my reinterpretation of Variations on a Theme by Casey & Finch by Erik Bünger each look at cutting up and rearranging words.

Cutting up words and rearranging them can be achieved in many languages using just a few lines of code. One method is to put each word or sentence into an array and then print the array's items in a random order. In Pure Data it isn't as easy.
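For comparison, here's roughly what that looks like in Python (just a sketch, with a placeholder filename):

```python
import random

# Read the poem, split it into words and jumble their order.
with open("lovesong.txt") as f:   # placeholder filename
    words = f.read().split()      # one list item per word
random.shuffle(words)             # randomise the indexes
print(" ".join(words))            # print the remixed text
```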

For most cases Pure Data has the [tabread] and [tabwrite] objects. To use [tabwrite] you specify the index and then write data to it. You then move onto the next index number and write more data to it. To read the data back you do the inverse but with [tabread]. This works great with numbers but not with text.

Y'see, Pure Data has several data types. Just as PHP, Python and others have strings, numbers, text, etc., Pure Data has symbols, floats, integers etc. The character "6", for example, could be a number (the default) or a symbol – using a combination of a number box, [floattosymbol] and then a symbol box. As a number it can be used in arithmetic operations; as a symbol it's good for when you don't want it to be treated as a number. The unfortunate thing for me/us is that [tabread/write] doesn't accept symbols, only numbers. Damn! Enter [m_symbolarray] from rjlib.

Using this abstraction you can put symbols into an array in the same way as numbers. With that problem solved I could now decide whether I wanted to jumble words or sentences. In the former case it would be unlikely to return anything that makes sense, which is OK. In the latter case it could still produce interesting results that could fool some people into thinking the text was as it should appear. So, it was important to have the option to do both.

Using [textfile] you can specify what the delimiter is. A delimiter marks where one list item stops. By default [textfile] uses a space as the delimiter. By sending the message [read textfile.txt cr( to [textfile] I can tell it to use a carriage return (the Enter key) as the delimiter instead. Unfortunately there isn't great documentation on what the other options are!

Using the [until] object and a carefully crafted series of operations, I built a patch that does the following: decide on a delimiter, read the text file, output each list item to the [m_symbolarray] object, and cycle through the file until the end is reached.

With the table now populated, here's where the interactive part comes in. A user can decide either to dump the contents of the array into a text file or to spit out each item to, say, an object for displaying on a screen.
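As a point of comparison, the whole pipeline is only a handful of lines in Python. This is a rough equivalent rather than a translation of the patch: pick a delimiter (whole lines or single words), fill the array, then either dump the jumbled result to a text file or spit the items out one at a time. Filenames here are placeholders.

```python
import random

BY_LINES = True   # True: one sentence/line per item (the "cr" delimiter)
                  # False: one word per item (the default space delimiter)

with open("ladylazarus.txt") as f:   # placeholder filename
    text = f.read()

items = text.splitlines() if BY_LINES else text.split()
random.shuffle(items)

# Option 1: dump the whole jumbled array into a text file
with open("remix.txt", "w") as f:
    f.write("\n".join(items))

# Option 2: spit out each item one at a time (e.g. to display on screen)
for item in items:
    print(item)
```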

You can download the finished patch below:

Below are my remixes of the two poems for you to download:

It's a shame that the zine was never finished, but it was still a great learning experience.

I’m sure that by now the relatively new object can solve many of the problems I had using [textfile] to read the file, so perhaps at some point in the future there will be an update.