<2022>

</2021>

I don’t have too much to say about this year. It was of course difficult in many ways, but there were some moments of relief, especially when I got to see people IRL. The pandemic and everything else in this world is still happening, but I really valued those little moments, especially around August – October, when I could just do things. Anyway, on with the overview of 2021!

January

Not much happened in January. The Idle Index exhibition at Phoenix concluded with a talk between some of the artists involved.

February

A lot of presentations, lectures and workshops were delivered this year, and that activity all started around February. After lectures for VCU and The Courtauld I did a rather big presentation as part of the Screenwalk series from The Photographers’ Gallery and Fotomuseum. It was a general overview of my practice, but also brought back a sonification performance, this time with added Korg Volca synths.

I also started delivering a four-part Blender workshop series. Side note, check out Devon Ko’s amazing thread on Blender resources and in particular why you shouldn’t follow the donut guy.

Finally this month I announced (Algo|Afro) Futures.

This programme is really important to me, and its launch came at a time when it’s more important than ever to address racism and diversity in every aspect of art and culture.

March

This month I did two presentations which came out of my takeover of the Minorities in STEM Twitter account in 2020. The first was a presentation about live coding for Nerd Nite London. For the second, only two days later, I had the honour of being a Guest Conductor on The Coding Train!

Although I don’t use Processing/p5.js in my work I do really admire the work that Dan Shiffman does and how accessible his tutorials are. He’s really one of my favourite online educators and so it was an honour to be asked to present my work and talk about live coding to his audience. I followed my presentation with a little performance too!

Fundraisers for both of those events helped to raise nearly £10,000 for Coders of Colour 🙂

April

It was fairly quiet in terms of output, partly due to ill health. I spent a week in London delivering workshops for New Town Culture with Company Drinks.

After a long deliberation process I was happy to announce the successful applicants to the (Algo|Afro) Futures programme: Jae Tallawah, Samiir Saunders, Rosa Francesca, and Emily Mulenga.

May

The Rules of Engagement exhibition that I curated for ODI happened back in November 2020, but in May I was in conversation with two of the participating artists, Everest Pipkin and Nick Briz, about their work. Always great to speak with them!

I also made a new video work for the Estuary Festival, which took a trip down the Thames Estuary, showing how it is projected to flood over the next 100 years. The video I created is titled A Short History of Nearly Everything (big-up to fellow Reuben fans).

Making this video was a real learning experience, particularly in making the soundtrack. Since making the soundtracks for Windows Explorer and Nodes I’ve slowly become more considerate about making music, and making a 40-minute soundtrack gave me a lot to think about in terms of pacing and transitions. There’s still more to learn about mixing and mastering, but I could definitely sense that I’d made progress.

I also did visuals at the Overlap Social event, which was my first in-person performance since February 2020!

Also this month I contributed to an NFT artwork that was sold at the Christie’s Proof of Sovereignty auction. The artwork, F473 by Coin Artist, was both an NFT and a game! I was really happy to be working with Coin Artist/Blockade Games again (I previously did some work with them in 2018 on Plasma Bears).

If you’ve been on the internet at all this year you will have no doubt seen the discourse around NFTs. I still have mixed-to-negative opinions about NFTs in general, so it’s very unlikely that you’ll see me “mint” or “drop” my own works.

June

This month I was really excited to finally launch A.M. Darke’s artwork that I commissioned for the ODI Rules of Engagement exhibition, fairlyintelligent.tech. To launch the artwork I was in conversation with A.M. about their work.

Another huge announcement was that I Am Sitting in a Room was acquired by the Government Art Collection!

This artwork, originally made in 2010, is a remix of Alvin Lucier’s (RIP) work of the same name. I’m really happy that they chose to acquire this piece as it marked a significant moment in my career as a digital artist.

July

The big thing to happen this month was the conclusion and celebration event for the (Algo|Afro) Futures programme. After a bunch of in-person and online workshops each of the participants presented a new performance at Vivid Projects!

(Algo|Afro) Futures

The whole event was live-streamed.

I’m really proud of what they all achieved in just a few short months!

I also made a music video for Affirmation by Echo Juliet.

August

I did an online performance for Flash Crash and then presented some illustration work for Birmingham Design Festival’s Creative City exhibition.

September

Another month of not much output, primarily as I was busy preparing new works. I celebrated my birthday with a performance at an Algorave in Sheffield, my first in-person live coding performance since February 2020!

I also produced a new commission for Beatfreeks’ Youth Trends report. You can see the commission and listen to a discussion I had about art made with data.

By far my highlight of September was Errorcamp, which saw around 50 live coders spend a few nights in the Lake District. We did some performances, some workshops, and some talks, but the most important thing for me was all of us just being in the same space together. The community aspect of Algorave and live coding is so important and I was really missing the in-person stuff, so I was glad to be there with others 🙂

Errorcamp

I also appeared as a guest on the BBC Arts & Ideas podcast.

October

This was a big month! It seems as though almost everything that was postponed in 2020 was rescheduled to take place in October 2021! My month of events started with a couple of performances at No Bounds Festival in Sheffield. Part of it was a presentation of prerecorded performances by (Algo|Afro) Futures participants and music by me, and then I performed visuals with both Alex McLean and eye measure.

Then, a week later I presented new video work at the Corridor of Light festival in Manchester. The video, Move Fast And B̴͓̝͉͝r̴̞͕̬͒̾̃e̷̜̖̾̊a̸͙̲̾͐́k̴̡̩̥̆͘ Things is quite possibly my biggest projection and was placed in a very prominent position on Oxford Road. It was such a joy to see people stopping and watching the piece.

Move Fast And B̴͓̝͉͝r̴̞͕̬͒̾̃e̷̜̖̾̊a̸͙̲̾͐́k̴̡̩̥̆͘ Things from Antonio Roberts on Vimeo.

This was another instance where I was really happy with the music that I made. It was made primarily in TidalCycles, which I mostly used as a sequencer, before doing lots of editing in Ardour.

I also co-curated an Algorave, which saw Bad Circulation do their only public performance of the year.

Corridor of Light Algorave

The following week I was in Nottingham to deliver a few workshops and then to install my work for the Cut & Mix exhibition at New Art Exchange. The conversations with the exhibition’s curator, Ian Sergent, started in February 2020, and so I’m glad that the exhibition was postponed rather than cancelled.

Cut & Mix

The exhibition is quite close to my heart. It’s the first time I’ve ever really focused on race in my own work, something I’ve been wanting to do for a long time. I’m already planning an extension of the work and will definitely talk and write more about the concepts behind it in the future. Also happy to see the exhibition get a mention in the Guardian.

On the same night Careful Networks also launched.

November

I started running another short course in Blender and then started doing Hydra workshops at Site Gallery in Sheffield.

I also made a variable glitch typeface and some gifs for the Animate Assembly website.

By far the biggest thing was the Concerning Photography event from The Photographers’ Gallery and the Paul Mellon Centre. I did a keynote artist talk about my practice on the third day of the conference. I’ll update this post with a recording once it’s published.

December

As usual, December was quite quiet. I did a performance for the Between Sound and Concept: Listening with the CCRU event at Coventry Biennial. I was quite ill that day so wasn’t my usual energetic self, but I did use it as a chance to give the Pulse and Nodes tracks their live debut. I’m definitely near the point where I’m ready to make an EP or something more official.

I then took part in a discussion with Alexandra Cardenas at ICLC.

We were discussing diversity within the live coding community and the challenges we have both faced in this area. I was really happy to discuss the (Algo|Afro) Futures programme further. In general I really hope that the programme inspires live coders and algoravers in other communities to start their own similar programme. I would hate it if people saw this programme and thought that the work to combat racism is now “done”.

I finished the year by taking part in the TidalClub Longest Night live stream event. My performance was similar to the CCRU set, but more refined and with a new ending piece.

Also in December a remix I provided for Phoneutrian’s Sercle EP was released! My contribution was a remix of the song Crystal Mashup. Lots of other great songs on there!

End

So that was the year! No specific hopes for the future other than to survive and continue to learn and develop. Vague, I know, but then so is the future.

I Am Sitting In A Room acquired by Government Art Collection

I’m happy to share that my 2010 video, I Am Sitting In A Room, has been acquired by the Government Art Collection as part of Art X-UK, a permanent collection of works by 45 visual artists from across the UK.

In this difficult and unusual year, the Collection invited the networks for nine regions in England, and the networks in Wales, Northern Ireland and Scotland, to nominate artists as part of a special project, Art X-UK. Our collection curators went on a virtual tour of artists’ studios from Penwith to Ballygally, to select works for the Collection to display in government buildings worldwide. Art X-UK, supported by the Advisory Committee on the Government Art Collection, was a unique way of responding to the impact of COVID-19 on the visual arts sector.

This project has enabled the Collection to support 45 artists, acquiring over 90 works, and spending £230,000 across the UK. As part of the nomination process, we asked each network to form a group and put forward the artists’ names, providing a statement on their selection process for transparency, including a list of the selectors. Asked to consider diverse representation of artists within each region, 24 of the artists are women, 2 identify as non-binary, 20 as minority ethnic, 6 as LGBTQ+, and 4 with disclosed disability.

I Am Sitting In A Room is my take on the 1969 artwork of the same name by Alvin Lucier, in which he recorded himself narrating a text, then played the recording back into the room, re-recorded it, played that back, and so on.

In my take I used a process of repeatedly glitching font files. I documented this process in several blog posts at the time, including a recent blog post where I tried to recreate the work in 1920×1080 resolution.
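
That glitching comes down to corrupting bytes inside the font file. As a rough illustration of the idea (not the actual script I used at the time, and the numbers here are arbitrary), this minimal Python sketch randomises a handful of bytes while skipping the start of the file so the header survives:

```python
import random

def glitch_bytes(data: bytes, n_glitches: int = 10,
                 header_size: int = 64, seed: int = 0) -> bytes:
    """Return a copy of `data` with n_glitches bytes randomised,
    skipping the first header_size bytes so the file still opens."""
    out = bytearray(data)
    rng = random.Random(seed)  # seeded so the glitch is repeatable
    for _ in range(n_glitches):
        position = rng.randrange(header_size, len(out))
        out[position] = rng.randrange(256)
    return bytes(out)

# Stand-in for a font file read with open("font.ttf", "rb").read()
original = bytes(range(256)) * 4
glitched = glitch_bytes(original)
```

Run the output through the renderer, feed the result back in, and repeat: that iteration is what produces the gradual decay the piece documents.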

I’m really happy that they chose to acquire this piece as it marked a significant moment in my career as a digital artist. It was one of my first works to incorporate automation/programming, and its screening at GLI.TC/H 2010 in Chicago was one of my first significant international screenings. The piece was even screened at Alvin Lucier’s 80th birthday celebrations in 2011!

Really happy that my work is part of this round of acquisitions along with a lot of great artists 🙂

Controlling Improviz Using Midi via OSC

In 2020 I delivered quite a number of workshops on using the Improviz visuals live coding environment. Improviz can be thought of as a fork of Livecodelab, especially as its developer, Guy John, is one of the developers of Livecodelab. However, it has some key differences that make it stand out as its own unique software:

  • It runs on the desktop, and I think it is faster because of it
  • The language is more fully documented
  • You can load your own textures, gifs, 3D models, and shaders

Being able to load your own textures might in itself be a reason for many people to switch from Livecodelab to Improviz. Things can be just that bit more personalised when you’re using your own images and objects rather than only colours, gradients and basic geometric shapes. Another potentially useful difference is that Improviz lets you interface with the software using Open Sound Control (OSC). This opens up the possibility of using external software or hardware devices. In this blog post I’ll take you through how you can connect a midi controller to Improviz via OSC and Pure Data.

To get started you first need to define a variable in Improviz that you want to be changed by OSC/midi. The name of this variable can be anything as long as it’s not a name already used as a function or variable in Improviz. Check the reference page for a list of reserved names. In my example I’ve used the variable name size.

size = ext(:size, 1)

Next, we need to connect to it via OSC so that we can change its value.

When you launch Improviz via the terminal, one of the messages you’ll see printed is the port on which it listens for OSC messages.

2021-03-25 20:53:.732595  INFO: Running at 640 by 480
2021-03-25 20:53:.732733  INFO: Framebuffer 640 by 480
2021-03-25 20:53:.390032  INFO: Loaded 3 texture files
2021-03-25 20:53:.437047  INFO: Loaded 8 material files
2021-03-25 20:53:.441641  INFO: Loaded 5 geometry files
2021-03-25 20:53:.441718  INFO: *****************************
2021-03-25 20:53:.441766  INFO: Creating Improviz Environment
2021-03-25 20:53:.466755  INFO: Loading ./stdlib/variables.pz
2021-03-25 20:53:.466846  INFO: Loading ./stdlib/transformations.pz
2021-03-25 20:53:.466890  INFO: Loading ./stdlib/shapes.pz
2021-03-25 20:53:.466930  INFO: Loading ./stdlib/style.pz
2021-03-25 20:53:.466968  INFO: Loading ./stdlib/textures.pz
2021-03-25 20:53:.467004  INFO: Loading ./stdlib/screen.pz
2021-03-25 20:53:.467039  INFO: Loading ./usercode/grid.pz
2021-03-25 20:53:.467078  INFO: Loading ./usercode/seq.pz
2021-03-25 20:53:.467116  INFO: Improviz OSC server listening on port 5510
2021-03-25 20:53:.467297  INFO: Improviz HTTP server listening on port 3000
2021-03-25 20:53:.467405  INFO: Improviz resolution: 640 by 480

Of course you can, at this stage, use any software that can send data over OSC, but for this blog post/tutorial I’ll be using Pure Data. Alternatives exist, but I like it as it’s lightweight, stable and cross-platform.

To send OSC messages, use the [udpsend] object to connect to the same IP address as Improviz (usually 127.0.0.1) and the same port (5510). [udpsend] will output a 1 from its only outlet to show a successful connection. With the connection established I can now send values from a number box to the variable via OSC!
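
If you’d rather script it than patch it, an OSC message is just a UDP packet with a simple layout: the address null-padded to a four-byte boundary, a type-tag string (",f" for a single float) padded the same way, then the arguments big-endian. Here’s a hypothetical Python sketch using only the standard library — note that the /vars/size address is my assumption for how Improviz exposes an ext variable, so check the Improviz documentation for the exact path:

```python
import socket
import struct

def osc_pad(chunk: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC spec requires."""
    chunk += b"\x00"
    return chunk + b"\x00" * (-len(chunk) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying a single float argument."""
    return osc_pad(address.encode("ascii")) + osc_pad(b",f") + struct.pack(">f", value)

def send_to_improviz(address: str, value: float,
                     host: str = "127.0.0.1", port: int = 5510) -> None:
    """Fire the message at Improviz's OSC port over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(osc_message(address, value), (host, port))

# e.g. send_to_improviz("/vars/size", 2.5)
```

The same packet layout is what [udpsend] and friends produce under the hood, which is why any OSC-capable software can stand in for Pure Data here.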

Right now I’m using a number box whose values are set by me manually clicking and dragging. I could have the numbers generated randomly using the [random] object, or even add some level of audio reactivity using the [adc~] object. If that’s your thing, do it! Keeping to this blog post’s title, though, I’ll be using a midi controller to change these values. For this next stage you should know that I’m using Ubuntu (20.10) as my operating system. This means that the instructions, especially those concerning connecting a midi controller, may be different for your operating system. Sadly I can’t help with that.

Connecting a midi controller to Pure Data is quite easy. I’m using an Akai MPK Mini MKII, but the instructions are much the same for pretty much any midi controller. First make sure that Pure Data is exposing at least one midi port: change your midi backend to ALSA-MIDI in Media > ALSA-MIDI, then go to Media > MIDI Settings… and make sure you have at least one midi input.

Then, open QjackCtl, click on the Connect button and under the ALSA tab connect the MPK Mini Mk II output port to the input port of Pure Data.

In Pure Data you can now read the Control Change (CC) values of one of the knobs or pads using the [ctlin] object. On my MPK the first dial (K1) is [ctlin 1]. It outputs values from 0 – 127 (128 values). I want it to change the size of a cube over the range 0 – 4, so I need to map the ranges. I found this very handy mapping abstraction so I’ll be using that. With the ranges mapped I can use the knob on my controller to change the size!
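
For reference, the mapping that abstraction performs is plain linear interpolation between two ranges — the same arithmetic, sketched here in Python rather than Pd:

```python
def scale(value: float, in_lo: float, in_hi: float,
          out_lo: float, out_hi: float) -> float:
    """Linearly map value from the range [in_lo, in_hi] to [out_lo, out_hi]."""
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# Knob fully clockwise: CC value 127 becomes the maximum cube size.
size_max = scale(127, 0, 127, 0, 4)   # 4.0
# Knob at its midpoint: CC value 64 lands roughly halfway.
size_mid = scale(64, 0, 127, 0, 4)    # ≈ 2.016
```

The same function covers the later examples too — only the input and output ranges change.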


Pure Data patch and Improviz code are here: pd_improviz_4.zip

For my next trick I want one octave, C5 to B5, to alter the shade of grey of the cube. The [notein] object will tell me the midi number of the key being pressed. From that I can deduce that C5 to B5 is midi notes 48 – 59. Using the [maxlib/scale] object I can map that range to 0 – 255 and send those values over OSC to a variable in Improviz that will be used to change the fill function.


Pure Data patch and Improviz code are here: pd_improviz_5.zip

For my final form I’ll use one of the pads on the midi controller to toggle a random colour generator.


Pure Data patch and Improviz code are here: pd_improviz_6.zip

One of the possibilities of using a midi controller to control visuals in this way is that you can control the audio and visuals simultaneously, rather than one being triggered in reaction to the other. In my experience of doing live visuals it has been quite normal for visuals to move or, as is quite often the case, pulsate in reaction to the amplitude of the music. In fact I did this many years ago for a video for My Panda Shall Fly.

What I’ve sometimes noticed is that there’s latency, and the reactive visuals often feel like they’re coming in too late, after the beat/instrument has hit. The latency can be reduced by adjusting the sensitivity of the audio input device (microphone or line in), but then it’s a fine balancing act of both the musician and visualist adjusting levels. Achievable, but a pain!

By having one device/controller trigger both you can, in theory, have both happen simultaneously. Here’s a demonstration of this from October 2020.

As you can see the midi controller is controlling both the visuals and the audio. When I eventually get back to performing live gigs this is definitely something I’m going to explore further. Until then, have fun mixing live coding with midi controllers!

(Algo|Afro) Futures

I’m happy to launch (Algo|Afro) Futures, a mentoring programme for early career Black artists in the West Midlands who want to explore the creative potential of live coding.

Live coding is a performative practice where artists and musicians use code to create live music and live visuals. This is often done at electronic dance music events called Algoraves, but live coding is a technique rather than a genre, and has also been applied to noise music, choreography, live cinema, and many other time-based artforms.

(Algo|Afro) Futures will take place between April and June, online and at Vivid Projects, and will consist of four sessions. Dates will be confirmed in response to lockdown restrictions and participant availability.

Algorave Birmingham

Four participants will receive mentorship from myself and Alex McLean on all things live coding. Each participant will receive a fee of £100 per mentoring session attended plus reasonable travel expenses.

This opportunity is open to Black West Midlands-based artists only. The call is open now until 23:59 GMT on 14th March. Further information about the programme, FAQs and the application form can be found at the (Algo|Afro) Futures website.

Late at the Library: Algorave

(Algo|Afro) Futures is organised with FoAM Kernow and Vivid Projects, in collaboration with and funded by the UKRI research project “Music and the Internet: Towards a Digital Sociology of Music”.

<2021>

</2020>

2020 was definitely a hard year, which feels a pretty repetitive and redundant thing to say at this point. I did try to stay creative, and did create things, but sometimes I just felt like staying still and watching the world crumble around me. At times just getting out of bed before 12:00 felt like enough of an achievement for one day.

When exhibitions or performances did eventually happen, albeit online, it all felt a bit anticlimactic. Sometimes months of work would go into preparing for an online exhibition or performance. After the adrenaline of the event wore off there was no release, no celebration, no friends around to hug or high-five. Just a sudden comedown: get your pyjamas on and realise that you’ve not travelled more than 50 metres from the kitchen in days. Oh, and the world is still falling apart.

So, with that cheery start here’s most of the things that I got up to in 2020.

January

As usual not much happened in January. Imagine that, an uneventful month. How I’d wish for that right now…

February

In early February I made my way to Limerick to attend ICLC and perform at an Algorave with Maria Witek (mxwx). I feel that the academic side of live coding sometimes passes me by, but what I do like about events like these is the critical reflection on the practice and the gathering of artists from all parts of the world. It helps remind me that live coding is a global thing, not just a UK/Western one.

I really enjoyed performing with Maria. You can see a bit of our performance from around 03:09:00.

Shortly after that I was in Norwich for Love Light Norwich, where I shared a new video work, Let’s Never Meet.

I did a couple of blog posts detailing how I made both the audio and some of the visuals.

March

On 5th March I had the honour of performing at the Algorave at Cafe Oto. I was really nervous as I was making music, not visuals. By this stage I had performed music live a handful of times in venues and online. To then perform at this prestigious venue was daunting but in the end it pushed me to learn and practice more. Here’s a recording of the performance.

Little did I know that this would be my last performance in a venue this year.

On 19th March the year of live streams started. The Eulerroom Equinox took place over three days and featured performances from me and Alex McLean, as well as one of my favourite performances, from me and mxwx:

This event had been in planning since late 2019 but I think it took on new relevance with the whole world now moving online.

Also in this month I did live streams with Echo Juliet and published a lot of blog posts on (mis)using FFmpeg’s motion interpolation commands. To gather all of the findings together I melted a cat:

April

Online group exhibitions and performances dominated my activities from April onward. One of the first was the Well Now WTF? exhibition, which launched on 4th April. This exhibition featured over 140 gifs and videos that raised the question of what we should, or can, do now that everything is cancelled. I contributed a gif in the “Wash Your Fucking Hands” room reflecting on the collective loneliness that comes from online parties.

I did a couple more online live coding events, including a performance with Yaxu for Graham Dunning’s Noise Quest series and a performance for Open Data Institute, where we got cut off halfway through, possibly for copyright violation! Another sign of things to come.

Also in April I did an overview of the Design Yourself project I ran with the Barbican in 2019. Working with a select group of their Young Creatives, we created artwork that asked what it meant to be human in an age of technology. One of the participants, Tice Cin, wrote a really good summary of the programme. Here’s one of my favourite videos:

May

Live streams this month included performances with Yaxu on a Cyberyacht(!) (from 32:00) and a performance for Github (better quality version here).

As part of the Well Now WTF? exhibition I presented Gifhouseparty, a lockdown party for all the gifs stuck at home. The music was all live coded and features music/code from me and mxwx, and also gifs of people you may recognise.

Perhaps the biggest event of this month was the opening of the Copy Paste exhibition on 22nd May at Piksel in Bergen, Norway. As curator I had been planning this exhibition for over a year. I had fully expected it not to go ahead, but the lockdown situation in Bergen at the time allowed events to take place, and so it went ahead, just without me there. A carefully curated online component was added so that some of the works could be enjoyed online.

I’m of course thankful to Piksel for their work in allowing the exhibition to go ahead, but I still can’t help but feel sad that I wasn’t able to be there to see it in person!

Other events this month included another performance with Yaxu for the Copy Paste exhibition, a presentation and discussion about copyright/copyleft at The Photographers’ Gallery, and a performance and presentation at Art Meets Radical Openness. The presentation, called Sorry About That, was about the role that copyright plays in online streaming.

You can watch the presentation here (from 01:40:00), or listen to a rebroadcast of the talk that happened on Radio FRO in July (from around 21:20).

June

This month was kinda quiet. The Copy Paste exhibition continued with events including a presentation from Constant and a workshop from Duncan Poulton. With my skills in audio production getting better I decided to revisit the Wonderland video I made for the Wonder exhibition in 2019 and add a soundtrack.

July

I did visuals for a mix from Reprezent Radio for Late at Tate Online on 17th July. The video’s no longer online so have a couple of gifs!

On 18th July I did two performances in one day! The first was for Oxidize Global, and later me and mxwx collaborated again for a performance at Network Music Festival. Sadly there are no recordings of either performance, but there will hopefully be rerecordings of the music at some point.

Elsewhere in this month I was interviewed by Thisandthatzine and also did a self portrait for it.

August

The collaboration between me and mxwx finally got a name! We’re now known as Bad Circulation and you can find our music here. At the moment it’s just live recordings and rehearsals. We’re working on an EP. In the meantime here’s one of my favourite recordings.

I was also on the selection panel for Hyperlink from Test Card. Congrats to those that were successful!

September

The online component of Copy Paste was included in Ars Electronica. This included the online exhibition as well as a curator’s tour, a rebroadcast of Constant’s presentation, and the performance from me and Yaxu.

I also published a blog post about it being 10 years since the first GLI.TC/H happened in Chicago. It had quite an impact on me in many ways so I felt it right to mark the occasion somehow.

I was also on the selection panel for the Jerwood Arts / FACT Digital Fellowship. I’m intrigued to see what the three selected artists will create next year!

All the way back in February I was on the selection panel for Ten Acres of Sound, “a festival of noise, sound, sonic art, music, performance, whatever located within Stirchley, Birmingham”. I’m glad it managed to happen as it was postponed from earlier in the year.

October

Back in July I was undertaking a “Stay at Home” residency with New Art Gallery Walsall:

In response to the Coronavirus pandemic, The New Art Gallery Walsall initiated a series of remote residencies to support artists to produce work from their homes. Departing from the Gallery’s usual emphasis on making and sharing work within the context of the Gallery’s purpose-built studio space, artists were encouraged to find creative approaches to developing their practice amid imposed national restrictions and, in particular, to explore the benefits and possibilities of engaging with an online audience.

I challenged myself to learn more about film making and make a video using only what I already have at home. Here’s my video called Windows Explorer:

It was a big challenge and I wrote three blog posts detailing each challenge.

I took part in another online group exhibition (this time featuring 50 artists) called The Archive to Come. For this I made a gif/video reflecting on the tearing down of statues and the Black Lives Matter protests. Here’s a lower resolution gif version:

A better quality video can be seen here and you should check out all of the works in the exhibition.

I also (finally) took part in DA Z. This event was cancelled back in March as was a related event in September, and though I wasn’t able to be physically present in Switzerland I was still happy to be part of it.

November

November was unusually busy. Since July I was working behind the scenes with Open Data Institute to curate Rules of Engagement, an online programme of artworks that make a case for ethical practices when working with data.

The commissioned artists were Nick Briz, Everest Pipkin, and A.M. Darke. The artworks were launched at ODI’s annual Summit and are still available online to view now. It was a lot of work to get the programme together but it was a pleasure to commission new work from some great artists!

You can hear myself, Nick Briz and ODI’s Hannah Redler-Hawes talk about the programme on the TECHnique podcast.

The next day on 11th November I presented new work as part of the Peer to Peer online exhibition. I was one of the three UK commissioned artists and created a piece called Nodes.

It’s the first time I’ve been commissioned to make a piece of music (I did make the visuals as well though) and I really enjoyed making it.

Sticking with music, in November the Compassion Through Algorithms Vol. II compilation was released. The compilation is raising funds for Young Minds Together and was created in response to the Black Lives Matter protests and the general recognition that live coding/electronic music is still heavily dominated by White men. I made a track for it called Pulse.

I also did a short blog post about how I made it. It’s still on sale so go buy it!

I provided a screensaver for The Idle Index online exhibition from Phoenix Leicester. It’s delivered via a browser extension which you can install in Chrome.

I also took part in Abuja Art Week‘s digital exhibition with two existing videos, Visually Similar and Abundant Antiques.

Back in September I was a judge, for the second year running, for the Koestler Arts Digital Art category. In November their annual exhibition, this time called No Lockdown of the Imagination, launched. Lockdown prevented me from seeing the works in London in person, but they have an app you can use to view all of the works.

In other selection panel/judging activities, I was on the selection panel for the MADE IT graduate exhibition, which features around 50 artists. The selection process took place between September and October, but the online exhibition launched in November. Congrats to all those selected!

December

A fairly quiet month. From 7th – I did a takeover of the Minorities in STEM Twitter account. Each week on that account a different person talks about their experiences of being, well, a minority working in STEM (Science, Technology, Engineering, Maths). Though my work uses all of those, there’s also the Art side (sometimes called STEAM), so I used some 3,700 words to talk about how all of these overlap. I also talked about how, in my experience of learning about digital art, there was never any talk of Black people, or of anyone other than mainly White men. Things have gotten better since I was in education, but there’s still so much work to do to recognise the contributions of Black people in (digital) art. You can read each of the daily threads here:

  • Monday – how I got into working in art+tech, with a focus on setting up the fizzPOP makerspace
  • Tuesday – glitch art and early experiments in making generative art using code
  • Wednesday – realising that the history of Black people working in art + technology is often overlooked
  • Thursday – demonstrating live coding and talking about Algorave
  • Friday – a handful of the organisations in the UK that are helping to make art and technology more diverse

I ended the year with a performance at the Eulerroom Winter Solstice. I combined live coding using Tidal Cycles with a couple of Korg Volca synths. No video yet but I’ll update when it becomes available.

end

And so ends a crappy year. That sense of community from being part of group exhibitions and performances definitely helped keep me sane and connected, but I really need human contact again. Anything that isn’t a Zoom window… I of course hope that 2021 will be better, but I think we’ll need to fight to keep our galleries, museums, venues and other institutions open. Time and time again our government has shown that it doesn’t value the arts, and I fear that so many of the places I love will be lost next year. Did I also mention that there’s a pandemic still going on?

Laters 2020!

Making Pulse

On November 6th the Compassion Through Algorithms Vol. II compilation was released, raising money for Young Minds Together. The compilation is still available, and of course you can donate directly to Young Minds Together if you prefer.

In this blog post I’ll be going over how I made my track, Pulse.

I’m two years into making music and I’ve recently become more comfortable and confident in my processes. I’ve gotten over the technological hurdles and, having experimented in making music/sounds of different styles both in private and at Algoraves, I feel I’ve found a range of styles that I like making music in. In the live coding music world some of my biggest influences have been eye measure, Miri Kat, Yaxu, and Heavy Lifting. Their work spans many genres but what I’m drawn to in their music is the more sparse, ambient and even sometimes aggressive sounds. I tried to keep this in mind when making Pulse.

As with most things I make, I started by just experimenting. I can’t fully remember my thought process but at some point I landed on turning a kick drum (“bd” in Tidal) sound from a percussive instrument into a pitched one. I achieved this by triggering the sample many times in quick succession and playing with the speed at which it was played back.

setcps (135/60/4)

d1 
$ sound "bd*4"

d7
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

I like the piercing, buzzing nature of the sound and so decided to build the track around it. Next I had to get the tempo right. By default Tidal runs at 135 bpm (0.5625 cps). Running that code at 135 bpm felt way too fast, so I tried bringing it down to 99 bpm.

It’s no longer at a speed to dance to but it makes for better listening. It also meant I could more accurately identify which note the buzzing sound was at. The loopAt command affects the pitch of the samples and is itself affected by the tempo that Tidal is running at, so setting it to 99 bpm (setcps (99/60/4)) revealed that the buzzing sound was at a G-sharp. It’s probably still a little bit out of tune but it’s close enough!
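As an aside, the bpm-to-cps arithmetic is simple enough to sketch as plain Haskell. bpmToCps is my own name, not a Tidal function, and it treats one cycle as a four-beat bar, which is what the setcps lines in this post assume:

```haskell
-- Convert beats per minute to the cycles-per-second value
-- that Tidal's setcps expects, treating one cycle as four beats.
bpmToCps :: Double -> Double
bpmToCps bpm = bpm / 60 / 4

-- bpmToCps 135 gives 0.5625 (Tidal's default tempo)
-- bpmToCps 99 gives 0.4125
```

In a Tidal buffer you’d normally just write the arithmetic inline, e.g. setcps (99/60/4).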

In late August I bought + was given the Volca Bass and the Volca FM synths. By this time I had been using bass samples in this track, but I saw it as an opportunity to give these newly acquired synths a try! The Tidal website has instructions on setting up midi, which worked well. One issue was that I was using two of the same usb-to-midi adaptors. On the surface this isn’t an issue but, at least according to Tidal’s midi instructions, when adding a midi device you do so by name and not by any sort of unique ID. Running MIDIClient.init with both adaptors connected gave me this:

MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")
MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")

I didn’t know which of the two adaptors Tidal was going to send midi messages to, and so had no idea which synth would be triggered! Fortunately Alex McLean was on hand with a (Linux-specific) solution. The dummy Midi Through Port-0 port exists by default, and Alex suggested adding another one. To quote Alex from the Toplap chat:

if you add options snd-seq-dummy ports=2 (or more) to /etc/modprobe.d/alsa-base.conf
you’ll get two of them
the other being
Midi Through Port-1
obvs
then you can tell supercollider/superdirt to connect to them
then go into qjackctl and the alsa tab under ‘connect’ to connect from the midi through ports to the hardware ports you want
then you can make them connect automatically with the qjackctl patchbay or session thingie
I like doing it this way because it means I can just start supercollider+superdirt then change round which midi device I’m using super easily.. plugging/unplugging without having to restart superdirt
I don’t know if this will solve the problem of having two devices with the same name but hopefully..
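For reference, the change Alex describes boils down to one line in the ALSA modprobe config. The file path is as given in his message; this is a sketch for Debian-style systems and the file may live elsewhere on other distros:

```conf
# /etc/modprobe.d/alsa-base.conf
# Load snd-seq-dummy with two virtual "Midi Through" ports
# (Port-0 and Port-1) instead of the default single port.
options snd-seq-dummy ports=2
```

After a reboot (or reloading the snd-seq-dummy module) the second port should show up, ready to be wired to the hardware ports in qjackctl as described above.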

With that all fixed I recorded my track! Here’s a live recording of me, um, recording it. It was made using Tidal; the code is just on a screen out of shot.

As you may have noticed there’s some latency on the Volca Bass. I should have adjusted the latency in Jack to account for this, but at the time I didn’t realise that I could, or even how to. However, I was recording the Volca Bass and FM onto separate tracks in Ardour, so I was able to compensate for the latency afterwards.

On reflection I should have recorded each orbit (d1, d2 etc.) onto a separate track. At the time I didn’t realise I could do this but it’s pretty simple, with clear instructions on the Tidal website, and there are friendly people on the Toplap chat who helped me. This would have allowed me to do additional mixing once everything was recorded (my Tidal stuff is typically way too loud). Aside from those observations I’m really happy with how it sounds! I’ve shared my code below, which may be useful to study, but of course you’ll need Volcas/midi devices to fully reproduce it.

setcps (99/60/4)

d1 -- volca fm
$ off 0.25 ((fast "2") . (|+ note "12 7"))
$ note "gs4'maj'7 ~"
# s "midi1"

d6
$ stack [
sound "kick:12(5,8) kick:12(3,<8 4>)",
sound "sd:2",
stutWith 2 (1/8) ((fast 2) . (# gain 0.75)) $ sound "hh9*4",
sound "bd*16" # speed 2 # vowel "i"
]

d4 -- volca bass
$ fast 2
$ stutWith 2 (1/4) ((|+ note "24") . (slow 2))
$ note "~ ~ ~ gs2*2"
# s "midi2"

d7
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

d2 -- transpose volca fm
$ segment 32
$ ccv 50
$ ccv (range 10 (irand 10+60) $ slow "8 3 7 3 1" $ sine )
# ccn "40"
# s "midi1"

If you enjoyed my track or any of the others on the compilation please consider buying the compilation or making a donation to Young Minds Together and help the fight against racial injustice.

Coder Beatz

Happy to be working with Birmingham Open Media to deliver Coder Beatz, a creative digital programme focusing on live coding for young black kids in the West Midlands.

Coder Beatz is a new creative digital programme for young black kids aged between 11 and 15 years old.
We are running 4 monthly Coder Beatz workshops between November 2020 and February 2021. In each session we will be teaching kids how to create digital music and visuals using live coding and algorithms. The sessions will be delivered by Antonio Roberts who is a renowned digital artist and expert coder. Being a man of colour, Antonio is really passionate about inspiring young black kids to get skilled up on coding music and visuals.

Kids will not need any music or tech experience, and we will provide laptops and headphones for them at BOM’s art center.

Over four sessions I’ll be teaching how to use TidalCycles for making music and Improviz for making visuals. All of the details, including how to sign up, can be found by contacting Birmingham Open Media.

On a personal level I’m really happy to be delivering this programme because, during the six-ish years I’ve been live coding at Algoraves, I’ve noticed that the scene is very good at addressing gender inequalities but, at least in the UK, it’s still very white (which could probably be said of electronic music more generally).

Through delivering the programme I hope to demonstrate the creative possibilities of programming and, while I don’t expect those who take part to become fully fledged Algorave performers, I do hope it encourages them to explore ways of making digital music and art beyond the “standard” ways of using tools like Ableton and Adobe software.

I also recognise that there are other issues that need to be addressed to make live coding more diverse: for example, encouraging more black people to build live coding tools, and recognising and celebrating the impact black culture has had on digital art/music. I hope this programme is part of that process.

Please get in touch with BOM if you’re interested or know anyone who would be great for this!

The Stay at Home Residency – part 3

From 1st – 29th July I was happy to be selected as an artist in residence for The New Art Gallery Walsall’s Stay at Home Residencies.

In the second blog post I looked at how I approached filming. In this third and final blog post I’ll be detailing my sound making process and sharing the finished film.

The next stage in making this film was working on the sound. As you can hear in a couple of the clips in the previous blog post, the area that I live in is really, really quiet! Everyone in the local area was using the Summer to sit outside bathing in the sunlight. It was very relaxing for sure, but recordings of the ambient background noise didn’t make for an interesting soundtrack. At one point there was the sound of a wood chipper but otherwise it was mostly silent. At times me playing music was the loudest sound!

Instead I took to making recordings from within the home. This process made me very aware of the variety, and at times lack thereof, of sounds in my home environment. There’s lots of shuffling, tapping, television and dampened thud sounds. With the exception of the television, the place with the most variety of sounds is most definitely the kitchen, and so most of the sounds I used came from there. There are sounds of glass, metal, wood and water, and even from inside the fridge!

If you’ve been following my work for a while you’ll know that I’ve done a lot of live coding performances over the last two years. I like the liveness of this process and so chose to incorporate it into my sound making. I took the samples that I recorded into TidalCycles and got coding! Here are some of the recordings, along with variations on the code that created them.

setcps (50/60/4)

d1
$ sometimes (fast 2)
$ whenmod 8 6 (# speed 0.5)
$ slow "4 2? 1"
$ sometimes (# accelerate "-0.05 0 0.02")
$ loopAt "1 0.25?"
$ stutWith 4 (1/8) (# speed 1.25)
$ sound "bowl*<1.5 2 1> blinds*<1 2>"
# n (irand 3)

d2
$ sometimes (fast 1.35)
$ striate "2 4 8"
$ stutWith "8 2 1" (1/16) (# speed (irand 3-1))
$ sound "droplet*4"

d3
$ every 7 (# speed "0.5")
$ slow 4
$ sometimes (striate "8")
$ stutWith 8 (1/8) (soak 4 (|+ speed 0.15))
$ juxBy (slow 3 $ sine) ((# speed 2) . (# accelerate "-1"))
$ sound "stackingplates*2 [whack(3,9)]"
# n "1 2"
# pan (perlin)

d4
$ hurry "2 1 4 8"
$ sound "whack*4"

Although not the same as the drone soundscapes that Rodell Warner creates, I thought these recordings provided a lot of texture and would work well as an accompaniment to a drone soundscape. For that I loaded up Ardour and the Helm synthesiser.

The process of making and putting together all of these separate parts was in no way linear. The tutorials I followed all recommended writing a script or having a plan and I certainly didn’t have either. For this exploratory stage of my journey into film making I think that was mostly ok but for anything in the future I would at least consider what kind of atmosphere, emotions, or general message I wanted to convey.

The actual editing process was a big chore. Open source video editing software on Linux still leaves a lot to be desired. Despite there being a number of video editors available, nearly all of them have one failing in common: instability. With just a few HD resolution clips and no effects or transitions I experienced a lot of stuttering during seeking and playback, and crashes when rendering. This of course caused a lot of frustration and definitely resulted in me spending less time editing than I would have liked. For recent videos I’ve used Olive, which has worked really well – seeking on the timeline is fast and there are few crashes – but at the time of editing version 0.2 was still too unstable to be usable.

After that last hurdle I feel I have produced a film that demonstrates a lot of what I’ve learnt.

The film, titled Windows Explorer, represents my desire to be out in the world again. Like pretty much everyone my world has shrunk and my engagement with the world comes from looking out of and into various windows, whether that be out of my office window or into a Zoom, Skype, Teams, Jitsi or whatever window.

With Thanks

This residency was certainly a big learning experience. In a conversation with the curators at the gallery I expressed concern that I wasn’t making enough, or that everything I was making was, well, crap in comparison to the digital art portfolio I’ve built up over the last decade. They reassured me that I was trying something new and so couldn’t be expected to be immediately great at it. Even if I were in a situation where I had access to a team and equipment, a month isn’t really a long time to fully learn a new skill and make a complete piece of work using it. This really helped to put into context that this residency was time for me to reflect on my practice and to learn at my own pace.

From this residency I feel a lot more prepared to make narrative film, even if it’s a 1-minute film. I’ve already upgraded my equipment in preparation for future projects and have more knowledge of the multi-level process that goes into making a film.

Many thanks to The New Art Gallery Walsall for this opportunity 🙂