Nodes

Nodes is a new commission for the Peer to Peer: UK/HK online festival, which ran from 11th – 14th November, created as a reflection on the interconnectedness of the global live coding community.

Live coding is a performative practice where artists make music and visual art live using programming. This happens primarily at events such as Algoraves, but there is an equally active online community which organises regular performances, conferences, workshops and more.

Moving beyond e-mail and social media platforms, people within the community have built their own tools which allow for real-time communication and collaboration across borders and time zones. In this way the local nodes of the global live coding community are able to stay connected.

Many thanks to Dr Charlotte Frost from Furtherfield for the nomination. Nodes was commissioned on the occasion of Peer to Peer: UK/HK online Festival 2020 by Centre for Chinese Contemporary Art, Open Eye Gallery and University of Salford Art Collection.

Making Pulse

On November 6th the Compassion Through Algorithms Vol. II compilation was released, raising money for Young Minds Together. The compilation is still available, and of course you can donate directly to Young Minds Together if you prefer.

In this blog post I’ll be going over how I made my track, Pulse.

I’m two years into making music and I’ve recently become more comfortable and confident in my processes. I’ve gotten over the technological hurdles and, having experimented in making music/sounds of different styles both in private and at Algoraves, I feel I’ve found a range of styles that I like making music in. In the live coding music world some of my biggest influences have been eye measure, Miri Kat, Yaxu, and Heavy Lifting. Their work spans many genres but what I’m drawn to in their music is the more sparse, ambient and even sometimes aggressive sounds. I tried to keep this in mind when making Pulse.

As with most things I make, I started first by just experimenting. I can’t fully remember my thought process but at some point I landed on turning a kick drum sound (“bd” in Tidal) from a percussive to a pitched instrument. I achieved this by triggering the sample many times in quick succession and playing with the speed at which it was played back.

setcps (135/60/4)

d1 
$ sound "bd*4"

d7
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

I liked the piercing, buzzing nature of the sound and so decided to focus on building the track around it. Next I had to get the tempo right. By default Tidal runs at 135 bpm (0.5625 cps). Running that code at 135 bpm felt way too fast, so I tried bringing it down to 99 bpm.

It’s no longer at a speed to dance to but makes for better listening. It also meant I could more accurately identify what note the buzzing sound was at. The loopAt command affects the pitch of the samples and is itself affected by the tempo that Tidal is running at, so setting it to 99 bpm (setcps (99/60/4)) revealed that the buzzing sound was at a G-sharp. It’s probably still a little bit out of tune but it’s close enough!
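The arithmetic behind those numbers is simple to sketch. Below is a small Python illustration (mine, not from the original post) of the bpm-to-cps conversion setcps expects with four beats per cycle, plus a rough estimate of how far slowing from 135 to 99 bpm detunes a sample whose playback speed is locked to the tempo, as it is with loopAt:

```python
import math

def bpm_to_cps(bpm: float, beats_per_cycle: int = 4) -> float:
    """Convert beats per minute to Tidal's cycles per second."""
    return bpm / 60 / beats_per_cycle

def semitone_shift(old_bpm: float, new_bpm: float) -> float:
    """Approximate pitch shift (in semitones) of a tempo-locked sample
    when the tempo changes, assuming playback speed scales with tempo."""
    return 12 * math.log2(new_bpm / old_bpm)

print(bpm_to_cps(135))          # 0.5625, Tidal's default tempo
print(bpm_to_cps(99))           # 0.4125
print(semitone_shift(135, 99))  # roughly -5.4 semitones lower
```

The drop from 135 to 99 bpm works out at roughly 5.4 semitones down, which is why it was worth re-checking what note the buzz landed on after the tempo change.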

In late August I bought + was given the Volca Bass and the Volca FM synths. By this time I had been using bass samples in this track but saw this as an opportunity to give these newly acquired synths a try! The Tidal website has instructions on setting up MIDI, which worked well. One issue was that I was using two of the same USB-to-MIDI adaptors. On the surface this isn’t an issue but, at least according to the Tidal MIDI instructions, when adding a MIDI device you do so by name and not by any sort of unique ID. Running MIDIClient.init with both adaptors connected gave me this:

MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")
MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")

I didn’t know which of the two adaptors Tidal was going to send MIDI messages to, and so had no idea which synth would be triggered! Fortunately Alex McLean was on hand to provide a (Linux-specific) solution. The dummy Midi Through Port-0 port exists by default, so Alex suggested adding another one. I’ll quote Alex from the TOPLAP chat:

if you add options snd-seq-dummy ports=2 (or more) to /etc/modprobe.d/alsa-base.conf
you’ll get two of them
the other being
Midi Through Port-1
obvs
then you can tell supercollider/superdirt to connect to them
then go into qjackctl and the alsa tab under ‘connect’ to connect from the midi through ports to the hardware ports you want
then you can make them connect automatically with the qjackctl patchbay or session thingie
I like doing it this way because it means I can just start supercollider+superdirt then change round which midi device I’m using super easily.. plugging/unplugging without having to restart superdirt
I don’t know if this will solve the problem of having two devices with the same name but hopefully..
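Condensed into config form, Alex’s suggestion amounts to a single line in the ALSA module options (the file path comes from the quote above; the aconnect check afterwards is my own addition for verifying the result):

```shell
# /etc/modprobe.d/alsa-base.conf
# create two dummy "Midi Through" ports instead of the default one
options snd-seq-dummy ports=2
```

After reloading the snd-seq-dummy module (or rebooting), running `aconnect -l` should list both Midi Through Port-0 and Midi Through Port-1, ready to be wired to the hardware ports in qjackctl.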

With that all fixed I recorded my track! Here’s a live recording of me, um, recording it. It’s made using Tidal; the code is just on a screen out of shot.

As you may have noticed there’s some latency on the Volca Bass. I should have adjusted the latency in JACK to account for this, but at the time I didn’t realise that I could, or how. However, I was recording the Volca Bass and FM onto separate tracks in Ardour, so I was able to compensate for the latency afterwards.

On reflection I should have recorded each orbit (d1, d2 etc) onto a separate track. At the time I didn’t realise I could do this, but it’s pretty simple, with clear instructions located on the Tidal website, and there are friendly people on the TOPLAP chat who helped me. This would have allowed me to do additional mixing once it was recorded (my Tidal stuff is typically way too loud). Aside from those observations I’m really happy with how it sounds! I’ve shared my code below, which may be useful to study, but of course you’ll need Volcas/MIDI devices to fully reproduce it.
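For anyone curious how per-orbit recording is wired up: SuperDirt can route each orbit to its own pair of output channels, which JACK-aware software like Ardour can then capture on separate tracks. Here is a rough SuperCollider sketch based on SuperDirt’s startup code, where the channel counts and port number are my assumptions rather than anything from this post:

```supercollider
// Ask the server for enough output channels before booting
s.options.numOutputBusChannels = 8;

s.waitForBoot {
    // 2 channels per orbit
    ~dirt = SuperDirt(2, s);
    // Start SuperDirt on the usual port, mapping four orbits (d1–d4)
    // to separate stereo pairs: outputs 0–1, 2–3, 4–5 and 6–7
    ~dirt.start(57120, [0, 2, 4, 6]);
};
```

Each stereo pair then shows up in qjackctl as its own SuperCollider output, ready to be routed to a separate Ardour track.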

setcps (99/60/4)

d1 -- volca fm
$ off 0.25 ((fast "2") . (|+ note "12 7"))
$ note "gs4'maj'7 ~"
# s "midi1"

d6
$ stack [
sound "kick:12(5,8) kick:12(3,<8 4>)",
sound "sd:2",
stutWith 2 (1/8) ((fast 2) . (# gain 0.75)) $ sound "hh9*4",
sound "bd*16" # speed 2 # vowel "i"
]

d4 -- volca bass
$ fast 2
$ stutWith 2 (1/4) ((|+ note "24") . (slow 2))
$ note "~ ~ ~ gs2*2"
# s "midi2"

d7
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

d2 -- transpose volca fm
$ segment 32
$ ccv 50
$ ccv (range 10 (irand 10+60) $ slow "8 3 7 3 1" $ sine )
# ccn "40"
# s "midi1"

If you enjoyed my track or any of the others on the compilation, please consider buying it or making a donation to Young Minds Together to help the fight against racial injustice.

Rules of Engagement – 10th November

Happy to announce that I’m curating a new programme called Rules of Engagement for the Open Data Institute’s annual Summit on November 10th. The programme features new commissions from Nick Briz, A.M. Darke and Everest Pipkin. By seeing people as more than just data points, Rules of Engagement asks those with power to reimagine how we engage with data, advocating for an ethical data future for everyone.

The Open Data Institute (ODI) arts programme Data as Culture harnesses the critical and unexpected voices of artists in response to ODI’s research. The current research and development programme looks at sustainable data access and building trust through certification, and creating data infrastructure for common challenges.

Rules of Engagement is curated by guest curator Antonio Roberts, who was inspired by the numerous scandals involving data towards the end of the 2010s. The artists’ work will be integrated throughout the ODI Summit 2020 – Data | Futures and online.

Commissioned artists Nick Briz, A.M. Darke and Everest Pipkin interrogate the systems that have allowed unethical use of data. Through their work, the artists ask important questions that all of us should be considering: why might there be mistrust in current data practices, should data collection even be happening in the first place, and who are the people or communities impacted by data misuse?

The artists have taken a very open approach: exposing ‘black-box’ AI systems to show what technology says about us, and challenging both people who work with data and those who are subjects of systems that use data to reflect on their own biases, which may influence how data is used and collected.

Nick Briz – howthey.watch/you

Nick Briz’s commission, howthey.watch/you exposes the tracking technology built into our everyday experience of internet browsing. In this online work, the artist discusses this technology and asks important questions about its uses beyond fingerprinting and, ultimately, tracking.

A.M. Darke – ODI R&D Artist in Residence

As Research & Development artist-in-residence, A.M. Darke is researching a new work which will confront us with the biases and prejudices embedded into algorithmic systems which govern everything from credit ratings to criminal convictions. The artist is seeking to create a system imbued with their own biases, to expose how algorithms are extensions of their programmers. They want to reveal the uncomfortable truths surrounding algorithms’ far-reaching consequences, particularly for people from marginalised communities. During the Summit, they will take part in an in-conversation with curator Antonio Roberts discussing the challenges of creating such work while consistently working within a data ethics framework themself.

Everest Pipkin – Shell Song

Everest Pipkin’s Shell Song is an interactive audio narrative game about corporate deep-fake voice technologies and the datasets that go into their construction. The game explores physical and digital bodies and voices, asking what a voice is worth, who can own a human sound, and how it feels to come face to face with a ghost of your body that may yet come to outlive you.

All of the commissions and residency details can be found on the ODI’s Data as Culture website.

All of the commissions and the residency will launch at the Summit on 10th November and will then be available to the public from 11th November. Check back here on 11th November or follow me on Twitter/Instagram for links to the artworks.

Thanks to Hannah Redler-Hawes and the ODI for the invitation to curate this programme, I’m really happy with the artworks!

Compassion Through Algorithms Vol. II

I have a new track coming out on November 6th as part of the Compassion Through Algorithms Vol. II compilation, which is raising funds for Young Minds Together.

We’re a group of people from England’s North (from Birmingham up) making music and art from algorithms, shared here in solidarity with the Black Lives Matter movement.

We join calls for justice for George Floyd and Breonna Taylor, but also reflect on the situation here in the UK, including the lack of justice for Stephen Lawrence, for Christopher Alder, for the people lost in the New Cross and Grenfell fires, for the Windrush deportees and all suffering under our government’s ‘hostile environment’ policy.

We want educational reform, so that the next generation can open their eyes to Black British history. Stating that ‘Black Lives Matter’ should not be difficult, but right now it’s not enough to be non-racist. We need to be anti-racist.

We share this compilation on a ‘pay as you feel’ basis, but please give generously if you can. All proceeds will go to Young Minds Together, a group of Black girls making music and dance in Rotherham UK, in need of your help to rebuild post-pandemic.

The compilation features tracks from 65daysofstatic, TYPE, Michael-Jon Mizra, Anna Xambo, Yaxu, Shelly Knotts, 0001, Antonio Roberts (that’s meee), Leafcutter John, and features awesome artwork from Rosa Francesca. November 6th is Bandcamp Friday, so if you buy it then Bandcamp will waive their fees and so more funds can be donated. Of course, you can always donate to Young Minds Together directly.

Black Lives Matter.

Peer to Peer: UK/HK – 11th – 14th November 2020

From 11th – 14th November I’ll be presenting new commissioned work as part of the Peer to Peer: UK/HK programme.

Peer to Peer: UK/HK is a digital programme and platform encouraging meaningful cultural exchange and forging enduring partnerships between the UK and Hong Kong’s visual arts sectors.

The programme launches with an online festival of international exchange and collaboration taking place 11-14 November.

The Festival will include an online exhibition of digital artworks from UK and Hong Kong based artists, including 5 new commissions by artists nominated by UK and Hong Kong based partners. There will also be a series of digital residencies taking place across partner organisations’ social media channels as well as a set of curated panel discussions.

The Festival is led by Ying Kwok (Festival Director and independent curator, HK) with Lindsay Taylor (University of Salford Art Collection), Open Eye Gallery and Centre for Chinese Contemporary Art (CFCCA), supported by a project team.

In the spirit of exchange and collaboration the Festival is piloting a distributed leadership model, involving co-curation and co-production with partner organisations.

The project has been generously supported by funding from Arts Council England and the GREAT campaign.

I’m one of the five commissioned artists, alongside Danielle Brathwaite-Shirley, Hetain Patel, Lee Kai Chung and Sharon Lee Cheuk Wan. My commission will be a live coded audio/visual work which will then enter the University of Salford Art Collection as a permanent legacy of the project. Many thanks to Charlotte Frost from Furtherfield for the nomination!

Coder Beatz

Happy to be working with Birmingham Open Media to deliver Coder Beatz, a creative digital programme focusing on live coding for young black kids in the West Midlands.

Coder Beatz is a new creative digital programme for young black kids aged 11 to 15. We are running four monthly Coder Beatz workshops between November 2020 and February 2021. In each session we will be teaching kids how to create digital music and visuals using live coding and algorithms. The sessions will be delivered by Antonio Roberts, a renowned digital artist and expert coder. As a man of colour, Antonio is really passionate about inspiring young black kids to get skilled up on coding music and visuals.

Kids will not need any music or tech experience, and we will provide laptops and headphones for them at BOM’s art centre.

Over four sessions I’ll be teaching how to use TidalCycles for making music and Improviz for making visuals. All of the details, including sign up details, can be found by contacting Birmingham Open Media.

On a personal level I’m really happy to be delivering this programme because during the six-ish years I’ve been live coding at Algoraves I’ve noticed that the scene is very good at addressing gender inequalities but, at least in the UK scene, it’s still very white (which could probably be said of electronic music more generally).

Through delivering the programme I hope to demonstrate the creative possibilities of programming and, while I don’t expect those who take part to become fully fledged Algoravers, I do hope it encourages them to explore ways of making digital music and art beyond the “standard” ways of using tools like Ableton and Adobe software.

I also recognise that there are other issues that need to be addressed to make live coding more diverse. For example, encouraging more black people to build live coding tools, recognising and celebrating the impact black culture has had on digital art/music… And I hope this is part of that process.

Please get in touch with BOM if you’re interested or know anyone who would be great for this!

The Stay at Home Residency – part 3

From 1st – 29th July I was happy to be selected as an artist in residence for The New Art Gallery Walsall’s Stay at Home Residencies.

In the second blog post I looked at how I approached filming. In this third and final blog post I’ll be detailing my sound making process and sharing the finished film.

The next stage in making this film was working on the sound. As you can hear in a couple of the clips in the previous blog post, the area that I live in is really, really quiet! Everyone in the local area was using the summer to sit outside bathing in the sunlight. It was very relaxing for sure, but recordings of the ambient background noise didn’t make for an interesting soundtrack. There was once the sound of a wood chipper, but otherwise it was mostly silent. At times me playing music was the loudest sound!

Instead I took to making recordings from within the home. This process made me very aware of the variety, and at times lack thereof, of sounds in my home environment. There’s lots of shuffling, tapping, television and dampened thud sounds. With the exception of the television, the place with the most variety of sounds is most definitely the kitchen, and so most sounds I used came from there. There are sounds of glass, metal, wood and water, and even from inside the fridge!

If you’ve been following any of my work for a while you’ll know that I’ve done a lot of live coding performances over the last two years. I like the liveness of performing this way and so chose to incorporate it into my sound making process. I took the samples that I recorded into TidalCycles and got coding! Here are some of the recordings along with variations on the code that created them.

setcps(50/60/4)

d1
$ sometimes (fast 2)
$ whenmod 8 6 (# speed 0.5)
$ slow "4 2? 1"
$ sometimes (# accelerate "-0.05 0 0.02")
$ loopAt "1 0.25?"
$ stutWith 4 (1/8) (# speed 1.25)
$ sound "bowl*<1.5 2 1> blinds*<1 2>"
# n (irand 3)

d2
$ sometimes (fast 1.35)
$ striate "2 4 8"
$ stutWith "8 2 1" (1/16) (# speed (irand 3-1))
$ sound "droplet*4"

d3
$ every 7 (# speed "0.5")
$ slow 4
$ sometimes (striate "8")
$ stutWith 8 (1/8) (soak 4 (|+ speed 0.15))
$ juxBy (slow 3 $ sine) ((# speed 2) . (# accelerate "-1"))
$ sound "stackingplates*2 [whack(3,9)]"
# n "1 2"
# pan (perlin)

d4
$ hurry "2 1 4 8"
$ sound "whack*4"

Although not the same as the drone soundscapes that Rodell Warner creates, I thought these provided a lot of texture and would work well as an accompaniment to a drone soundscape of my own. For that I loaded up Ardour and the Helm synthesiser.

The process of making and putting together all of these separate parts was in no way linear. The tutorials I followed all recommended writing a script or having a plan and I certainly didn’t have either. For this exploratory stage of my journey into film making I think that was mostly ok but for anything in the future I would at least consider what kind of atmosphere, emotions, or general message I wanted to convey.

The actual editing process was a big chore. Open source video editing software on Linux still leaves a lot to be desired. Despite there being a number of video editors available, nearly all of them share one failing: a lack of stability. With just a few HD resolution clips and no effects or transitions I was experiencing a lot of stuttering during seeking and playback, and crashes when rendering. This, of course, caused a lot of frustration and definitely resulted in me spending less time editing than I would have liked. For recent videos I’ve used Olive, which has worked really well – seeking on the timeline is fast and there are few crashes – but at the time of editing version 0.2 was still too unstable to be usable.

After that last hurdle I feel I have produced a film that demonstrates a lot of what I’ve learnt.

The film, titled Windows Explorer, represents my desire to be out in the world again. Like pretty much everyone my world has shrunk and my engagement with the world comes from looking out of and into various windows, whether that be out of my office window or into a Zoom, Skype, Teams, Jitsi or whatever window.

With Thanks

This residency was certainly a big learning experience. In a conversation with the curators at the gallery I expressed concern that I wasn’t making enough, or that everything that I was making was, well, crap in comparison to the digital art portfolio that I’ve built up over the last decade. They reassured me that I was trying something new and so couldn’t be expected to be immediately great at it. Even if I were in a situation where I had access to a team and equipment, a month isn’t really a long time to fully learn a new skill and make a complete piece of work using that skill. This really helped to put into context that this residency was time for me to reflect on my practice and to learn at my own pace.

From this residency I feel a lot more prepared to make narrative film, even if it’s a 1-minute film. I’ve already upgraded my equipment in preparation for future projects and have more knowledge of the multi-level process that goes into making a film.

Many thanks to The New Art Gallery Walsall for this opportunity 🙂

The Stay at Home Residency – part 2

From 1st – 29th July I was happy to be selected as an artist in residence for The New Art Gallery Walsall’s Stay at Home Residencies.

In the first blog post I looked at my influences and research carried out before I started making work. In this second blog post I’ll be showing some of the filming I did.

With the research conducted and panic now over I started filming again. I began by filming various things in my home. I tried to focus on shots that would have some movement in them, even if it were only background movement. Because of this most of my shots look out of a window. Although the background is blurred whatever movement there is – be it the trees, people, or lights turning on/off – makes the still shot that little bit more interesting.

Next, I decided to bring out my projector and see what I could do with it. By now my projector is at least seven years old (I originally purchased it for a BYOB event in 2013), so not only is the projection quality quite poor, there are glitchy lines running through the projection.

I had thought about making animations to project onto various objects, but I didn’t want to turn this into an animation project. I’ve long used my Glass video when experimenting with projections and I liked how it made any surface it landed on just way more interesting. To replicate this saturation of glitchy colour and movement I installed a copy of waaave_pool onto a Raspberry Pi, connected a webcam to it and pointed the webcam at random surfaces in the room.

waaave_pool itself is a bit like a video synthesiser, working primarily with webcam video input. With that installed I made some things like this:

I liked these projection experiments most when they were really subtle. I didn’t want the projection to overpower the surface and render it invisible or irrelevant. For example, in one experiment I projected onto cushions; the projections looked really great, but the cushions got lost behind them.

I also played with a strip of LED lights I had from a previous project. They can be programmed to flash quickly, but they seemed to work best when pulsating slowly, which very much matched the pace of the shots I had filmed so far.

In the next blog post I’ll be detailing how I made sounds for the film and sharing the finished film.

The Stay at Home Residency – part 1

From 1st – 29th July I was happy to be selected as an artist in residence for The New Art Gallery Walsall’s Stay at Home Residencies.

The New Art Gallery Walsall has adapted its Studio residency programme in the wake of the Coronavirus pandemic to support three artists based in the West Midlands to produce work from their homes between May and July this year.

Following an open-call to artists based in the West Midlands, the Gallery received 60 varied proposals from a diverse range of artists working across the region. The many challenges that artists are facing during lockdown were well articulated. In selecting, we were keen to see opportunities for artistic and professional development during these challenging times, to support creative approaches to practice amid imposed restrictions and to explore the benefits and possibilities of sharing with an online audience.

It’s been some months since the residency ended and I really learned a lot. In this three-part blog post series I’ll be talking a bit about the month of learning and creating, the struggles I had, what I did to overcome them, and some of my thoughts on the final outcome. In this first blog post I’ll be going over my research and influences.

My reason for doing the residency was to explore ways of making work without a computer. Quoting from my application:

Creating my digital video works is a very stationary process, requiring me to spend long hours sat in my home office at my desk on the computer. I have long had a desire to step away from the desk and learn film and sound production techniques. I already own much of the required equipment including a DSLR camera, microphone and tripod. I have mainly used these to document events or exhibitions.

This residency would grant me the opportunity to step into learning film production techniques. I will study available materials (digital books and tutorial videos) and implement what I learn when creating the films.

Looking back over the last 10 years of my practice I have noticed that most of my work has been computer generated videos and animation.

Loud Tate: Code

Most of these works are generative and, much like animated gifs, they don’t have an extensive narrative and are best viewed on repeat. This isn’t a downside to the works, but making something with a narrative using filmed footage was definitely of interest to me for this residency.

I began the residency exploring the technical processes involved in film making. I have used cameras for a long time but often don’t explore their full capabilities. I usually just leave the settings on Auto and most of the time it works out fine! The same goes for lenses. The camera I owned at the time of the residency was an Olympus Pen F, together with 45mm and 17mm lenses. I only ever really understood that the former is good for portraits and the latter for landscapes/outdoors, but still didn’t understand why.

I wanted to understand this and more so spent a lot of time watching videos and reading tutorials. Two really interesting videos were The Changing Shape of Cinema: The History of Aspect Ratio and The Properties of Camera Lenses from Filmmaker IQ.

These two videos, and the many others I watched late one evening, went into far more detail than I needed about film, the history of cinema, and equipment. I also didn’t own 99% of the equipment and resources the videos mention, but it was really interesting to know how all those things go into making a film and achieving a certain cinematic look.

The next set of videos that was really insightful was the Crash Course Film Production series. The Filmmaker IQ videos focused on specific details of film making, whereas these were perhaps more relevant to me, as they were produced from the viewpoint of someone with no knowledge wanting to know what goes into making a film. The third video, The Filmmaker’s Army, is especially enlightening as it explains a lot of the roles in a film production and how they work together to make a finished film.

One of the main things I took from watching this series of videos is that there is a lot of planning that goes into a film. Depending on the scale of the project the time between writing a script and filming can be years! And when on a film set a lot of the roles are there to ensure each person is doing the correct things at the right time.

Although all of this was really exciting and inspiring to learn at the beginning of the residency there was one big problem: Almost all of it would not be applicable to me at this time. Quoting my application:

Using tools and materials I have in my home – which include programmable lights, a projector, screens, and other electronics – I want to create a series of short abstract films that explore the use of digital art, light, and projection to illuminate my home and immediate surroundings. The everyday objects in the home, the grass outside, the brickwork and more will act as both creative material and canvas for abstract projections.

I was strict in my desire to create a film using only what was within the home. This meant that I couldn’t acquire stage lights, microphones or other equipment. I had to use whatever I had in whatever filming conditions I was given. Still, these restrictions could hopefully provide inspiration.

Early on I struggled to make anything interesting. I filmed whatever I could find in my home but it was all very static and at times boring. It was then that I realised that the domestic environment, especially during lockdown, is a pretty boring place! In my household there are only two people and the environment doesn’t change that much. It’s not like the outdoors, where the environment changes, or like a gallery space, which can be reconfigured and has access to lots of equipment. In short, everything is just static. I was very worried that whatever I made would be very boring to watch.

I started to look to other films and artists for inspiration. I was browsing Mubi one day and saw a movie called Villa Empain by Katharina Kastner. I had no idea what it was about at the time but it was short and gave me a distraction from the panicking!

It turned out to be exactly the kind of film I needed to see. To me it was a series of animated portraits of the Villa Empain building. A lot of the shots in the film were static, featuring minimal movement from the pool water, trees, or sun shining through the stained glass windows. It was quite a meditative film. It helped to show me that a film didn’t need to be action packed to be interesting.

I also remembered the work of Rodell Warner (having first seen his work in 2019 at bcc:). In his Augmented Archive series he takes an archive picture, adds a drone soundtrack to it and animates it using a flickering effect (plus his own 3D sculptures). Of course there is a much deeper concept than my very technical description (and you should see more of his work to understand), but it showed me that there are ways to add depth and movement to static imagery.

In the next blog post I’ll be detailing the process of filming shots.