Nodes

Nodes is a new commission for the Peer to Peer: UK/HK online festival, which ran from 11th – 14th November, created as a reflection on the interconnectedness of the global live coding community.

Live coding is a performative practice where artists make music and visual art live using programming. This happens primarily at events such as Algoraves, but there is an equally active online community which organises regular performances, conferences, workshops and more.

Moving beyond e-mail and social media platforms, people within the community have built their own tools which allow for real-time communication and collaboration across borders and time zones. In this way the local nodes of the global live coding community are able to stay connected.

Many thanks to Dr Charlotte Frost from Furtherfield for the nomination. Nodes was commissioned on the occasion of Peer to Peer: UK/HK online Festival 2020 by Centre for Chinese Contemporary Art, Open Eye Gallery and University of Salford Art Collection.

Making Pulse

On November 6th the Compassion Through Algorithms Vol. II compilation was released, raising money for Young Minds Together. The compilation is still available, and of course you can donate directly to Young Minds Together if you prefer.

In this blog post I’ll be going over how I made my track, Pulse.

I’m two years into making music and I’ve recently become more comfortable and confident in my processes. I’ve gotten over the technological hurdles and, having experimented with making music and sounds in different styles both in private and at Algoraves, I feel I’ve found a range of styles I like working in. In the live coding music world some of my biggest influences have been eye measure, Miri Kat, Yaxu, and Heavy Lifting. Their work spans many genres, but what I’m drawn to in their music is the more sparse, ambient and sometimes even aggressive sounds. I tried to keep this in mind when making Pulse.

As with most things I make, I started by just experimenting. I can’t fully remember my thought process but at some point I landed on turning a kick drum sound (“bd” in Tidal) from a percussive instrument into a pitched one. I achieved this by triggering the sample many times in quick succession and playing with the speed at which it was played back.

setcps (135/60/4)

d1 
$ sound "bd*4"

d7
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

I like the piercing buzzing nature of the sound and so decided to focus on building the track around this. Next I had to get the tempo right. By default Tidal runs at 135 bpm (0.5625 cps). Running that code at 135 bpm felt way too fast and so I tried bringing it down to 99 bpm.

It’s no longer at a speed to dance to but makes for better listening. It also meant I could more accurately identify what note the buzzing sound was at. The loopAt command affects the pitch of the samples and is itself affected by the tempo that Tidal is running at, so setting it to 99 bpm (setcps (99/60/4)) revealed that the buzzing sound was at a G-sharp. It’s probably still a little bit out of tune but it’s close enough!
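
For reference, setcps expects cycles per second rather than bpm, so with four beats to a cycle the conversion is bpm divided by 60 and then by 4:

setcps (135/60/4) -- Tidal's default, 0.5625 cps
setcps (99/60/4) -- 99 bpm, 0.4125 cps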

In late August I bought (and was given) the Volca Bass and Volca FM synths. By this time I had been using bass samples in this track, but saw this as an opportunity to give these newly acquired synths a try! The Tidal website has instructions on setting up MIDI, which worked well. One issue was that I was using two of the same USB-to-MIDI adaptors. On the surface this isn’t an issue but, at least according to the Tidal MIDI instructions, a MIDI device is added by name and not by any sort of unique ID. Running MIDIClient.init with both adaptors connected gave me this:

MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")
MIDIEndPoint("USB MIDI Interface", "USB MIDI Interface MIDI 1")

I didn’t know which of the two adaptors Tidal was going to send MIDI messages to, and so had no idea which synth would be triggered! Fortunately Alex McLean was on hand to provide a (Linux-specific) solution. The dummy Midi Through Port-0 port exists by default, and Alex suggested adding another one. I’ll quote Alex from the TOPLAP chat:

if you add options snd-seq-dummy ports=2 (or more) to /etc/modprobe.d/alsa-base.conf
you’ll get two of them
the other being
Midi Through Port-1
obvs
then you can tell supercollider/superdirt to connect to them
then go into qjackctl and the alsa tab under ‘connect’ to connect from the midi through ports to the hardware ports you want
then you can make them connect automatically with the qjackctl patchbay or session thingie
I like doing it this way because it means I can just start supercollider+superdirt then change round which midi device I’m using super easily.. plugging/unplugging without having to restart superdirt
I don’t know if this will solve the problem of having two devices with the same name but hopefully..
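
For anyone trying the same, the SuperDirt side of the setup looks roughly like this (a sketch adapted from the Tidal MIDI instructions; the device and port names are just examples following the Midi Through approach above and will differ on your system):

// run in SuperCollider once SuperDirt has started
MIDIClient.init; // prints the available MIDIEndPoints
~midiOut1 = MIDIOut.newByName("Midi Through", "Midi Through Port-0");
~midiOut2 = MIDIOut.newByName("Midi Through", "Midi Through Port-1");
~dirt.soundLibrary.addMIDI(\midi1, ~midiOut1); // addressed as s "midi1" in Tidal
~dirt.soundLibrary.addMIDI(\midi2, ~midiOut2); // addressed as s "midi2" in Tidal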

With that all fixed I recorded my track! Here’s a live recording of me, um, recording it. It was made using Tidal; the code is just on a screen out of shot.

As you may have noticed there’s some latency on the Volca Bass. I should have adjusted the latency in JACK to account for this, but at the time I didn’t realise that I could, or how to do it. However, I was recording the Volca Bass and FM onto separate tracks in Ardour, so I was able to compensate for the latency afterwards.

On reflection I should have recorded each orbit (d1, d2, etc.) into separate tracks. At the time I didn’t realise I could do this, but it’s pretty simple, with clear instructions on the Tidal website, and there are friendly people on the TOPLAP chat who helped me. This would have allowed me to do additional mixing once it was recorded (my Tidal stuff is typically way too loud). Aside from those observations I’m really happy with how it sounds! I’ve shared my code below, which may be useful to study, but of course you’ll need Volcas/MIDI devices to fully reproduce it.
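
First though, for anyone who wants to try recording orbits separately: the usual approach (a sketch based on the SuperDirt documentation rather than my exact setup) is to give each orbit its own pair of output channels when starting SuperDirt, then route each pair to its own Ardour track in JACK:

// in the SuperCollider startup file (example channel counts)
s.options.numOutputBusChannels = 12; // room for six stereo orbits
s.waitForBoot {
    ~dirt = SuperDirt(2, s);
    ~dirt.loadSoundFiles;
    // one stereo pair per orbit: d1 on outputs 1/2, d2 on 3/4, and so on
    ~dirt.start(57120, [0, 2, 4, 6, 8, 10]);
};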

setcps (99/60/4)

d1 -- volca fm
$ off 0.25 ((fast "2") . (|+ note "12 7"))
$ note "gs4'maj'7 ~"
# s "midi1"

d6 -- drums
$ stack [
sound "kick:12(5,8) kick:12(3,<8 4>)",
sound "sd:2",
stutWith 2 (1/8) ((fast 2) . (# gain 0.75)) $ sound "hh9*4",
sound "bd*16" # speed 2 # vowel "i"
]

d4 -- volca bass
$ fast 2
$ stutWith 2 (1/4) ((|+ note "24") . (slow 2))
$ note "~ ~ ~ gs2*2"
# s "midi2"

d7 -- pitched kick buzz
$ loopAt "0.75 0.25"
$ chunk 4 (slow "2 4")
$ sound "bd*<32 16> kick*32"
# speed (fast "0.25 1" $ range 4 "<0.25 0.75 0.5>" $ saw)
# accelerate "3 2 0"
# gain 0.75
# pan (slow 1.2 $ sine)

d2 -- transpose volca fm
$ segment 32
$ ccv (range 10 (irand 10+60) $ slow "8 3 7 3 1" $ sine)
# ccn "40"
# s "midi1"

If you enjoyed my track or any of the others on the compilation, please consider buying it or making a donation to Young Minds Together to help the fight against racial injustice.

Rules of Engagement – 10th November

Happy to announce that I’m curating a new programme called Rules of Engagement for the Open Data Institute’s annual Summit on November 10th. The programme features new commissions from Nick Briz, A.M. Darke and Everest Pipkin. By seeing people as more than just data points, Rules of Engagement asks those with power to reimagine how we engage with data, advocating for an ethical data future for everyone.

The Open Data Institute (ODI) arts programme Data as Culture harnesses the critical and unexpected voices of artists in response to the ODI’s research. The current research and development programme looks at sustainable data access and building trust through certification, as well as creating data infrastructure for common challenges.

Rules of Engagement is curated by guest curator Antonio Roberts, who was inspired by the numerous scandals involving data towards the end of the 2010s. The artists’ work will be integrated throughout the ODI Summit 2020 – Data | Futures and online.

Commissioned artists Nick Briz, A.M. Darke and Everest Pipkin interrogate the systems that have allowed unethical use of data. Through their work, the artists ask important questions that all of us should be considering: why is there mistrust in current data practices, should data be collected in the first place, and who are the people or communities impacted by data misuse?

The artists have taken a very open approach: exposing ‘black-box’ AI systems to show what technology says about us, and challenging both people who work with data and those who are subjects of systems that use data to reflect on their own biases, which may influence how data is used and collected.

Nick Briz – howthey.watch/you

Nick Briz’s commission, howthey.watch/you, exposes the tracking technology built into our everyday experience of internet browsing. In this online work, the artist discusses this technology and asks important questions about its uses beyond fingerprinting and, ultimately, tracking.

A.M. Darke – ODI R&D Artist in Residence

As Research & Development artist-in-residence, A.M. Darke is researching a new work which will confront us with the biases and prejudices embedded in algorithmic systems that govern everything from credit ratings to criminal convictions. The artist is seeking to create a system imbued with their own biases, to expose how algorithms are extensions of their programmers. They want to reveal the uncomfortable truths surrounding algorithms’ far-reaching consequences, particularly for people from marginalised communities. During the Summit, they will take part in an in-conversation with curator Antonio Roberts, discussing the challenges of creating such work while consistently working within a data ethics framework themself.

Everest Pipkin – Shell Song

Everest Pipkin’s Shell Song is an interactive audio narrative game about corporate deep-fake voice technologies and the datasets that go into their construction. The game explores physical and digital bodies and voices, asking what a voice is worth, who can own a human sound, and how it feels to come face to face with a ghost of your body that may yet come to outlive you.

All of the commissions and residency details can be found on the ODI’s Data as Culture website.

All of the commissions and residency will launch at the Summit on 10th November and will then be available to the public by 11th November. Check back here on 11th November or follow me on Twitter/Instagram for links to the artworks.

Thanks to Hannah Redler-Hawes and the ODI for the invitation to curate this programme, I’m really happy with the artworks!

Compassion Through Algorithms Vol. II

I have a new track coming out on November 6th as part of the Compassion Through Algorithms Vol. II compilation, which is raising funds for Young Minds Together.

We’re a group of people from England’s North (from Birmingham up) making music and art from algorithms, shared here in solidarity with the Black Lives Matter movement.

We join calls for justice for George Floyd and Breonna Taylor, but also reflect on the situation here in the UK, including the lack of justice for Stephen Lawrence, for Christopher Alder, for the people lost in the New Cross and Grenfell fires, for the Windrush deportees and all suffering under our government’s ‘hostile environment’ policy.

We want educational reform, so that the next generation can open their eyes to Black British history. Stating that ‘Black Lives Matter’ should not be difficult, but right now it’s not enough to be non-racist. We need to be anti-racist.

We share this compilation on a ‘pay as you feel’ basis, but please give generously if you can. All proceeds will go to Young Minds Together, a group of Black girls making music and dance in Rotherham UK, in need of your help to rebuild post-pandemic.

The compilation features tracks from 65daysofstatic, TYPE, Michael-Jon Mizra, Anna Xambo, Yaxu, Shelly Knotts, 0001, Antonio Roberts (that’s meee) and Leafcutter John, plus awesome artwork from Rosa Francesca. November 6th is Bandcamp Friday, so if you buy it then, Bandcamp will waive their fees and more of the money can be donated. Of course, you can always donate to Young Minds Together directly.

Black Lives Matter.