Thank you for joining the past Hydra meetups, and we are excited to announce the 5th edition!
Presentations
hellocatfood
Melanie Wilson
Jamie Faye Fenton
In 2020 I ran quite a number of workshops on using the Improviz live coding environment for visuals. Improviz can be thought of as a fork of Livecodelab, especially as its developer, Guy John, is one of the developers of Livecodelab. However, it has some key differences that make it stand out as its own unique software.
Being able to load your own textures might in itself be a reason for many people to switch from Livecodelab to Improviz. Things can feel just that bit more personalised when you’re using your own images and objects rather than only colours, gradients and basic geometric shapes. Another potentially useful difference is that you can interface with Improviz using Open Sound Control (OSC), which opens up the possibility of controlling it from other software or from external hardware devices. In this blog post I’ll take you through how you can connect a midi controller to Improviz via OSC and Pure Data.
To get started you first need to define a variable in Improviz that you want to be changed by OSC/midi. The name of this variable can be anything as long as it’s not a name already used as a function or variable in Improviz. Check the reference page for a list of reserved names. In my example I’ve used the variable name size.
size = ext(:size, 1)
Next, we need to connect to it via OSC so that we can change its value.
When you launch Improviz from the terminal, one of the messages printed is the port its OSC server is listening on.
2021-03-25 20:53:.732595 INFO: Running at 640 by 480
2021-03-25 20:53:.732733 INFO: Framebuffer 640 by 480
2021-03-25 20:53:.390032 INFO: Loaded 3 texture files
2021-03-25 20:53:.437047 INFO: Loaded 8 material files
2021-03-25 20:53:.441641 INFO: Loaded 5 geometry files
2021-03-25 20:53:.441718 INFO: *****************************
2021-03-25 20:53:.441766 INFO: Creating Improviz Environment
2021-03-25 20:53:.466755 INFO: Loading ./stdlib/variables.pz
2021-03-25 20:53:.466846 INFO: Loading ./stdlib/transformations.pz
2021-03-25 20:53:.466890 INFO: Loading ./stdlib/shapes.pz
2021-03-25 20:53:.466930 INFO: Loading ./stdlib/style.pz
2021-03-25 20:53:.466968 INFO: Loading ./stdlib/textures.pz
2021-03-25 20:53:.467004 INFO: Loading ./stdlib/screen.pz
2021-03-25 20:53:.467039 INFO: Loading ./usercode/grid.pz
2021-03-25 20:53:.467078 INFO: Loading ./usercode/seq.pz
2021-03-25 20:53:.467116 INFO: Improviz OSC server listening on port 5510
2021-03-25 20:53:.467297 INFO: Improviz HTTP server listening on port 3000
2021-03-25 20:53:.467405 INFO: Improviz resolution: 640 by 480
Of course you can, at this stage, use any software that can send data over OSC, but for this blog post/tutorial I’ll be using Pure Data. Alternatives exist, but I like it because it’s lightweight, stable and cross-platform.
To send OSC messages use the [netsend] object to connect to the same IP address as Improviz (usually 127.0.0.1) and the same port (5510). [netsend] will output a 1 from its outlet to show a successful connection. With the connection established I can now send values from a number box to the variable via OSC!
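If you’d rather script this step than patch it, any OSC library can send the same message. Below is a minimal Python sketch using the python-osc library; it assumes Improviz is listening on 127.0.0.1:5510 and that external variables are set by messages to /vars/<name>, so check the Improviz OSC documentation if your setup differs.

# Minimal sketch using python-osc (pip install python-osc).
# Assumptions: Improviz is on 127.0.0.1:5510 and external variables
# are addressed as /vars/<name> -- check the Improviz OSC docs.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5510)

# Set the :size variable defined earlier with size = ext(:size, 1)
client.send_message("/vars/size", 2.5)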
Right now I’m using a number box whose values I set by manually clicking and dragging. I could have the numbers generated randomly using the [random] object, or even add some level of audio reactivity using the [adc~] object. If that’s your thing, go do it! Keeping to this blog post’s title, I’ll be using a midi controller to change these values. For this next stage you should know that I’m using Ubuntu (20.10) as my operating system, so the instructions, especially those on connecting a midi controller, may be different for your operating system. Sadly I can’t help with that.
Connecting a midi controller to Pure Data is quite easy. I’m using an Akai MPK Mini MKII, but the instructions are much the same for pretty much any midi controller. First, make sure that Pure Data is exposing at least one midi port: switch the midi backend to ALSA-MIDI via Media > ALSA-MIDI, then go to Media > MIDI Settings… and make sure you have at least one midi input.
Then, open QjackCtl, click on the Connect button and under the ALSA tab connect the MPK Mini Mk II output port to the input port of Pure Data.
In Pure Data you can now read the Control Change (CC) values of one of the knobs or pads using the [ctlin] object. On my MPK the first dial (K1) is [ctlin 1]. It outputs values from 0 – 127 (128 values). I want it to change the size of a cube from 0 – 4, so I need to map the ranges. I found the very handy [maxlib/scale] mapping abstraction, so I’ll be using that. With the ranges mapped I can use the knob on my controller to change the size!
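For the curious, the mapping that [maxlib/scale] performs is plain linear interpolation. Here’s a quick Python sketch of the same arithmetic:

def scale(value, in_lo, in_hi, out_lo, out_hi):
    # Linearly map value from [in_lo, in_hi] to [out_lo, out_hi]
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# A dial value of 64 (roughly mid-turn) becomes a cube size of about 2
print(scale(64, 0, 127, 0, 4))  # 2.0157...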
For my next trick I want one octave, C to B, to alter the shades of grey of the cube. The [notein] object will tell me the midi note number of the key being pressed, and on my controller that octave spans midi notes 48 – 59. Using the [maxlib/scale] object again I can map that range to 0 – 255 and send those values over OSC to a variable in Improviz that will be used to change the fill function.
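If you wanted to skip Pure Data entirely, the same pipeline can be sketched in Python using the mido library for midi input and python-osc for output. The port name and the :grey variable below are assumptions from my setup rather than fixed values; list your own ports with mido.get_input_names() and use whatever variable names you’ve defined in Improviz.

# Hedged sketch: read a midi controller with mido (pip install mido
# python-rtmidi) and forward mapped values to Improviz over OSC.
import mido
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5510)

def scale(value, in_lo, in_hi, out_lo, out_hi):
    return out_lo + (value - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

# The port name is specific to my controller
with mido.open_input("MPK Mini Mk II MIDI 1") as port:
    for msg in port:
        if msg.type == "control_change" and msg.control == 1:
            # Dial K1 (CC 1): map 0-127 to a cube size of 0-4
            client.send_message("/vars/size", scale(msg.value, 0, 127, 0, 4))
        elif msg.type == "note_on" and 48 <= msg.note <= 59:
            # One octave of keys: map notes 48-59 to a fill shade of 0-255
            # (:grey is a hypothetical variable name)
            client.send_message("/vars/grey", scale(msg.note, 48, 59, 0, 255))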
For my final form I’ll use one of the pads on the midi controller to toggle a random colour generator.
One advantage of using a midi controller to control visuals in this way is that you can control the audio and visuals simultaneously, rather than having one triggered in reaction to the other. In my experience of doing live visuals it has been quite normal for visuals to move or, as is quite often the case, pulsate in reaction to the amplitude of the music. In fact I did this many years ago for a video for My Panda Shall Fly.
What I’ve sometimes noticed is that there’s latency and the reactive visuals often feel like they’re coming in too late after the beat/instrument has hit. Of course the latency can be reduced by adjusting the sensitivity of the audio input device (microphone or line in) but then it’s a fine balancing act of both the musician and visualist adjusting levels. Achievable but a pain!
By having one device/controller triggering both you can, in theory, have both happen simultaneously. Here’s a demonstration of this from October 2020:
As you can see the midi controller is controlling both the visuals and the audio. When I eventually get back to performing live gigs this is definitely something I’m going to explore further. Until then, have fun mixing live coding with midi controllers!
Tickets from £10
Improviz is an environment built by Guy John aka Rumblesan, which can be used for live coding visual performances. Its easy-to-learn language for creating visuals can be extended through the use of custom GL shaders and by using your own GIFs, 3D models and image textures.
This two-hour workshop, led by visual artist Antonio Roberts aka hellocatfood, will introduce you to the world of live coding, and guide you through the basics of using Improviz for live visuals. This workshop will also include a short look at how artists and musicians use code to make visuals and music in real time at Algoraves. If you’ve ever been curious about using code to make live visuals and have aspirations to perform at live coding events and Algoraves then this workshop is perfect for you.
No coding experience is necessary.
You will need:
– A Windows, Mac or Linux laptop that can connect to the internet
– Google Chrome (https://www.google.com/chrome/) or Firefox (https://www.mozilla.org/en-GB/firefox/)
– Improviz, desktop version (https://improviz.rumblesan.com/)
– Atom (https://atom.io/)
– Zoom (https://zoom.us/)
Ahead of the workshop please follow the guide for your operating system to make sure you can locate and open Improviz using the terminal/command line on your computer:
Mac OS guide
Linux guide
Windows guide
Join us online to explore the work of young creatives with workshops, artist talks and music
Tate Collective Producers present a streamed version of Late at Tate Britain that explores identity, activism and how young creatives have responded to lockdown.
There will be two streams on this page on the night, running from 19.00 – 21.00 BST:
Stream 1 is a mix of artist talks and workshops
Stream 2 is music from Reprezent Radio
If you want to see any content you missed on either stream, both streams will be available until Friday 21 August.
20.00 – Reprezent Radio: The voice of young London
Reprezent Radio, the voice of young London, curate a playlist for the evening with visuals from digital artist Antonio Roberts streamed live on Mixcloud.
Earlier this year fellow visualist and live coder Rumblesan commissioned me to make some gifs for his new live coding software, Improviz. In July he unleashed it into the world!
Looking at the above videos you could easily be forgiven for thinking that it looks a bit like LiveCodeLab. He is, after all, one of the developers of LiveCodeLab. However, Improviz differs in a few ways. As Rumblesan himself explains in the Toplap chat:
the language in Improviz has a lot in common with live code lab, and the basic functionality for shapes, styles, transformations and loops is all pretty much the same. but in terms of implementation and usage they’re very different
lcl is using three.js as an intermediary, whilst improviz is entirely haskell and uses opengl directly (which I think long term is going to cause me grief but we’ll see haha)
the major difference is that improviz lets you use images and gifs as textures, which is something I’d like to back port to lcl, but wouldn’t be a small task unfortunately
That’s right, you can load textures! As mentioned before Rumblesan commissioned me to make a set of gifs to go along with the initial public release. They’re all released under a Creative Commons Attribution licence so you’re free to use them as you wish as long as you attribute me.
As an added bonus I’m also releasing the .blend file that was used to make each one.
Click here to download the Blender files.
These were made using a beta version of Blender 2.80. I’ve tested them in the stable release and they appear to work fine but they definitely will not work in 2.79 or earlier versions. I’m providing these for you to explore and won’t be doing a writeup/tutorial on how they work. If you remix them please share what you make 🙂
Definitely give Improviz a try! Thanks to Rumblesan for commissioning me to make the gifs 🙂
On Saturday 1st September I’m organising a meetup for visualists as part of Livecode Festival #2 at Access Space in Sheffield:
A session for live coding visualists (at any level) led by Antonio Roberts (aka hellocatfood), to talk about their tools and how they perform, with a focus on Algorave visuals.
A core part of the session will be discussion around key questions for live code visualists: how do you pace yourself in a performance? Should we aim to build up slowly or go straight in with loud visuals? How much can you truly respond to the music? Is it important to show the code, and how does it fit with the musician’s projection?
The session will run from 11:00 – 16:00 and will include workshops in Pure Data/GEM (led by me), Hydra (led by Will Humphries) and Livecodelab (led by Guy John).
Get your tickets here! And whilst you’re in the area get a ticket for the Algorave on the same night at 20:00 😉
A day of panel talks and performances about VJing, live cinema and audio-visual performance. This symposium will bring together academics, artists and VJs to discuss the role of projection in their work, and the current state of AV performance in the UK. A number of thematically focussed panel talks will take place at Centrala, Birmingham on Saturday 24th February, followed by an evening of AV performances from 7pm – 10pm, ending with a VJ Jam until late. Running in parallel at Vivid Projects will be a number of projection-based installations. This event is a collaboration between The Projection Project, an AHRC-funded research project based at the University of Warwick, and Lighttouch Festival.
Confirmed speakers/artists include: Toby Harris (*spark), Rebecca Smith (Urbanprojections), Miri Kat, Antonio Roberts, Raquel Meyers, Rod Maclachlan, Guy Edmonds, Blanca Regina, Flatpack Film Festival, Sean Clarke (Test Card Manchester), Richard Wallace.
Come along to Vivid Projects on the evening of Fri Feb 23 for a warm-up party with loud music and VJs.
At AlgoMech 2017 we aim to take Algorave to the next level, bringing together some of the best algorithmic (and mechanical) dance music producers and VJs, playing over Sheffield’s fiendish DangerNoise soundsystem, with immersive projections covering the walls of Millennium Gallery. As with the rest of the festival we’ll be mixing mechanisms with the algorithms, showcasing repetitive dance music made from handmade robots as well as live code.