On 8th November, a new video work of mine will be screened at Tate Britain as part of Loud Tate.
Head over to Tate Britain for a fascinating free day of art, music and performance curated by Tate Collective London. BP Loud Tate: Code explores how codes in language, fashion and technology shape culture, inspired by the displays at Tate Britain.
The video work will be shown alongside work by three other awesome glitch artists and will be projected somewhere in the gallery, so get hunting!
In addition to all the other activities going on throughout the day – t-shirt making, zine making, DJs and music – Rosa Menkman will be giving a talk/workshop about glitch art, which I highly recommend attending. Come meet us all!
Glitch art is enshrined within digital, internet and popular culture, with its distorted and colourful aesthetics being regularly featured in blogs, festivals, music videos, exhibitions and games.
With it now being more commonplace, what can be done to develop it as a concept and aesthetic, and take it past being merely an image of a broken JPG or compression artifacts/datamoshing? Can it jump off the screen into other art forms? How can one glitch their own practice?
For OSP’s Print Party on October 18th I made a bunch of new software and modified some old software that did sonification of text, bitmap, and vector images. I saw it as a way to differently engage with the usually static images featured throughout the festival. Here’s some of the visual output from a Processing sketch that, when combined with Pure Data, would redraw an SVG and convert it to sound:
And here’s an example of a fish being sonified. Expect lots of bleepy noises:
If you’re interested in using the software, the repository is located here and this particular code resides in the listetoSVG folder. The Processing sketch for that piece is a modification of this sketch by emainorbit. I may revisit and develop this software, but for now there will be no detailed writeup of how to use it.
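To give a rough sense of the idea behind the SVG sonification – this is not the actual listetoSVG code, just a hypothetical sketch in Python – you can treat the vertices of a shape's outline as an audio waveform: the y coordinates of the path are normalised and looped fast enough to become an audible tone, so the shape is effectively "drawn" with sound.

```python
import math

def points_to_samples(points, sample_rate=44100, duration=2.0):
    """Trace a shape's outline as an audio waveform.

    `points` is a list of (x, y) vertices, such as might be extracted
    from an SVG path. The y coordinates are normalised to [-1.0, 1.0]
    and cycled through repeatedly to produce an audible tone.
    """
    ys = [y for _, y in points]
    lo, hi = min(ys), max(ys)
    span = (hi - lo) or 1.0
    cycle = [2.0 * (y - lo) / span - 1.0 for y in ys]
    n = int(sample_rate * duration)
    return [cycle[i % len(cycle)] for i in range(n)]

# A crude stand-in for a parsed SVG path (hypothetical fish-ish outline).
fish = [(math.cos(t / 10.0) * 40, math.sin(t / 5.0) * 20) for t in range(60)]
samples = points_to_samples(fish, duration=0.5)
```

The real pipeline routes the Processing sketch's data into Pure Data for synthesis; the mapping above is only one plausible way to turn geometry into bleepy noises.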
As an experiment it was really interesting to “bring to life” a static programme. There is a danger that the performances – which I’m sure OSP will write about at some point – deviated too much from the idea of a print party, but in isolation I felt they worked really well!
The Moving Image Project is a project by Charlie Levine that takes a selection of artworks across the world for screenings. Screenings may happen within the traditional gallery space or may go wherever Levine is, being screened via a projector attachment for her phone:
Glimpses of life caught through windows as you speed past in cars, trains or planes: the transient state of time as we move around the world crossing time zones and countries: the daydream of staring, watching and waiting.
The Moving Image Project is a series of short films selected by UK curator Charlie Levine. The project began development and touring in early 2014 when Levine herself toured to the other side of the world. Levine originally commissioned/approached various artists to produce silent short films, or films that did not rely on sound, that touched upon this idea of movement and travel, of glimpses, the mundane and the world.
On 19th September I did a Sonification Studies performance at glitChicago. I had previously done a demonstration/performance of this at Relearn in Brussels, Post-Modern Plant Life in Leamington Spa, and at September’s Digbeth First Friday at Vivid Projects, but the glitChicago performance was what I consider to be the first proper one. A culmination of my research, if you will.
The focus for me at this performance was to get a sense of rhythm. Previous performances experimented with using found image sources, which resulted in a somewhat chaotic performance. For the glitChicago performance I composed a series of images with repeating patterns.
I separated these images into four groups, one for each channel. There were no strict rules for what went into each group, but I thought about which would sound better as beats, which had similar colours, etc. Bill Miller shot a bit of my performance, but now that I’m back in the UK with all my equipment (actually just a MIDI controller) I present to you a remake of my performance. Warning: it’s loud and there be flashing images.
I’d be really interested to see and hear what others make with the software, or how they take the code and extend it.
Back in June 2014 I wrote about how, in 2013, after visiting The Cyborg Foundation in Barcelona, I became interested in exploring sonification. My experiments at that stage culminated in the production of the Pixel Waves Pure Data patch, which allows the sonification of images based on the colour/RGB values of individual pixels.
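The core idea of sonifying pixels by their colour values can be sketched outside of Pure Data too. The following Python sketch is an assumption-laden illustration, not the actual Pixel Waves patch: it maps each pixel's brightness to the pitch of a short sine tone, with the red channel scaling loudness, so scanning across an image produces a sequence of blips.

```python
import math

def pixel_to_tone(r, g, b, sample_rate=44100, duration=0.05):
    """Map one pixel's RGB values (0-255) to a short sine tone.

    Hypothetical mapping: average brightness picks the pitch,
    the red channel scales the loudness.
    """
    brightness = (r + g + b) / 3.0
    freq = 110.0 + (brightness / 255.0) * 880.0   # 110-990 Hz
    amp = 0.2 + (r / 255.0) * 0.8                 # 0.2-1.0
    n = int(sample_rate * duration)
    return [amp * math.sin(2 * math.pi * freq * i / sample_rate)
            for i in range(n)]

def sonify_image(pixels):
    """Concatenate one tone per pixel: each pixel becomes a blip."""
    out = []
    for r, g, b in pixels:
        out.extend(pixel_to_tone(r, g, b))
    return out

# A tiny stand-in "image"; in practice you would read real pixel
# data from a file (e.g. with the Pillow library).
image = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (128, 128, 128)]
audio = sonify_image(image)
```

Which colour channel drives which sound parameter is an arbitrary choice here; the Pure Data patch exposes its own mapping.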
I spent the following months building and refining an update to the Pixel Waves software, with a focus on allowing multiple images to be played simultaneously. In a way, I wanted to create a sequencer but for images. After many months I’m happy to formally announce the release of the Pixel Player.
This software operates in a similar way to Pixel Waves, but with a focus on playing multiple images simultaneously. Instructions on getting started:
1. Create the GEM window
2. Click on the red button to load an image. Supported file types depend on your operating system, but generally jpg, gif and png file formats are supported
3. Click on the green start button and the pixels will start to be read
4. Drag the orange horizontal slider up to increase the master volume
5. Drag the orange vertical slider up on each pixel player to control its volume
6. Turn the knob to scale the pitch of the audio
The currently displayed/sonified pixel for each channel will be synchronised from the first channel. For this reason it is recommended that all of the input images used are the same dimensions.
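The sequencer behaviour described above – every channel reading the pixel at the same index, driven by the first channel – can be sketched roughly like this. This is an assumed simplification in Python, not the Pure Data patch itself: each channel is reduced to a list of brightness values, and the per-channel and master sliders just scale the mix.

```python
def mix_step(channels, step, volumes, master):
    """Mix one synchronised step across all pixel-player channels.

    Every channel reads the pixel at the SAME index, which is why
    using images of identical dimensions is recommended: shorter
    images wrap around and drift out of phase with channel one.
    """
    total = 0.0
    for pixels, vol in zip(channels, volumes):
        value = pixels[step % len(pixels)] / 255.0  # brightness 0-1
        total += value * vol                        # channel slider
    return total * master                           # master slider

# Hypothetical four-step channels (brightness values 0-255).
channels = [[255, 0, 255, 0],     # a beat-like pattern
            [64, 64, 64, 64],     # a steady drone
            [0, 0, 255, 255]]     # a slower pulse
volumes = [1.0, 0.5, 0.8]
steps = [mix_step(channels, s, volumes, master=0.5) for s in range(4)]
```

Composing each channel's image as a repeating pattern, as described for the glitChicago performance, is what turns this lockstep scan into something with a sense of rhythm.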
This may sound like a lot to do, but it becomes easy after a few attempts. To make things easier, the loadimage.pd patch has inlets that you can use to control each channel with a MIDI controller, keyboard, or any other device. To expose the inlets, increase the canvas size of the patch by around 10 pixels.
The software includes a video display output, which shows the current pixel colour. This can also be shown on the patch window by clicking the red display button. Flashing lights might not be to everyone’s taste, so this can be turned off. Due to this patch relying on [pix_data], the GEM window needs to be created, even if the pixel display isn’t used.