Datamoshing using Avidemux 2.7.0

Ever since I came across datamoshing in around 2010 via Bob Weisz's infamous datamoshing tutorials, I have only successfully created a few datamoshed videos "by hand".

Most times the video I created would be completely broken, and not in a good way! And so since then I have used semi-automated datamosh scripts for my needs, like Autodatamosh from grampajoe.

Recently as part of my lecturing role at Staffordshire University I was asked to do a workshop on datamoshing. “This will be easy” I thought as I would just dig up Weisz’s tutorials and teach that. Sure, I couldn’t datamosh back in 2010, but since then I have become way more competent in creating glitch art, learning how software works and programming in general, so learning this widely practiced process didn’t seem impossible.

Of course I was wrong.

Y'see, in the 8 years since I came across datamoshing there have been a lot of changes. Specifically, Avidemux 2.5.4, which was released in 2010 and is the version referenced in Weisz's tutorial, has been superseded many times and is currently at version 2.7.0. From reading different community pages it was my understanding that the changes in this new version (apparently) "fixed" or "corrected" features that allowed it to be (mis)used to make datamoshed videos. For the unaware, Avidemux is free and open source software that can be used for editing video files. It's not an NLE, but if you need to make a quick edit to a video it can be useful. It's also the gold standard for datamoshing.

As a workaround various people have suggested ways to keep using the 2.5.4 version of Avidemux, even going as far as running an old OS on a virtual machine or never updating. Whilst this might work for now, it's not recommended or sustainable. In time your OS will outgrow the software, making it difficult or impossible to install, and old software can introduce security bugs (yes, even buggy video editors can compromise your system). So I set about trying to provide a fix and datamosh a video using Avidemux 2.7.0. Below are my results.

After downloading Avidemux 2.7.0 you will need to convert your video to the right format for datamoshing. Grab your input video and drag it into Avidemux. For my example I'm using this clip of a person getting hit with a balloon in slow motion.

Under Video Output change Copy to Mpeg 4 AVC (x264). Click on Configure. In this window click on the Frames tab and, under B-frames, change "Maximum Consecutive B-frames" to 0. Under I-frames change "GOP Size" Minimum to 0 and Maximum to 999.

Press OK when done.

In the main window leave Audio Output as Copy and Output Format as Mkv Muxer. With the settings now specified, go to File > Save and give your reencoded file a new name (file_reencoded.mkv).
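If you prefer the command line, roughly the same re-encode can be done with ffmpeg (a sketch approximating the GUI settings above, not part of the original tutorial; input.mp4 is a placeholder):

[code language="bash"]# no B-frames and a huge maximum GOP size; audio is left untouched
ffmpeg -i input.mp4 -c:v libx264 -bf 0 -g 999 -c:a copy file_reencoded.mkv[/code]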

We now need to open this new file to actually datamosh it! Go to File > Close and then File > Open and select your reencoded video. If you’ve ever followed Weisz’s tutorial, especially the 2nd and 3rd part (or the many copies that have since been made) you’ll already know the process of datamoshing. You can do exactly the same at this point, but for completeness in this tutorial I will go through how to manipulate P-Frames to make the “bloom” style of datamoshing. One of my favourite videos showing this style is Monster Movie by Takeshi Murata.

Using the playhead on the Navigation toolbar, or Left and Right on your keyboard, seek to a P-Frame at the point in the video you want to datamosh. I recommend finding a part of the video that has lots of movement immediately before or after that point. The Frame type marker in the Navigation toolbar will tell you when you're on a P-Frame. Once there you need to select the P-Frame and copy/paste it over and over again. To do this press the Start Marker button (a red "A" button) (or press Ctrl + PgUp). Then move one frame forward and press the End Marker button (a white "B" button) (or press Ctrl + PgDn). With the P-Frame highlighted, copy it (Ctrl + C) and then paste it (Ctrl + V). And then paste it again. And again. Many times.

The more that you paste the P-Frame the more movement you will get in the bloom effect. Now, be careful and patient when pasting your P-Frames. There is a temptation to paste it hundreds of times but this will definitely slow down Avidemux whilst it catches up. You may also crash it but I haven’t had this happen yet. Perhaps 2.7.0 is a bit more stable than previous versions!

With your P-Frame(s) now repeated, set the Start and End markers to cover the whole video instead of just the P-Frames you originally selected. When you do this the blue highlight box might not cover the whole area of the timeline. It's a UI error but it doesn't negatively affect anything. Leave all the Video and Audio options as they are (set to Copy) and then save your video (File > Save). You will get a warning about cut points not being keyframes.

Ignore this and press Yes. Open the finished file in VLC (other players might not like the video).

Voila!

As many others before me have suggested, you may want to resave, or “bake” your glitched file so that your datamoshed file, which is technically a “broken” file, will play well with other video editors and viewers.
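One way to do this (a sketch; the filenames are placeholders and any full re-encode will do the job) is to run the moshed file through ffmpeg, which writes out a clean copy of whatever the decoder makes of the glitch:

[code language="bash"]# re-encode the moshed file so other players and editors will accept it
ffmpeg -i file_moshed.mkv -c:v libx264 -c:a copy file_baked.mp4[/code]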

As with all glitch art, when you make it you're doing something unconventional to a file in order to corrupt it in a way that is aesthetically pleasing. As such, sometimes things just don't work. Perhaps your video didn't bloom as much as you wanted, or maybe removing I-Frames made the file corrupt. I've tested this process on Ubuntu 17.10, Windows 10 and Mac OSX and whilst I feel confident that the process will work, the results will be unpredictable. If your result doesn't turn out as you expect on a particular file then try a different file! Maybe try copy/pasting three P-Frames at a time, or remove some I-Frames. Experiment!

My thanks go to Bob Weisz for originally writing the tutorial and to the community over at the Avidemux forums for clarifying a few things with the new version of the software.

Five Days of Pure Data – Live Coding

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the last few days I have been releasing a few of these patches and techniques that I implement when programming in Pure Data.

Live Coding

This last tutorial isn't so much about actually programming with Pure Data as about performing with it. More specifically, live coding. Live coding takes programming to a performative level. There's been a lot of writing recently about live coding so take a look around. I've been programming live visuals in Pure Data since 2014 (a lot happened that year). It's not without its problems, some of which I've written about in the past.

One problem I’ve found is displaying your patch alongside your visuals. For live coding musicians this isn’t much of a problem as the output of the live patching is sent to the sound card, not necessarily the screen. For people live coding visuals in Pure Data the output (projector) would be needed for both the visuals output and the patch. If only there was a way to overlay the patches onto visuals (like in Fluxus, Cyril and Livecodelab)!

The most reliable cross-platform method to do this is to use Open Broadcaster Software (OBS). OBS is a great piece of software used by many gamers for livestreaming. A great feature of it is to be able to combine multiple media sources into one to then stream or send to a projector or video file.

To overlay your patch onto your visuals first create a GEM window with some graphics in it. In OBS, under Source click the + button and choose Window Capture. Give the new source a name. And then, under Window choose your GEM output window.

Next, create a new Window Capture source and give it a name (perhaps Pure Data patch). You may have noticed that your patch completely covers your GEM source. No worries! Right-click on your Pure Data patch source and click on Filters. In the next window click on the + button and choose Chroma Key. Give the new key a name (White).

In the next window, under Key Color Type choose Custom and then select the colour. You should choose white (#FFFFFF). You may need to change the settings under Similarity and Smoothness to get the right look.

If you've just tried this in Pure Data vanilla you'll notice that your objects are also transparent. I'm using Purr Data (which you should use too). In this version of Pure Data all of the boxes are slightly grey, resulting in them not being transparent when you apply the Chroma Key.

Imagine a scenario where you're collaborating on an Algorave set with a musician and there's only one projector. You both want to project your code. What to do?! Luckily a combination of OBS and desktop sharing has you covered.

TidalCycles code from yaxu

If the musician can share their desktop over a network then you can add that as a source in OBS and do the usual thing of applying a chroma key filter. However, this requires you both to be on the same network, which isn't always possible or efficient. If one of you has a recent Android phone you can get over this hurdle and have high quality desktop sharing.

On your phone create a wireless hotspot. You don’t need to use mobile data for this so feel free to turn it off.

Connect both machines to this newly created network.

On the remote (musician's) laptop, open Desktop Sharing options and enable the option "Allow other users to view your desktop". You can choose to require a password and have a user confirm each new access request.

On the host (visuals) machine open Remmina Remote Desktop Client.

In this new window click New. In the next window change the Protocol to VNC – Virtual Network Computing. In the Basic tab below click on the … button.

You should see the name of the remote computer listed. Click on it.

Finally, press Connect. When you try to connect to the remote machine you may get a prompt asking the user to confirm that the connection request is permitted.

Once the connection is approved you should see the remote machine on your computer!!! 🙂 And now you can go back to OBS and add that window as a source, apply a Chroma Key effect and then overlay it onto your visuals.

Sadly you can’t then use this source to apply as a texture to your objects in Pure Data, but it’s a great start to merging visuals and music live code. I first tried out combining my own code and visuals at the Chemical Algorave in Newcastle and it went really well. I also made use of OBS’s “Apply LUT” filter to change the colour of my whole video output.

Chemical Algorave

I then tried incorporating a musician's code at the Algorave at BUMP in Kortrijk.


I’m hoping that this will help lead to more collaborations between Algorave musicians and visual artists 🙂

Five Days of Pure Data – Stop Motion

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Stop Motion

It was at the Co-Position meeting of the Libre Graphics Research Unit in 2012 that I first encountered Toonloop. Its creator, Alexandre Quessy, gave a live performance using lights, Lego pieces and other objects. I was really in awe of how stop motion was used to create such an enjoyable performance.

Of course my first instinct was to try and recreate this but in Pure Data. I wasn’t the first to try, and I in fact have some memory of seeing Toonloop’s creator himself trying to write something similar in Pure Data although I can’t find any sources…

My first attempt at recreating Toonloop was to use [pix_write] to write a series of images and then play those back using [pix_image]. The problem there is that there's no easy way to read an arbitrary set of images from a directory.

In the end I learnt about [pix_buffer_write], which allowed me to store an image frame in a buffer which I can then recall using [pix_buffer_read]. So that's the basic functionality sorted! When I went to Databit.me in 2013 Axel Debeul helped to improve the patch a lot. The improvements allowed me to save a video of each animation. You can find the most recent version of the patch below.

That’s where the problems start to arise, some of which I haven’t solved yet. The videos are created via [pix_record]. When a frame is captured it is sent to [pix_record] and then recording is paused. However, when you look at the saved video it has a really weird frame rate and doesn’t play smoothly. Even when the frame rate is set explicitly it somehow doesn’t work.

Don’t ask me about the cat ears

Perhaps making a video out of the animations is something to be done in post processing rather than in Pure Data. [pix_record] has always been a very complicated object to work with so perhaps I need to investigate further and try to find the right configuration of all of its many options.
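For example, if each captured frame were saved out as a numbered image (a sketch; the filenames and frame rate here are assumptions), ffmpeg could assemble them at a sane frame rate in post:

[code language="bash"]# stitch numbered frames into a video at 12fps
ffmpeg -framerate 12 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p animation.mp4[/code]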

Five Days of Pure Data – Infinite Scrolling

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Infinite Scrolling

For a performance at ChipFest 10 in 2014 I wanted to recreate the look of a scrolling video game but within Pure Data. Sure, I could have just used a console emulator or recorded a clip and used that but doing it all within Pure Data allows me to have some more flexibility with how I perform. Here’s what I made:

In the end I didn't use it, but I did use a few of the techniques I learnt in future performances.

The key to those visuals, and to things like 2D side-scrolling computer games, is having an infinite scrolling background. That is, a single image which tiles and repeats seamlessly as it moves across the screen. Not sure what I mean? Check out these examples from TV history:

Borrowing a bit from this answer on the Game Development Stack Exchange, first you create a single tile of size, say, 512×512. You scroll it across the screen horizontally, and once its position gets to 512px (or -512px) it jumps back to its original position.
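The wrap-around itself is just modular arithmetic (a minimal sketch, assuming a 512px tile scrolling 4px per frame):

[code language="bash"]# print the tile offset for a few frames; it wraps back to 0 at 512
x=0
for frame in $(seq 1 10); do
  x=$(( (x + 4) % 512 ))
  echo "frame $frame: offset ${x}px"
done[/code]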

If the original tile is repeated a few times and then offset so that the borders of the tile can’t be seen then suddenly you have an infinite scrolling background! To repeat an object in Pure Data we use the [repeat] object. This object repeats any object and its translations. So, if a [cube] has a translation of 1 on the X axis and is [repeat]ed 4 times, then the first [cube] is offset by 1, the second [cube] is offset by 1 in relation to the first [cube] (or offset by 2 in total). The third cube is offset by 1 in relation to the second cube (or 3 in total), and so on.

In action this looks like this:

Success!

You can also do the same with the Y and Z axes. For the Z axis you just need to decide where the vanishing point is. Or you could always use fog 😉

The patch for this is really quite simple but I’ve made it available for download below:

Five Days of Pure Data – Randomise Text

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Randomise Text

In the early 2010s I had quite an interest in zines. I had co-organised the Birmingham Zine Festival and was quite regularly reading this. As a result, in 2011 I started collaborating with a friend, Rebecca Evans, on a collaborative zine. The concept was that we would interpret each other's way of working using our regularly used tools. Rebecca specialises in textiles and has quite a skill at crocheting, amongst other things. I, on the other hand, can barely operate a sewing machine and feel much more at home using a soldering iron or computer.

For Rebecca this meant trying to create some sense of noise and randomness but using sewing. The results, even if only tests, looked quite awesome!

For my own take on this I wanted to continue the text based work that I was creating at the time. Rebecca gave me two poems to work with, Lovesong by Ted Hughes and Lady Lazarus by Sylvia Plath.

I wanted to remix the works by jumbling the words and/or sentences. Looking at my recent(ish) work you can see that this would become a bit of a theme in my work and references. spɛl ænd spik, Silver Screen Changeable, and my reinterpretation of Variations on a Theme by Casey & Finch by Erik Bünger each look at cutting up and rearranging words.

Cutting up words and rearranging them can be achieved in many languages using just a few lines of code. One method is to put each word or sentence into an array and then print those indexes in the array randomly. In Pure Data it isn’t as easy.
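On the command line, for instance, it's only a line or two (a sketch using standard Unix tools, not part of the original zine process; poem.txt is a placeholder):

[code language="bash"]# jumble the words of a poem
tr ' ' '\n' < poem.txt | shuf | tr '\n' ' '
# or jumble whole lines/sentences instead
shuf poem.txt[/code]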

For most cases Pure Data has the [tabread] and [tabwrite] objects. To use [tabwrite] you specify the index and then write data to it. You then move onto the next index number and write more data to it. To read the data back you do the inverse but with [tabread]. This works great with numbers but not with text.

Y'see, Pure Data has several data types. Just like PHP, Python and others have strings, numbers, text, etc., Pure Data has symbols, floats, integers etc. The character "6", for example, could be a number (default) or a symbol – using a combination of a number box, [floattosymbol] and then a symbol box. As a number it can be used in arithmetic operations; as a symbol it's good for when you don't want it to be treated as a number. The unfortunate thing for me/us is that [tabread/write] doesn't accept symbols, only numbers. Damn! Enter [m_symbolarray] from rjlib.

Using this abstraction you can put symbols into an array in the same way as numbers. With that problem solved I could now decide whether I wanted to jumble words or sentences. In the former case it would be unlikely to return anything that makes sense, which is ok. In the latter case it could still produce interesting results that could fool some people into thinking it was as it should appear. So, it was important to have the option to do both.

Using [textfile] you can specify what a delimiter is. A delimiter specifies where one list item stops. By default [textfile] uses a space as a delimiter. By sending the message [read textfile.txt cr( to [textfile] I can tell it to use a carriage return (Enter key) as a delimiter. Unfortunately there isn’t great documentation on what the others are!

Using the [until] object and a carefully crafted series of operations I built a patch to do the following: decide on the delimiter, read the text file, output each list item to the [m_symbolarray] object, and cycle through the text file until we reach its end.

With the table now populated here’s where the interactive part comes in. A user could decide to either dump the contents of the array into a text file or spit out each item to, say, a [text] object for displaying on a screen.

You can download the finished patch below:

Below are my remixes of the two poems for you to download:

It's a shame that the zine was never finished but it was still a great learning experience.

I’m sure that by now the relatively new [text] object can solve many of the problems I had using [textfile] to read the file, so perhaps at some point in the future there will be an update.

Five Days of Pure Data – Image to Signal

In the years that I’ve been creating things in Pure Data I have amassed quite a collection of unfinished and messy patches. Over the next few days I’ll be releasing a few of these patches and techniques that I implement when programming in Pure Data.

Image to Signal

If you're into experimental ways of creating visuals and happened to be on the internet around 2013 you may have been sent information about software called PixiVisor. The software, developed by Alexander Zolotov, allows one to transmit an image via an audio signal to another device, which decodes it and displays it as an image. In this way you can then apply audio effects to the signal and make some pretty cool visuals. I never tried it myself but was quite impressed by what I saw.

Being the open source kinda guy that I am, and a devout user of Linux, I was interested in knowing if there was a way of reproducing this using open source software.

sloev got in touch with me some time later in 2013 to demonstrate a patch he'd made that quite faithfully reproduced PixiVisor in Pure Data!

Like PixiVisor this worked by having two devices, one to transmit and another to receive. I quite quickly and roughly rewrote the patches so that this process was done all in one patch. Its first outing was the 2013 Pecha Kucha event in Coventry where I provided visuals for Ashley James Brown/Arctic Sunrise.

The patch works by using the [pix2sig] and [sig2pix] objects. As their names suggest, they convert audio signals to pixels and vice versa. The key to how the PixiVisor remix patch works is setting the block size. This tells your computer how many audio samples to process per frame and therefore how many pixels.

By default in Pure Data this is set to 64, which would allow us to work with an 8 x 8px image, which is tiny. By increasing the block size to 4096 we can begin to work with images of size 64×64 (because 64 x 64 is 4096).

You might now be thinking that you can keep increasing the block size and work with higher resolution images. Well, that's not really possible unless you want to sacrifice frame rate. If we take a commonly used video resolution, 640 x 480, you would need a block size of 307200. Try doing this and come back when your computer has finished crashing 😉
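Since each audio sample maps to one pixel, the block size you need is just width times height, which makes it easy to check what any resolution would demand:

[code language="bash"]# block size needed = width x height
echo $(( 64 * 64 ))    # 4096, workable
echo $(( 640 * 480 ))  # 307200, good luck[/code]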

With sloev’s blessing I am now releasing this modified patch.

The patch is set to use the webcam as an input but it can accept any media input that Pure Data can handle. If you don't want to have to resize everything before bringing it into Pure Data you can always use [pix_resize] to, um, resize your input live. This has the added downside of putting extra strain on your computer.

Perhaps it’s a dream that will never come true, but it would be good to one day be able to have a completely open source, multiplatform solution for video synthesis. I like the look of Lumen but unfortunately it’s Mac only 🙁 Until then, hope you enjoy the patch!

Glitch GIMP

On Wednesday 29th April I gave my Allowing Mistakes to Happen presentation at Libre Graphics Meeting in Toronto. I was quite anxious about this because the attendees are, typically, developers of software and/or graphic designers. Looking through the archives I found only a comparatively small number of presentations from artists talking about their artwork, and even fewer from those you might call experimental artists (glitch art, generative art etc).

My fears were put to rest somewhat once my presentation actually happened. Despite my computer crashing towards the end (glitch lol) it seems to have struck a chord with many of the attendees. It seemed that they liked that I was turning bugs and the bug hunting process into a form of art.

One person who was inspired was Michael Natterer, aka Mitch, one of the developers of GIMP, the premier open source photo and image editing software. He showed me how, by changing one option when compiling Cairo, the contents of the image window would be glitched.

Glitch GIMP

Of course I was quite impressed by the prospect of having a full-featured editing program that could produce only glitch art, so I quickly sought advice on compiling it for myself. Presented below are instructions for creating your own glitched GIMP.

Before we go on

This compilation process and the resulting binary file has only been tested on Ubuntu 15.04. I have no way of knowing if the same will work on Windows, Mac OSX or any other flavour of Linux. Also, this tutorial assumes that you have some knowledge of compiling software. If this is all daunting to you go do some research.

…and now we begin

To avoid conflicts we’re going to compile and install Glitch GIMP to its own directory, leaving the original GIMP unmodified.

First create build and installation directories.

[code language="bash"]mkdir build
mkdir install
mkdir install/share/
mkdir install/share/aclocal[/code]

In the build directory you’ll need to create a file which will hold our environment variables.

[code language="bash"]cd build
touch env.sh[/code]

These environment variables will tell the computer where to install GIMP. A word of caution: these environment variables are only valid for the current session. In other words, if you close your terminal window you'll have to load them in again. Fill your env.sh file with the following, changing the first line to point to your install directory.

[code language="bash"]PREFIX=/path/to/install/directory
export PATH=$PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$PREFIX/lib:$LD_LIBRARY_PATH
export XDG_DATA_DIRS=$PREFIX/share:$XDG_DATA_DIRS
export GIO_EXTRA_MODULES=/usr/lib/x86_64-linux-gnu/gio/modules
export ACLOCAL_FLAGS="-I $PREFIX/share/aclocal"
export PKG_CONFIG_PATH=$PREFIX/lib/pkgconfig:$PKG_CONFIG_PATH
export CPPFLAGS=-I$PREFIX/include
export LDFLAGS=-L$PREFIX/lib[/code]

Now fill your session with those variables.

[code language="bash"]. env.sh[/code]

Now we can begin compiling! You'll have to install GIMP's dependencies and an extra library. (If apt-get build-dep complains, you may need to enable the deb-src source repositories in your sources.list.)

[code language="bash"]sudo apt-get build-dep gimp
sudo apt-get install libexiv2-dev[/code]

Still in the build directory, you'll now have to clone the parts necessary to compile GIMP.

[code language="bash"]git clone git://git.gnome.org/babl
git clone git://git.gnome.org/gegl
git clone git://git.gnome.org/gexiv2
git clone git://git.gnome.org/gimp[/code]

And Cairo.

[code language="bash"]git clone git://git.cairographics.org/git/cairo[/code]

And now, in the babl and gegl directories, compile and install each:

[code language="bash"]./autogen.sh --prefix=/path/to/install/directory/
make -j4
make install[/code]

Hint: if, like me, you have four processor cores, the [code language="bash"]make -j4[/code] in the commands above runs four parallel jobs to speed up compiling; adjust the number to match your machine.

In gexiv2 run:

[code language="bash"]./autogen.sh --prefix=/path/to/install/directory/
make -j4
make install[/code]

In the cairo directory run

[code language="bash"]./autogen.sh --prefix=/path/to/install/directory/ --enable-xlib-xcb=yes
make -j4
make install[/code]
(this is the compile option that causes the glitches 😉 )

And finally, in the gimp directory, compile and install GIMP:

[code language="bash"]./autogen.sh --prefix=/path/to/install/directory/
make -j4
make install[/code]

In your install/bin directory you should now have a file called gimp-2.9. Run this and let the glitch begin.

Glitch GIMP


One thing you will instantly notice is that you can’t directly export the glitch output to a file. This is for display only and, like true glitches, can’t be easily replicated or captured. The only way to do this is to take a screenshot, which is ideal for on-screen display but not so great if you want print quality output.

I’ve been told that I could produce some more reliable glitches by creating or hacking GEGL plugins. I haven’t delved into this yet but if anyone wants to assist please do get in touch.

Thanks

I never would have gained this knowledge had I not been able to attend Libre Graphics Meeting. As seen in the forum thread describing how I came across the databending in Audacity method, trying to ask developers how to creatively break your software can be a confusing task. However, being able to show the developers IRL what can be produced allowed the flow of information to be smoother and more productive than an e-mail exchange would have been.

Libre Graphics Meeting will be coming to London in 2016 and the aim is for it to be free for all to attend, and to cover travel costs of speakers, as it has done every year. If you want to help more stuff like this happen donate now.

Emojify all the things

There's no doubt that emoji is here to stay and will infiltrate your artwork, desktop, phone screens and inboxes if it hasn't already done so. In a similar vein to ASCII art, apps have recently been released to convert pixels in images and video to emoji. Emoji Video and Emojify are two iOS apps that can convert content to emoji, with the former appearing to be able to do this in realtime with video.

In a time before emoji, two popular open source libraries existed to do the same thing, only using text and colour blocks (y'know, ASCII): AAlib and libcaca, both of which have been used extensively.

dramaticcaca

Although the two aforementioned emojifying apps work really well, unfortunately there are not yet any open source libraries that achieve the same effect. Until one is built, I took it upon myself to spend a few hours making something that uses ImageMagick and the Twitter emoji set. It's not nearly as efficient as the emojifying apps or libcaca/AAlib, and cannot be used on live video, but as a short experiment I think it works nicely.

The script works by using symbol patterns for dithering. This process uses the frames in an animated gif to replace blocks of colour. As shown in the ImageMagick examples, any gif can be used. The first step to using the script is finding an emoji icon set. The Twitter emoji set is really good and is released under a Creative Commons licence, but feel free to use whatever you want. Download this to your computer.

As mentioned before, this dithering method makes use of the frames from an animated gif. For true emojification all of the emoji in the set could be converted into one gif, but that would result in a loss of colour, a huge file size and possibly epic processing times! For that reason I decided to pick six random emoji each time the script was run. With each element in place I just executed the script. You'll need to modify line three to point to the directory containing the emoji set.
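The emoji-picking step might look something like this (a hypothetical sketch, not the script from the original post; the directory name is an assumption):

[code language="bash"]# bundle six random emoji PNGs into one gif to use as the dither pattern
# (./twemoji is an assumed path; point it at wherever you saved the set)
convert $(ls ./twemoji/*.png | shuf -n 6) -resize 32x32 pattern.gif[/code]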

Cat Eye emojified
Original

Freudenberg sg Switzerland emojified
Original

Ipomoea aquatica flower emojified
Original

Studio portrait emojified
Original

Not bad for a few hours of work!

If you’re starting to think that you’ve seen this aesthetic in my work before then you would be right. I have previously used this technique, instead using some randomly generated symbols, for the CóRM image set and some t-shirt/logo designs for NESkimos that I think were never used.

If anyone ever creates an open source library for emojifying things I'd be happy to know about it 🙂


Gifs in Pure Data

Every so often on my travels across the information superhighway I come across a Pure Data user asking if animated gif files can be read in Pure Data. Technically speaking they have always been readable in Pure Data, just not always in the way a user wants. Using the [pix_image] object a user can read almost any image file format. On Linux this is dependent on ImageMagick, so whatever it can read can (theoretically) be displayed in Pure Data/GEM. The problem arises because [pix_image] doesn't display animated gifs as animations, only the first frame.

There are several solutions to this problem. For these examples I’m going to use the following two gifs:

box

frame

Click through each image to get the full-sized original versions.

[pix_multiimage]

If you separate the gif into its individual frames you can use [pix_multiimage] to display each frame in succession.

multiimage
Click to download the PD patch.

Benefits

The benefits of using [pix_multiimage] to simulate an animated gif are that you can display high quality images with an alpha channel at whatever frame rate you choose. Simulating stutter effects or reversing is as easy as using a [counter] or random number generator.

Drawbacks

The problems with this approach are that [pix_multiimage] needs to be told how many frames to cycle through, and not all gif animations have the same number of frames. [pix_image] and even [pix_data] do not report the number of frames in an animation, so that value cannot be passed to [pix_multiimage]. Assuming that you separate your gifs into their individual frames, an abstraction could be built that detects how many images there are in a directory and then sends that value to [pix_multiimage], but that is a lot of effort to go through!
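(Getting that count outside of Pure Data is a one-liner, which makes the limitation all the more frustrating. A sketch, assuming the frames were extracted to a frames/ directory:)

[code language="bash"]# count the extracted frames
ls frames/*.png | wc -l[/code]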

Convert gif to video

The technique that perhaps most PD users have used is to convert the gif into a video file and use [pix_film] to play it. I used the following script to convert a folder full of gifs into mp4 files, with all transparent pixels converted to green pixels:
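(The exact script was embedded in the original post; below is a minimal sketch of the same idea, assuming ImageMagick and ffmpeg are installed.)

[code language="bash"]# flatten each gif's transparency to green, then encode to mp4
for f in *.gif; do
  convert "$f" -coalesce -background '#00FF00' -alpha remove -alpha off "${f%.gif}_green.gif"
  ffmpeg -i "${f%.gif}_green.gif" -c:v libx264 -pix_fmt yuv420p "${f%.gif}.mp4"
done[/code]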

With the gif now converted to a video you can use [pix_film] to play a video as you normally would.

gifchroma
Click to download the PD patch.

Benefits

So far I have only tested playing animated gifs in Pure Data using Gmerlin on Ubuntu. Without knowing if the same would work on Windows or Mac OSX, using video files is the safest option for all users.

Drawbacks

Any sort of file conversion will reduce the quality of the output, and this method is no exception. The videos aren’t very sharp, especially at the borders of the green pixels.

Making the green pixels transparent using [pix_chromakey] or [pix_alpha] requires fine-tuning to ensure that other colours aren’t made transparent. This isn’t always 100% reliable and can have a few glitchy artifacts.

Using gifs directly with [pix_film]

Another approach is to use [pix_film] directly. "Hold on" I hear you say, "[pix_film] can only be used to play films! How dare you suggest that it can be used to play image file formats. Balderdash!" Well, don't believe the hype! As a Linux user I can only comment on this working on Linux. If anyone can get the following method to work in any other OS please get in touch and I'll add it here.

When you play media file formats in Pure Data on Linux you're actually using external programs and libraries to play them. So, you'll use ffmpeg/libav to play videos and ImageMagick to display images. There's also another program you can use, Gmerlin. Install it by executing sudo apt-get install gmerlin. Pure Data/GEM has some weird behaviour whereby the delay amount of a gif needs to be explicitly set to a value of 1 or above in order for an animated gif to be played. This can be achieved on a folder full of gifs by executing mogrify -delay 1 *.gif.

And now you can easily open an animated gif in Pure Data the same way you would a video file.

gifvideo
Click to download the PD patch.

Benefits

Gifs, unlike (most) video file formats, can have an alpha channel. Another benefit is that you don't need to deal with converting files. No longer will you have to worry about whether one container is faster or more efficient than another, or what codec to use. Gifs will just be gifs.

Drawbacks

If the original format of your source file is a gif, then perhaps it is more efficient to keep it as a gif. If it was a video file, would it be beneficial to convert it to a gif? Not always. Even if you could achieve a smaller file size or have PD use less processor power by using a gif, the quality of the video output would be reduced due to gifs only allowing 256 colours.

It’s pronounced “gifs”

There are perhaps other benefits and drawbacks to each approach that I haven’t written about or haven’t even thought about. One such example of both is processor usage of each method. I suspect using gifs is actually less efficient, but I don’t have a good method of testing this. Perhaps one of y’all could!


Pure Data File Killer

On one of my frequent journeys on the information superhighway I stumbled across Little-Scale’s Mass JPG Killer. This handy little patch allows a user to load any binary file and “glitch” it by overwriting some of the original data with a repeating pattern of user-defined data.

Mass JPG Killer by Little-Scale

The only problem (for me and people like me) is that I don’t have Max/MSP and can’t install it on Linux, meaning I’ve never actually used it!

Little-Scale very kindly provided the internet at large with screenshots of the inner workings of the patch. I was able to use a whole lot of science and maths to rewrite and reinterpret this patch of mass destruction in Pure Data, which is more easily available.

Pure Data File Killer

Click to download

Usage

Usage of the patch is very simple and can yield some quite interesting results!

  • Click open to load a binary file. Pure Data may freeze for a moment if you're loading in a large file. I don't recommend loading in a file over 100MB.
  • Set the byte offset. This number represents the starting point at which the patch will start "corrupting" the file. If you're a glitchspert (glitch + expert) you'll remember that you should avoid modifying the header, so set the offset into the 1000s.
  • Set the period value. This can be hard to understand, so here's an example: if the period is set to 1378 then the patch will modify the data at intervals of 1378 bytes from the offset.
  • Set the data value. This works in conjunction with the period value. Using the previous example, if the data is set to 102 then at intervals of 1378 bytes the current byte value will be replaced with 102 (see the bash sketch after this list).
  • Press either random period or random byte data to populate these values with random values.
  • Press glitch it!. Guess what that does.
  • Write the files to save them to the same directory as the source file. The original file will not be overwritten.
  • To start again press the reset button. It will load the original byte data.
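The same offset/period/data logic can be roughed out in bash (a sketch, not the patch itself; the filenames are placeholders and the values are the ones from the example above):

[code language="bash"]# overwrite one byte every PERIOD bytes, starting at OFFSET
OFFSET=2000; PERIOD=1378; VALUE=102
cp source.jpg glitched.jpg
SIZE=$(stat -c%s glitched.jpg)
for (( i=OFFSET; i<SIZE; i+=PERIOD )); do
  printf "$(printf '\\x%02x' $VALUE)" | \
    dd of=glitched.jpg bs=1 seek=$i count=1 conv=notrunc status=none
done[/code]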

This patch is very similar to Little-Scale's, with a few exceptions:

  • The offset cannot be set for each instance. This is by design as I felt it was a bit redundant.
  • You no longer need to copy the hex data to a new file in order to view the results
  • It'll work on any platform that can run a full version of Pure Data Extended. This should include the Raspberry Pi version, as GEM is not required.

Output

Although it was originally inspired by the JPG Killer, you can get some very interesting results if you use other file formats and set the period value to a number less than 20.

Pure Data File Killer - Bliss (sgi)

Pure Data File Killer - Bliss (jpg)

Pure Data File Killer - Bliss (pix)

Pure Data File Killer - Bliss (pix)