I create a lot of animated gifs and I’m getting into creating videos as well. All of these are initially created by rendering a series of png files into a folder called
frames. Over the last few years I’ve figured out some pretty good recipes for converting these frames into animated gifs and videos. So I’m sharing.
Both ImageMagick and ffmpeg are extremely complex programs with tons of parameters that allow you to do all kinds of image, video and audio manipulation. I’m not any kind of an expert on any of these. And this post is not meant as thorough documentation on these programs. It’s more of a cookbook of commands that handle specific use cases well. I’ll also try to explain what each of these parameters does, which might help you to come up with your own commands.
Nothing beats thoroughly reading the official docs, but if you just want to get something done quickly, these recipes should help you.
It confused me at first, but ImageMagick is not a single executable. It’s more of a suite of graphics utilities. Maybe under the hood it’s all one executable, but it presents as different commands you can call. For generating an animated gif, you’ll be using the convert command.
As I said, I keep my frames in a folder called
frames. They are named sequentially like so:
frame_0001.png
frame_0002.png
frame_0003.png
...
I use four digits, which gives me up to 10,000 frames. At 30 fps, that gives me about 333 seconds worth of video, or about five and a half minutes. Good enough for my needs now. If you go with three digits, you’ll only get 33 seconds worth. If I ever need to do more than five minutes in a single clip, five digits will give me almost an hour.
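If you want to double-check that math, it’s easy to work out (plain Python, just arithmetic, nothing tool-specific):

```python
# How many frames each digit width allows, and how long that runs at 30 fps.
FPS = 30
for digits in (3, 4, 5):
    frames = 10 ** digits  # e.g. 0000-9999 for four digits
    seconds = frames / FPS
    print(f"{digits} digits: {frames} frames, about {seconds / 60:.1f} minutes")
```

Which is where the 333 seconds and almost-an-hour numbers come from.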
convert uses simple wildcard matching, so my input is simply
frames/*.png and the output can be something like
out.gif.
Next you need to figure out the frame rate. I usually go with 30 fps. You don’t enter frame rate directly with
convert though. You need to specify a delay between each frame. And here’s where things get really weird. The unit you use for the delay parameter is hundredths of a second. So if you want 30fps, you’d specify 100 / 30 or 3.333… Not exactly intuitive, but you can get used to anything. Put it all together and you get:
convert -delay 3.333 frames/*.png out.gif
But there’s one more parameter I put in there, which is -layers Optimize.
convert has a whole ton of stuff you can do with it to affect how the gif is put together.
-layers Optimize combines a few of the most useful ones into a single command. I found that before I started using this, I’d get an occasional animation that just totally glitched out for a frame or two. Actually, it was more than occasional. Not every time, but often enough to be a problem. Using this parameter completely solved it, so I highly recommend it. If you want to read up more on it: https://legacy.imagemagick.org/Usage/anim_opt/#optimize
So the final command I use is:
convert -delay 3.333 -layers Optimize frames/*.png out.gif
This has been my bread and butter command for a long time now.
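Since the ticks conversion is easy to forget, here’s the delay arithmetic for some common frame rates (just plain Python, nothing ImageMagick-specific):

```python
# -delay is in ticks (hundredths of a second), so delay = 100 / fps.
for fps in (10, 24, 30, 60):
    print(f"{fps} fps -> -delay {100 / fps:.3f}")
```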
But lately, I’ve been running into problems using ImageMagick to create longer, larger format gifs. This hasn’t been an issue at all when I was just posting to Twitter, which has something like a 15mb size limit. But as I’ve been posting gifs in other places that can be up to 100mb, I’ve started making larger and longer animations. And ImageMagick has been crashing pretty regularly while generating these. I’ve watched a graph of memory use and I’m pretty sure that it’s just running out of memory. It uses up all my physical memory, then uses up all my swap, then crashes. On at least one occasion I got it to work by closing down every other program that was running on the computer. That freed up just enough memory to get through. But it doesn’t always work. So that’s… less than ideal.
ffmpeg for gifs
[Update] Before using the command below, read this post, which has to do with color palettes: More ffmpeg Tips
I’ve been using ffmpeg for a while now to create videos, and I knew it could create gifs as well, so I’ve recently given it a try and it seems to do just as good a job as ImageMagick. And it can handle those high res, long gifs as well. So I’m pretty happy so far. Creating gifs with ffmpeg is also pretty straightforward, except for the input filename specification, which I’ll cover next. Here’s the command I use:
ffmpeg -framerate 30 -i frames/frame_%04d.png out.gif
Breaking that down, the
-framerate param is just the number of frames per second. Simple.
-i is for input, and you point it to your sequential frames. And finally, the output filename.
The only complexity is the
frame_%04d.png part. This is a printf type of string formatting convention. The
%04d part means that you have four digits (4d), and they will be padded with 0s. So frame 37 will be frame_0037.png.
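Python happens to support the same printf-style formatting, so you can preview exactly what filenames the pattern will match:

```python
# %04d pads the number to four digits with leading zeros.
for n in (1, 37, 1234):
    print("frame_%04d.png" % n)
```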
As I said, this has been working really well for me so far, but I haven’t been using it very long, so if there are any issues, I might come upon them later.
Video creation is what ffmpeg is usually used for. It’s an amazingly powerful and complicated tool. Even just figuring out all of the possible parameters can be daunting. So I’ll go over some of the common use cases I have and what each parameter does.
Mostly I’ve been optimizing the videos I create for use on YouTube. This uses h264 encoding and a specific pixel format. Here’s the basic command I use for creating a video from a sequence of frames:
ffmpeg -framerate 30 -i frames/frame_%04d.png -s 1920x1080 -c:v libx264 \
-preset veryslow -crf 20 -pix_fmt yuv420p out.mp4
(That should be all one line. I wrapped it to make it more readable.)
Yeah, a lot to take in there. So let’s go through it param-by-param.
-framerate 30 is just the fps setting.
-i frames/frame_%04d.png specifies the input files as described earlier.
-s 1920x1080 sets the size of the video in the format of
width x height. I believe that if you leave this param off, it will create the video based on the size of the source images. But you can use this to scale up or down as well.
-c:v libx264 OK, this will take a little background. ffmpeg can be used for creating or processing both audio and video – or both at the same time. Some of the parameters can be applied to audio or video. In this case,
-c is letting you specify what codec to use for encoding. But is it a video codec or an audio codec? By adding
:v we’re specifying that it’s a video codec. Fairly obvious here because there is no audio involved in this case, but explicit is always good. I’ve seen examples where people use
-s:v to specify the size of the video, which seems overkill to me. But ffmpeg can also handle things like subtitles, and maybe those can be sized? Again, explicit is good. Anyway, here we are saying to use the libx264 codec for encoding the video. Which is good for Youtube.
-preset veryslow Here we have a tradeoff – speed vs size. These are the presets you can use:
- ultrafast
- superfast
- veryfast
- faster
- fast
- medium – default preset
- slow
- slower
- veryslow
- placebo – diminishing returns, generally not worth it
The faster you go, the larger your file size will be. It’s recommended that you use the slowest preset you have patience for.
-crf 20 is the constant rate factor. There are two ways of encoding h264 – constant rate factor (CRF) and average bit rate (ABR). Apparently CRF gives better quality generally. This parameter can go from 0 to 51, where 0 is lossless and 51 is crap. 23 is default and anything from around 17 to 28 is probably going to be generally acceptable. The CRF setting also affects file size. Lower CRF means a higher file size. So if you need the lowest possible file size, use a high CRF and a slow preset. It will take a long time and look like crap, but it will be small! More info on all this here: http://trac.ffmpeg.org/wiki/Encode/H.264
-pix_fmt yuv420p is the pixel format. You might be more used to RGB888, where you have red, green and blue channels, and 8 bits per each of those channels. YUV is just an alternate way of formatting colors and pixels. Mind-numbingly detailed description over at good old Wikipedia: https://en.wikipedia.org/wiki/YUV . But this is a good format for YouTube.
out.mp4 is the file that is created after applying everything else.
So there you go: my bread and butter video creation command.
But there are a few more tidbits I use:
I usually try to make my animated gifs loop smoothly. But for longer form videos this is not always what you are going for. Still, you usually don’t want the video to just get to the end and stop. A fade out to black is a nice ending effect. You could build that into your source animation code, or you could do it in post of course. But ffmpeg has fades built in. Here’s the altered command:
ffmpeg -framerate 30 -i frames/frame_%04d.png -s 1920x1080 -c:v libx264 \
-preset veryslow -crf 24 -pix_fmt yuv420p -vf "fade=t=out:st=55:d=5" out.mp4
Here, I’ve added the param -vf "fade=t=out:st=55:d=5".
-vf is for video filter, I believe. This takes a string in a very compact format. It starts by specifying the type of filter and its definition:
"fade=...". After the equals sign is a list of parameters in the format:
a=x:b=y:c=z. We’ll step through those:
t=out sets the type of fade, in this case a fade out.
st=55 tells the fade to start at 55 seconds into the video.
d=5 means the fade will last 5 seconds.
In most cases the
st and d parameters will add up to the total video length, but you can do all kinds of things here. Like fading out mid video and then fading back in later.
t=in would execute a fade in. You might want to do that at the start of your video.
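One gotcha is that you have to work out the start time yourself. Here’s a tiny hypothetical helper (plain Python, not part of ffmpeg) that builds the filter string for a fade out ending exactly at the end of the clip:

```python
def fade_out_filter(clip_seconds, fade_seconds):
    # st (start time) is the clip length minus the fade duration
    st = clip_seconds - fade_seconds
    return f"fade=t=out:st={st}:d={fade_seconds}"

print(fade_out_filter(60, 5))  # fade=t=out:st=55:d=5
```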
Although I have not had the need for it, I assume you can use
-af to create an audio fade.
One of the big benefits of video over animated gifs is the possibility of adding music or sound effects to the animation. As mentioned, ffmpeg can do that too.
I believe it’s possible to add the soundtrack to the video as you are creating it from the source frames, but this does not seem like a good workflow to me. I might generate dozens of videos before coming up with one I decide to go with. So it makes more sense to me to add the audio when I’m done. Here’s my command for adding audio to an existing video:
ffmpeg -i video.mp4 -i audio.wav -map 0:v -map 1:a -c:v copy output.mp4
Again, let’s go through each param.
-i video.mp4 is the video you’ve already created that you want to add audio to.
-i audio.wav is the sound file you want to add to the video. Yes, we have two inputs here. That’s how it works.
-map 0:v The map parameter tells ffmpeg which input is what. Here we’re saying the first input (0) is video (v).
-map 1:a As you can probably guess, this says that input 1 is audio.
-c:v copy As covered before, this is the video codec. In this case though, we don’t want to encode the video all over again. We just want to copy the video over and add the audio to it.
output.mp4 is the final combined audio/video file.
So that’s pretty straightforward and works great. But this kind of assumes that your video and audio are the same length. They might be, which is nice, but they might not be, which raises the question of how long to make the final video. Do you make it the length of the video or the length of the audio?
I believe by default, it will choose the longest option. So if your video is one minute long and you throw a five minute song on top of it, you’ll wind up with a five minute video. Honestly, I’ve never tried this so I don’t know what happens. I assume you get four minutes of blackness. Or maybe four minutes of the last frame of the source video. Or maybe it loops?
On the other hand, if your video is longer than your audio, you will probably wind up with some amount of silence after the audio finishes.
So the other option here is the -shortest flag:
ffmpeg -i video.mp4 -i audio.wav -map 0:v -map 1:a -c:v copy -shortest output.mp4
All this does is make the output video the same length as the shortest input.
I’m sure you can figure out all the different use cases here. And you probably want to think about fading your audio out along with a video fade so the sound/music doesn’t just suddenly stop.
Well it’s been a few days since I gave in and decided to check out the world of NFTs. It feels like a few months. Figured I’d just post some thoughts and observations.
What I thought
I think my biggest confusion when I started this whole thing was: why the hell are people buying NFTs?
Initially I thought there was some kind of confusion going on where buyers thought they were actually buying rights to the artwork in some way whereas in fact, buying an NFT by itself gives you no rights to the original work. But as I talked to people I discovered that while people generally understood this, nobody really cared. It’s not like people were buying NFTs in order to use them in their corporate marketing campaigns or something. They just wanted to “collect” the art.
This just caused more confusion for me. Surely there is no inherent value in buying them, I thought. They’re either buying them as investments or buying them to support artists they like.
While I assumed that people paying thousands of dollars (or even millions of dollars) for big name NFTs were doing it as investments of some sort, I didn’t believe that the average hic et nunc user was doing that.
So it had to be about supporting artists, which I thought was nice, but frustrating because I’ve had a donation link up for ages and I get $5 or $10 a couple of times a year. So why were people so happy to support artists via NFTs, but not directly?
I was wrong
Just about everything I thought was wrong.
- People do find (massive) inherent value in buying NFTs.
- People do use NFTs as short/mid-term investments.
- The whole supporting-the-artist thing exists, but it’s a pretty minor aspect.
Collecting
Yeah, people are crazy about collecting NFTs. Especially from a known artist. Honestly it’s been a bit of an ego trip because people seem to know who I am in this community (getting a shout out from my old friend Mario Klingemann does not hurt either). Feeling a bit of a taste of the (very low level) rock star vibes I had in the 00’s and early ’10s on the Flash conference speaking tour. It’s nice.
Anyway, yeah, everything I’ve put up on hic et nunc has sold out in minutes. I’ve just been experimenting with prices and amounts and it doesn’t matter, it just goes like that. But it goes beyond that. People are DMing me on twitter begging me to let them know the next time I mint something. Or asking if they can buy something from me directly. When people manage to buy something before it sells out, they’re over the moon about it, like they just won the lottery. Get that – they gave me money and they feel like they won something huge. People are HUGELY passionate about collecting stuff.
I’m still trying to wrap my head around it because it seems really, really bizarre to me. It makes absolutely no sense in my brain. But I’m trying to roll with it. It’s just all very surreal.
Investments
This is a way bigger part of the system than I thought. For my first NFT I minted 10 editions at 1 tez each. No idea what to expect. Within a couple of hours of them selling out, one was re-sold at 150 tez! There’s one of those still for sale at 3000 tez!!!
Yeah, so IF that sells (I can’t imagine it will), the seller will have turned 1 tez into 3000, roughly a 3000x return. Currently 1 tez is around $4 USD. You do the math.
Of course, once something gets into the realm of capitalism, all kinds of ugly stuff starts cropping up. I discovered there are bots people will set up to buy NFTs in bulk from popular artists at low cost and immediately resell them at a huge markup. This generates a bunch of anger in the community from individual collectors who are mainly just into collecting work from artists they like. The bots make it hard for them to get the pieces they are trying to collect.
tl;dr: it’s a complex and complicated space.
Supporting the Artists
Like I said, this is there for sure, but it’s not a huge part of it, from what I can see. I think the music industry is a good example. Yeah, there are lesser known indie groups who have the support of loyal fans hoping they’ll eventually make it big. But you wouldn’t say that most people buy music because they want to support the artist. They buy it because they want the music.
I have no idea where this is all headed. On one hand I feel like this can’t last. It’s like a gold rush. Pokemon cards, Beanie Baby stuff. The bubble is gonna burst some day. On the other hand, this probably will pave the way for the future of how art is bought and sold… maybe. I have no clue. I guess I’m just going to surf this wave for a while. And not quit my day job just yet.
OK, y’all wore me down. After multiple messages per week from friends and strangers urging me to put my work on hic et nunc, and many long, drawn out debates with people I respect, I finally gave in and created an NFT.
At some level I feel like a sellout. On the other hand, I was feeling way too stubborn and dogmatic about my resistance to try it. It was a serious internal struggle for the past few months.
Long story short, people seem to want to give money to digital artists via NFTs. Lots of money. But not any other way. I still have misgivings, but I’m going to assume that people know what they are doing and if they want to support me in this way, I’ll accept it.
So here’s my very first NFT:
Make it rain, fans. 🙂
I’m starting a newsletter. I’ve been hearing more and more buzz about newsletters these days. I always assumed these were just cheap marketing gimmicks. “Sign up for my newsletter so I can spam you with info about this thing I’m trying to sell.” But apparently it’s become the replacement for RSS for a lot of people. Who knew? Probably everyone but me.
Anyway, I’m going to start one. For the most part it’s going to be quick summaries and links to stuff that I post right here, with maybe one or two other links or items of interest. I thought about doing more complete articles in the newsletter, but I’d rather keep my content centralized here, so summaries and links it is.
For now, it’s totally free. If it takes off and people seem interested, I might experiment with doing some kind of additional paid content. Undecided at this point. One step at a time. Anyway, sign up here:
First issue should go out Wednesday morning, 8/18/21.
I’m a firm believer that creative coding is a very different activity than the kind of coding that most people do for their day jobs. In feature / application / systems programming, there’s usually a fair amount of planning, scoping and architecting that comes before you start coding. Ideally, when you start coding, you have a pretty good idea of what you are going to make and how you’re going to make it.
But creative coding, for me at least, starts with “what would happen if I did this…?” and generally follows that line of logic all the way until I have something published.
So I’m always going to this value or that, deleting “true” and typing “false” or vice versa. Or, I’m using a sine function and want to see what happens if I use cosine. Or tangent. Or I want to swap greater than with less than and see what that does. Or change plus to minus or minus to plus.
After way too many times doing this type of thing, I figured there must be some plugins that would make this easier. And so there are! I use vim, and found several. I tried a few and had almost settled on vim-toggle and then checked out switch.vim, which I found to be the most powerful of the lot.
I recognize that most people probably use VS Code or Sublime Text or other editors. A quick search informed me that there are similar plugins for those editors as well (all links below). I haven’t tried the non-vim ones, so I can’t vouch for their quality, only their existence. There may be better ones, so do your research.
The way these work is you put your cursor on the word or symbol you want to change and hit some keyboard shortcut. For switch.vim, that’s
gs. Each time you hit that shortcut the word will toggle back and forth between, say, “true” and “false”. Or “on” and “off” or “1” and “0”.
Most of the plugins come with several obvious definitions pre-defined. But make sure you find one that will also allow you to set up custom toggle sets. Some of the plugins only support binary switching between two options, but some, like switch.vim, will let you specify a list of as many items as you want. It will cycle through all of those options when you hit the shortcut key over any one of them. Here’s just a few of the custom toggle sets I set up:
["cos", "sin", "tan"]
And there are more niche ones I use in my day to day creative coding using cairographics bindings for Golang (blgo). Sometimes I’ll be making a piece where the background is white and the foreground is black. I use a function,
ClearWhite, to clear the surface to white, and set the drawing color to black using
SetSourceRGB(0, 0, 0). But occasionally I want to invert these to use a black background with white shapes. So I set up some toggles like so:
["0, 0, 0", "1, 1, 1"]
This lets me easily make the change in just a few key strokes.
switch.vim even allows you to set up toggling rules using regex, which seems super powerful, though I haven’t dug into that so far.
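For reference, here’s roughly what those custom sets look like in a vimrc, using switch.vim’s g:switch_custom_definitions option (a sketch; check the plugin’s README for the exact syntax):

```vim
" Custom toggle sets for switch.vim
let g:switch_custom_definitions = [
  \ ['cos', 'sin', 'tan'],
  \ ['0, 0, 0', '1, 1, 1'],
  \ ]
```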
So far, I’ve found these immensely useful. Maybe you will too. Here are the links:
Vim / Neovim:
Sublime Text 3:
I’ve come to be somewhat known as a “math guy” in creative coding. It’s one of my impostor syndrome items because I’m really not any kind of expert in the field. I took Algebra and some Precalculus in high school, and banged my head against a formal Calculus course, but never made it through. Most of what I know has been self-taught and pretty seat-of-the-pants.
Here’s a tutorial on a really neat and powerful math art technique that I first came across years ago in the book, Computers, Pattern, Chaos and Beauty by Clifford Pickover. It’s described in Chapter 14, entitled Dynamical Systems. More recently I’ve seen it described as “Popcorn”. I’m not sure where that name came from, but there’s a good chance it originated with Paul Bourke, who used the term back in 1991 in this article.
The technique is based on some very simple formulas – just using a couple of trig functions (sine and tangent in this example) to transform an x, y point repeatedly. But it creates some amazingly intricate, complex and beautiful images. It’s also open to a nearly infinite amount of hacking by changing the few constants used or swapping out which trig functions you use or how you compose them.
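To give a flavor of what that iteration looks like, here’s a minimal sketch in Python. The 0.05 step size and the 3x inner multiplier are conventional constants (variations of this form appear in Bourke’s article); treat everything here as a starting point to hack on:

```python
import math

def popcorn_step(x, y, h=0.05):
    # One application of the transform: sine of (coordinate + tangent of 3 * coordinate)
    new_x = x - h * math.sin(y + math.tan(3 * y))
    new_y = y - h * math.sin(x + math.tan(3 * x))
    return new_x, new_y

# A real render iterates thousands of starting points and plots every step;
# here we just trace a single point for a few iterations.
x, y = 0.1, 0.2
for _ in range(5):
    x, y = popcorn_step(x, y)
    print(f"({x:.4f}, {y:.4f})")
```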
Almost exactly two years ago I bought my last phone, a Pixel 3XL. The phone I had before that was a Samsung Galaxy S8. It was two years old and was in good condition, but I didn’t like it that much. All the extra Samsung garbage was not to my taste. I like stock Android or as close as I can get.
After two full years, I was totally happy with the Pixel. I had no plans on upgrading or changing until I had to. And then… I had to. The first sign was that the volume buttons weren’t working. I have one of those rubber bumper cases. When I took it off, the volume worked fine. But not with the case on. The case seemed fine, and then I took a closer look at the phone beside the volume keys.
Here you can see the back cover has separated from the phone. The whole back plate was swelling out. That’s why the volume wasn’t working. The buttons on the case were no longer aligned with the buttons on the phone.
I had noticed that it was seeming to get pretty hot when I wirelessly charged it but hadn’t thought too much about it. Obviously the battery was on its last legs and getting ready for some kind of catastrophe. I kept an eye on the phone the rest of the day and started looking for a new device.
I was really pretty bummed out about this because I didn’t actually WANT a new phone. And I had to get one quick and didn’t have a chance to do a bunch of research. I was curious about the Oneplus line. Their flagships go for $600 – 900 or even higher. I didn’t want to spend that much when I wasn’t really sure what I wanted. Maybe I wanted a Pixel 6 when it came out. What to do? What to do???
I finally wandered across the Oneplus Nord N10 5G. It was under $300 but decently specced for that price. I watched a few Youtube reviews and while nobody was raving about how great it was, the consensus was that it was a pretty good phone for its price. I crossed my fingers and ordered it with next day delivery.
In the meantime I wondered if I could possibly replace the Pixel’s battery. Quick search revealed a few Youtube videos that made the process seem not too formidable, and a number of under $20 replacement kits. Worth a shot, right?
The repair kit came the same time as the new phone. I set up the Oneplus, got my sim card in it and all my apps. It looked and felt pretty nice. No regrets. Then onto the battery repair.
The toughest part was getting the back off. You need a heat gun (which I have, luckily) and a lot of time and patience. You apply heat to the back of the phone judiciously so as to not damage it. This softens up the glue, then you pry the crap out of the back of the phone. The kit had tools and a suction cup. It took a good half hour of heating and prying, heating and prying – and I had a head start since the battery had already started the job – but eventually I got the back off.
Then you have to pry the battery out. It’s also glued in. That was a bit easier, but not… easy.
Finally, the recharging coil is just like a piece of thick paper with the coil inside, glued to the battery. You have to carefully pry that off. If you’ve ever tried to peel a glued-on paper label off of something, you can imagine how that went. I got all of the coil and about half of the paper backing, but I was pretty sure I had wrecked it.
Then you put it all back together. The kit also came with glue strips to put the battery back in with. Stuck the coil back on the new battery, plugged everything in. Cleaned up all the old glue. Crossed my fingers and turned it on. It worked fine. Put it on the charger. It charged right up. Didn’t even get hot. And it held its charge really well.
The last thing I needed was some glue to put the back on again. I ordered that and finished up the next day. In the meantime the phone kept working well, holding a charge and charging with no issues. Now I had some options:
- Keep using the Oneplus and keep the Pixel as a backup.
- Go back to the Pixel and return the Oneplus.
- Go back to the Pixel and keep the Oneplus as a backup.
I had a good 2-3 days in on the Oneplus, which gave me a good idea of how much I liked it. In general, I did like it. I concur with all the reviews – it’s a great value for its price. But there is no doubt that the Pixel is way better. Some details:
- Performance. Pixel wins hands down. Opening apps takes probably 1.5-2x longer on the Oneplus. Random scrolling around is obviously way smoother on the Pixel. But this was really only noticeable on a side-by-side comparison. I could have lived with the Oneplus’s performance easily.
- The Oneplus screen pales in comparison to the Pixel… LITERALLY! (sorry) Not surprising. The Oneplus is an LCD whereas the Pixel has OLED. Again though, wouldn’t be a deal breaker for me.
- Bluetooth performance was not good. I use Galaxy Buds Plus and love them. They have been virtually 100% flawless on the Pixel. On the Oneplus, I had various issues:
- Garbled sound. I’ve had that on cheaper BT earbuds, but never on the Galaxy Buds. I was getting it regularly every time I used them on the Oneplus.
- Unresponsive controls. The tap to start / stop failed multiple times. Never recall it failing while on the Pixel.
- Connection. I think it failed to auto-connect once in the couple of days I used it. I don’t recall it ever having a problem on the Pixel.
- Touch responsiveness. Very noticeable on one of the puzzle games I was playing. Tapping on on-screen items would fail close to 50% of the time, requiring multiple taps. Never experienced it on the Pixel, and when I retried the same game on the Pixel, it was night and day.
To be fair, those are the only negative performance points I could come up with on the Oneplus. I would add that Oneplus has started creating its own UI stuff: a customized settings app, a custom launcher, a bunch of preinstalled Oneplus apps. I was under the impression that Oneplus was close to stock, so this was a bit disappointing. Not as bad as Samsung, but not a plus.
But overall, not bad. The Bluetooth and touch screen stuff were the only points that really pushed me over the edge to going back to the Pixel.
I am really happy to be back to the Pixel though and have a renewed appreciation for what a good phone it is. As I said, I didn’t want to switch phones to begin with and I’m happy that I don’t have to.
I’ve decided to keep the Oneplus though as a backup. I don’t know how long my Pixel surgery is going to hold up. So far it’s flawless, but who knows what the next few weeks or months hold. If the Pixel does crap out on me, I’ll have something to switch over to instantly. Maybe I can last long enough to see how the Pixel 6 does and maybe even long enough to see it come down a bit in price from its initial release.
If you’ve been following along with #awegif2021 on twitter, you’ve seen me post a few animated gifs that look like the image above (days 1-6 specifically). These are known as Chladni figures, named after Ernst Chladni, who studied the patterns that form when you use soundwaves to activate sand or powder on a flat surface. You can find a ton of videos on Youtube that feature real world examples of this. But it’s also fun to do in code.