ffmpeg shadertoy filter
| June 28th, 2025

Some years ago I made an ffmpeg filter for using shadertoys in a libre software video pipeline. I use it extensively in my video essays and my video-generation-from-text software, and given that ffmpeg’s API changes from time to time, once a year or so I do some maintenance on my filter’s code so it stays usable with newer ffmpeg versions. Today is one of those days, and just now I realized I never wrote a blog post about it. So, better late than never: here’s the blog post about the filter.
Most of the details about building it and how to use it are already in the repo’s README file, so I’m not going to cover that here: just know you gotta build ffmpeg yourself, adding my code and some extra OpenGL libs.
What the README doesn’t give you are proper examples of what you can do, which is the juicy part. Consider for example this command:
./ffmpeg -hide_banner -y -f lavfi -i "testsrc=1920x1080:r=30,format=yuv420p" \
-map 0 -vf "shadertoy=shadertoy_file=downloaded_shadertoy_code.glsl:start=5:duration=15" \
-c:v h264 -preset slow -b:v 8M -t 30 test_shadertoy_01.mp4
Which renders the following 30-second video:
That shadertoy in the example is “Protean Clouds”, by nimitz. You can just select all the shadertoy code, copy it, and save it in a “downloaded_shadertoy_code.glsl” file. That’s it: that’s all it takes to use a shadertoy with this filter.
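If you’re curious what such a file contains, it’s just the shader editor’s contents verbatim: a mainImage entry point plus whatever helper functions the author wrote. As a trivial illustration, here’s shadertoy’s own default new-shader template (not nimitz’s code), which also works with the filter:

```glsl
// Minimal "render shadertoy": paints an animated color gradient.
// iTime and iResolution are uniforms provided by the shadertoy
// environment (and by the filter's hardcoded wrapper).
void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    // Normalized pixel coordinates (from 0 to 1)
    vec2 uv = fragCoord / iResolution.xy;

    // Time-varying pixel color
    vec3 col = 0.5 + 0.5 * cos(iTime + uv.xyx + vec3(0.0, 2.0, 4.0));

    fragColor = vec4(col, 1.0);
}
```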
That particular shadertoy is one of the many that render some cool animation. If you haven’t yet, please do browse shadertoy’s site: you’ll find absolutely amazing work from many extremely talented people all around the world. There you’ll see lots and lots of “render shadertoys” (my terminology), and using my filter you can render them yourself as a video file or even as a stream. I use that kind of shadertoy as backgrounds for my videos. And as a side note, take a look at the previous command and you’ll see it adds the “start” and “duration” parameters to the filtergraph definition; that’s why the render starts at second 5 and ends at second 20, while for the rest of the time the output simply shows the input video (the test screen instead of the clouds).
However, there are other kinds of shadertoys around, whose role is to apply some filtering to their input video. I call those “filter shadertoys”, just to distinguish them from the ones that simply replace the input video with a totally different render (but please note all of them are “shadertoys”, and the procedure for using them is the same). Here are some examples of “filter shadertoys” applied to the very same test screen used before:
./ffmpeg -hide_banner -y -f lavfi -i "testsrc=1920x1080:r=30,format=yuv420p" \
-map 0 -vf "shadertoy=shadertoy_file=lsfXzM.glsl" \
-c:v h264 -preset slow -b:v 8M -t 30 test_shadertoy_02.mp4
./ffmpeg -hide_banner -y -f lavfi -i "testsrc=1920x1080:r=30,format=yuv420p" \
-map 0 -vf "shadertoy=shadertoy_file=Wdj3zV.glsl" \
-c:v h264 -preset slow -b:v 8M -t 30 test_shadertoy_03.mp4
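In case you’re wondering what a “filter shadertoy” looks like inside, here’s a trivial sketch of my own (not one of the shaders used above) that just inverts the input video’s colors, assuming the filter exposes the input frame as iChannel0 the way the website does:

```glsl
// Minimal "filter shadertoy": inverts the colors of the input video.
// Each input frame is made available to the shader as iChannel0.
void mainImage(out vec4 fragColor, in vec2 fragCoord)
{
    vec2 uv = fragCoord / iResolution.xy;   // normalized coordinates
    vec4 src = texture(iChannel0, uv);      // sample the input frame
    fragColor = vec4(1.0 - src.rgb, src.a); // invert RGB, keep alpha
}
```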
As you can see, these other shadertoys are a different beast compared to complete renders, but that doesn’t make them any less cool. These ones are more appropriate for use as “filters” in the sense that ffmpeg video filters are supposed to work: they do stuff to your input video instead of creating a whole render from scratch. Another cool thing about ffmpeg filters is that you can apply a complex chain of filter combinations, so you can mix multiple shadertoys in a single filtergraph. Like this:
./ffmpeg -hide_banner -y -f lavfi -i "testsrc=1920x1080:r=30,format=yuv420p" \
-map 0 -vf "shadertoy=shadertoy_file=3l23Rh.glsl,shadertoy=shadertoy_file=Wdj3zV.glsl" \
-c:v h264 -preset slow -b:v 8M -t 30 test_shadertoy_04.mp4
That one mixes “Protean Clouds” with “Old film”, all in the same command. You can quickly see that combining this with the huge amount of shadertoys available is actually pretty powerful.
Now, it’s not all roses: many shadertoys (most of the cooler ones) are very complex, multi-step procedures, and require precise inputs in order for you to get the output you see on shadertoy’s site. My ffmpeg filter just wraps code taken from the shadertoy site with some string hardcoding in C and some library calls: just the minimum needed to load a glsl file containing single-step, video-only, single-input shadertoy code. That means multi-step shadertoys will not work, multiple-input shadertoys will also not work, and there’s no support for audio-based shadertoys. Some single-step shadertoys don’t work either, and in all of those cases you’ll have to get your hands dirty with the shadertoy’s code in order to get results. The first time you do that can be very frustrating, especially if you understand nothing about shaders or the math involved (it looks like black magic even to the people who kinda get it). But if you persevere, and take a look at how shadertoys work, you can apply some modifications to make them work with my ffmpeg filter. There are also tricks, like concatenating the code from a multi-step shadertoy into a single-step procedure, or even saving every step as its own glsl file and then chaining multiple calls to the ffmpeg filter in a single filtergraph (like I did back there combining “Protean Clouds” with “Old film”). But be aware it can be a laborious task, and there’s no guarantee you’ll get the output you want.
It’s also pertinent to note that shadertoys are of course shaders, but the website implements some “standard” base code that simplifies and normalizes their distribution, setup, and display (through the web site). Part of that code is what I wrap (hardcode) inside the ffmpeg filter’s C code, and that’s how you can simply download the code and use it. So, even though these are actually shaders in the end, what you’re seeing is “shadertoy code” and not “shader code”. Think of shadertoys as a subset of shaders if you want. Their code may likewise be a subset of glsl, but it isn’t strictly “standard glsl”: it’s shadertoy code. Another pertinent note: there is of course other software around that enables you to render shadertoys outside shadertoy’s website. Search the web and you’ll quickly find some examples. My goal was to mix the power of ffmpeg with the power of shadertoys, not just to render them.
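To give a rough idea of what that hardcoded wrapping does, conceptually the filter prepends something like the following boilerplate to your downloaded file before compiling it as a plain GLSL fragment shader. This is a simplified sketch of the idea, not the filter’s literal code (the real wrapper declares more uniforms and details than shown here):

```glsl
// Sketch of the "standard" base code the website (and the filter's
// C wrapper) provides around your shadertoy code.
uniform vec3      iResolution;  // viewport resolution (pixels)
uniform float     iTime;        // playback time (seconds)
uniform sampler2D iChannel0;    // input video frame

// ... your downloaded shadertoy code gets inserted here ...

void main(void)
{
    // The wrapper is what actually calls your mainImage entry point.
    mainImage(gl_FragColor, gl_FragCoord.xy);
}
```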
And as a side note, I’ve mentioned before that I use this filter in my videos (for applying filtered transitions, filtered animations, and even rendered animated backgrounds), but I had a funny extra use case for this tech: by mounting a fake webcam using ffmpeg, I was able to apply shadertoys to my webcam input and use them in online video calls. Here’s an example using adapted code from a sketchbook shadertoy, getting some A-ha “Take On Me” vibes:
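For the curious, the fake webcam trick boils down to loading the v4l2loopback kernel module and pointing ffmpeg’s output at the loopback device. The device paths and the glsl file name below are just placeholders for my setup; yours will likely differ:

```shell
# Create a loopback video device (it shows up as e.g. /dev/video2)
sudo modprobe v4l2loopback

# Read the real webcam, run it through a shadertoy, write to the fake one
./ffmpeg -f v4l2 -i /dev/video0 \
  -vf "shadertoy=shadertoy_file=sketchbook_adapted.glsl" \
  -f v4l2 /dev/video2
```

Then in the video call app you just pick the loopback device as your camera.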