The G'MIC image has three spatial dimensions; it is mostly out of habit and custom that many people (this writer included) generally fuhgeddaboudit and only use single-slice, two dimensional images. That is a shame, as many G'MIC commands do their bidding in three dimensions as well as two. They are untapped sources of cheap (in terms of rendering time) animation.
The diagram on the left illustrates how the -blur command behaves when an image has more than one slice. An orange pixel affects neighbors along the z axis (depth) as well as along the x and y axes. Play these slices as an animation and the blurred pixel becomes a kind of a pop! explosion.
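To see that z-axis leakage concretely, here is a minimal numpy sketch (not G'MIC itself) in which a 3x3x3 box blur stands in for -blur's Gaussian weighting; the single bright voxel bleeds into the slices before and after it:

```python
import numpy as np

# A tiny 3-D volume (depth, height, width) with one bright voxel at the center.
vol = np.zeros((5, 5, 5))
vol[2, 2, 2] = 1.0

# One pass of a 3x3x3 box blur: average each voxel with its neighbors
# along z as well as along x and y (what -blur does, Gaussian-weighted).
blurred = np.zeros_like(vol)
for dz in (-1, 0, 1):
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            blurred += np.roll(vol, (dz, dy, dx), axis=(0, 1, 2))
blurred /= 27.0

# Energy has leaked into the adjacent slices: played as an animation,
# the pixel pops into existence, swells, and fades away.
print(blurred[1, 2, 2] > 0, blurred[3, 2, 2] > 0)
```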
The -bandpass command also operates in three dimensions. Elsewhere, we've harnessed this command for spectral filtering, eliding all but a subset of frequencies in the original image. Taking an image with such truncated spectral content back into the spatial domain gives rise to a set of sine waves that do not quite reconstitute the original; we see patterns of constructive reinforcement and destructive interference that would not have been especially noticeable had we not erased those parts of the spectrum needed to reconstitute the original.
The key idea in this Cookbook piece is this: -bandpass, like many G'MIC commands, works in three dimensions. And when we think Animation, that extra dimension can be time. The following example happens to use the -bandpass command, but the techniques presented here aren't married to that command. They can be generalized to any other command which operates 'in the round' over slices as well as over the width and height of images.
Here is a pipeline that produces a kind of cauldron effect. The nice thing about clips produced by this pipeline is that they can be seamlessly looped without a 'pop' when crossing from the end to the beginning of the loop. Details follow.
First, we conjure from the aether a contiguous volume 320 pixels wide, 240 pixels high and nine hundred slices deep — at 25 frames per second, sufficient for a thirty-six second video clip. We salt-and-pepper it with a fairly sparse pattern of noise pixels. If you scrub through the image volume at this point in the pipeline (insert -display after -noise), the animation has the appearance of dirty film. To this sparse noise we apply a low-pass spectral filter, using the -bandpass command. The lower relative cutoff frequency is 0.005; the upper relative cutoff frequency is 0.02.
Behind the scenes, the -bandpass command harnesses the Fast Fourier Transform, embodied in the -fft command, converting the three dimensional array of pixels into a similar array of spectral coefficients, this through a process of Fourier analysis. It then deletes all but a hollow spherical shell of coefficients centered on the spectral space origin. The lower and upper relative cutoff frequencies set the inner and outer radii of this shell and establish the overall character of the animation. By 'cutoff,' we mean that we set coefficients beyond these limits to zero and preserve coefficients within.
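The shell-deletion step can be sketched in a few lines of numpy — an illustration of the same idea, not G'MIC's actual implementation. The volume size and the 0.05–0.15 cutoffs here are scaled up for the tiny example; the article's 0.005/0.02 shell would be empty at this resolution:

```python
import numpy as np

# A small sparse-noise volume standing in for the 320x240x900 one.
rng = np.random.default_rng(0)
vol = (rng.random((32, 32, 32)) < 0.02).astype(float)

# Fourier analysis: the 3-D spectrum of the volume.
spec = np.fft.fftn(vol)

# Relative frequency of each coefficient, measured as the distance
# from the spectral-space origin (with wrap-around).
fz, fy, fx = np.meshgrid(*[np.fft.fftfreq(n) for n in vol.shape],
                         indexing='ij')
radius = np.sqrt(fz**2 + fy**2 + fx**2)

# Keep only a hollow spherical shell of coefficients; zero the rest.
lo, hi = 0.05, 0.15
spec[(radius < lo) | (radius > hi)] = 0.0

# Fourier synthesis: back to a (real) spatial volume of smooth blobs.
smooth = np.fft.ifftn(spec).real
```

Because the mask is symmetric about the origin, the surviving spectrum stays conjugate-symmetric and the synthesized volume is real up to floating-point dust.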
Here, we picked low relative frequencies, producing only very large features that seem quite blurry, the characteristic look of a low-pass filter. Larger relative frequencies would give rise to high-pass filters, admitting smaller, sharper features, while the difference between the two numbers controls the amount of variation in feature size. Experimenting with these numbers gives rise to a great deal of variety in texture. The -noise command also affects the texture of the pattern, particularly for larger cutoff frequencies.
With a much-elided set of spectral coefficients, the -bandpass command invokes the -ifft command, partially reconstituting the spatial image through the inverse process of Fourier synthesis.
At the end of the command pipeline, housekeeping prevails. The reconstituted image generally will not be in a range suitable for display; -normalize shifts all intensities into the 0–255 range, suitable for eight-bit color animation. We use the -split command in preparation for -output, a command which can create a video but which needs a sequence of single-slice images. By way of particular file extensions, you can specify one of five multimedia containers: QuickTime (.mov), Moving Picture Experts Group (.mpeg), Flash (.flv), Ogg (.ogg) or Audio Video Interleave (.avi). This pipeline harnesses the QuickTime container.
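The two housekeeping steps are simple enough to mimic in numpy (a sketch of what -normalize 0,255 and -split z do, not G'MIC's own code; the volume here is a random stand-in):

```python
import numpy as np

# A stand-in for the reconstituted volume: values in an arbitrary range.
rng = np.random.default_rng(1)
vol = rng.normal(size=(8, 16, 16))            # (depth, height, width)

# What -normalize 0,255 does: affinely map min..max onto 0..255.
lo, hi = vol.min(), vol.max()
vol8 = ((vol - lo) / (hi - lo) * 255).astype(np.uint8)

# What -split z does: burst the volume into single-slice frames,
# the form -output needs before it can write a video.
frames = [vol8[z] for z in range(vol8.shape[0])]
```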
The multimedia file produced by the -output command runs at 25 frames per second and contains an MPEG-2 video stream, which can be played on a wide range of hardware. If you need to tailor the multimedia stream in more precise ways, save the stream as an image sequence instead. Choose a lossless image file format such as Portable Network Graphics (.png). The resulting sequence of numbered files may then be imported via a .toc file ("table of contents" file) into a video editor like Cinelerra.
The video clip on the left is a product of the previous G'MIC command pipeline. It reminds us of percolating liquid in a cauldron. Poetic interpretations aside, we are observing the low frequency components of sparsely scattered noise, a three dimensional interference pattern which our "camera" depicts two dimensionally, displaying slices through time. The particular curdling textures depend very much on the choice of the lower and upper relative cutoff frequencies that we gave to the -bandpass command. Larger relative frequencies give rise to finer, higher frequency detail: the blobs are smaller and seem to curdle faster. A larger difference between the relative frequencies admits a wider range of blob sizes. The cutoff frequencies can also be set too close together: if the difference between them becomes too small, few or no spectral coefficients survive, and the resulting images may be all black.
Because this sequence is a product of Fourier analysis and synthesis, it can be seamlessly tiled in the x, y, and z dimensions. Not only do the top and bottom and left and right edges match, but the end of the animation segues into the beginning, so this material can be used out of the box in an endless loop without having to worry about a 'pop' as the animation loops around. This characteristic stems entirely from the -bandpass command and the underlying Fourier transform commands, -fft and -ifft.
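Why the loop is seamless can be seen in one dimension. Treat a single pixel's trace through time as a signal and band-limit it with the same mask-the-spectrum trick: the inverse FFT returns one period of a periodic function, so the step across the wrap obeys exactly the same smoothness bound as every interior step. A numpy sketch, with illustrative cutoffs:

```python
import numpy as np

# One pixel's trace through time, band-limited the -bandpass way.
rng = np.random.default_rng(2)
sig = rng.normal(size=256)
spec = np.fft.fft(sig)
f = np.abs(np.fft.fftfreq(256))
spec[(f < 0.01) | (f > 0.05)] = 0.0          # illustrative cutoffs
loop = np.fft.ifft(spec).real

# Step sizes between consecutive frames, INCLUDING the step from the
# last frame back around to the first: no 'pop' at the wrap.
steps = np.abs(np.diff(np.concatenate([loop, loop[:1]])))
wrap_step = steps[-1]
```

Every step, the wrap included, is bounded by (2/N) * sum(|c_k| * |sin(pi f_k)|) over the surviving coefficients, which is why the end of the clip segues into the beginning.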
Nearly everything we do with G'MIC is not an end in itself, but furnishes stepping stones to other effects. The cauldron animation can furnish displacement fields to distort other visuals, and the cauldron visual need not even appear itself.
This is a title image made in Inkscape and exported as a PNG file. We matched its dimensions to the cauldron clip shown in the previous section, 320x240 pixels. Our aim is to composite this title with another video, so we have included an alpha channel in the image file; the base video will be visible through the transparent regions of the title image.
For this example, we will composite this title with the cauldron clip, harnessing the clip also to furnish frame-by-frame displacement fields. Our particular approach is based on G'MIC's -warp command, which uses channel zero of a displacement field image as an image shift along the horizontal axis and channel one as an image shift along the vertical axis. We'll extract these channels from each frame of the cauldron clip and apply them to the title image on the right. Since it would be tedious to do this manually for each of the 900 frames in the cauldron clip, we will avail ourselves of G'MIC's loop commands, -repeat <n> ... -done.
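A relative warp of this sort can be sketched in numpy. This is an illustration of the idea as the text describes it (fetch each output pixel from (x + dx, y + dy)), simplified to nearest-neighbour sampling with wrap-around boundaries; it is not a reimplementation of -warp's interpolation modes:

```python
import numpy as np

# A tiny stand-in for the title image, and a two-channel displacement
# field: channel 0 shifts along x, channel 1 along y.
title = np.arange(25.0).reshape(5, 5)
field = np.zeros((5, 5, 2))
field[..., 0] = 1.0                  # push one pixel along x everywhere
field[..., 1] = 0.0                  # no shift along y

# Relative warp: each output pixel is fetched from (x + dx, y + dy),
# nearest-neighbour, wrapping at the borders.
ys, xs = np.indices(title.shape)
src_x = (xs + np.rint(field[..., 0]).astype(int)) % title.shape[1]
src_y = (ys + np.rint(field[..., 1]).astype(int)) % title.shape[0]
warped = title[src_y, src_x]
```

With this uniform one-pixel field, the result is simply the title slid one pixel leftward; the cauldron frames supply a field that varies from pixel to pixel, so the title ripples instead.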
This is the pipeline of commands that creates an animated warp of the title image. A walk-through of this pipeline follows the table:
We input our cauldron clip, making a large image stack. (-input cauldron.mov...)
G'MIC's -repeat ... -done construct permits us to iterate a pipeline of commands over the image stack. The notation '$!' is one of G'MIC's substitution sequences; this particular one is replaced with the number of images in the stack. With this substitution, we need not know beforehand how many frames a video clip has.
G'MIC applies the mini-pipeline of commands between the -repeat ... -done markers once for each image in the stack. Here, our overall mode of operation is to derive a displacement field from whatever image happens to be at the beginning of the stack and use it to distort a copy of the title, placing the warped title at the end of the image stack. Finally, we remove the first image from the stack and repeat. As we proceed, we consume frames from the clip but leave the corresponding warped title slides at the end of the stack, preserving the original order of the clip; in this way, the original animation shapes the warping of the title image.
In each repetition, we split the current frame into its red, green and blue channels, then delete the blue channel, as we only need two channels to make a displacement field. (...--split c -rm[-1]...)
We compose the displacement field using the -append command and employ -normalize as a warping control: increasing its second parameter magnifies the displacement field and hence the magnitude of the warp. Set it to suit your taste; we think 30 is about right for a really dramatic warp. We insert a fresh, unwarped copy of the title image at the end of the stack and invoke a relative warp, using the displacement field, now in the penultimate slot on the stack, to control it. (...-normalize[-1] 0,30 -input cauldron_title.png -warp[-1] [-2],1,1...)
The last two commands in the mini-pipeline perform housekeeping. We first remove the displacement field and then the first image in the stack, as both have played their part in warping the corresponding title image and are no longer needed. (-rm[-2] -rm[0])
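The stack bookkeeping inside -repeat ... -done can be mimicked with an ordinary Python list. The per-frame "warp" below is a hypothetical stand-in (any per-frame transform would do); only the consume-from-the-front, append-to-the-back pattern matters:

```python
# 'frames' stands in for the decoded clip, 'title' for the PNG image.
frames = [10, 20, 30]                # hypothetical per-frame data
title = 100

stack = list(frames)                 # the image stack after -input
n = len(stack)                       # what the $! substitution yields
for _ in range(n):                   # -repeat $! ... -done
    field = stack[0]                 # field derived from the front frame
    stack.append(title + field)      # warped title lands at the end
    stack.pop(0)                     # remove the consumed frame

print(stack)                         # -> [110, 120, 130], clip order kept
```

After n repetitions the clip's frames are gone and only the warped title slides remain, in the same order as the frames that shaped them.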
Leaving the repeat loop, we have a sequence of warped title slides, ordered in the same manner as the frames of the original animation to which they are related. Here, we write to a QuickTime container, but we could just as well write a sequence of PNG files.
We used Cinelerra to composite the title sequence onto the original cauldron clip.