Hydra is an analog-synth-like coding environment for real-time visuals, created by Olivia Jack. It is open-source, so you can either open the browser version or clone the repository and serve it on your own computer. There are a few resources:
- The official documentation is a good place to get started,
- the Function list covers all the functions available in Hydra,
- and Hydra Patterns on Twitter is a way to get inspiration from other artists.
This article is a work-in-progress online book to collect Hydra snippets. The goal is not only to accumulate frequently-used techniques to make coding easier but also to research the theory of Hydra to discover new images.
Table of Contents
In this chapter, we discuss textures, or patterns, separately from colors and movements. Most of the snippets have low saturation and no movement in order to isolate texture from the other effects.
osc() is one of the basic sources for creating a texture. The first argument determines the frequency (i.e., how densely packed the stripes are), the second the sync (i.e., the scroll speed), and the third the offset, which adds color to the pattern.
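A minimal example to try in the Hydra editor (the argument values here are my own choices):

```javascript
// stripes with frequency 40, no scrolling; offset 1 adds color
osc(40, 0, 1).out(o0)
```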
With posterize(), the oscillator pattern becomes clear stripes. pixelate() can achieve a similar effect; however, with a nonzero sync parameter, the movement appears different.
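A sketch of both effects, with assumed argument values (run in the Hydra editor):

```javascript
// quantize the oscillator into 3 clear bands
osc(20, 0.1, 0).posterize(3, 1).out(o0)
// pixelate gives similar bands, but a scrolling osc moves differently
osc(20, 0.1, 0).pixelate(20, 1).out(o1)
render() // show all output buffers side by side
```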
kaleid() with a large number creates circles, and with a small number it creates geometric shapes (in the example, an oscillator is used as the source).
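For example (values assumed):

```javascript
osc(10, 0, 0).kaleid(200).out(o0) // large number: concentric circles
osc(10, 0, 0).kaleid(3).out(o1)   // small number: a geometric shape
render()
```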
noise() is another basic source. Its texture is generated from a variant of Perlin noise.
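A minimal example (scale value assumed):

```javascript
noise(10, 0).out(o0) // spatial scale 10, no movement
```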
We will look into it in more detail in the modulator section.
voronoi() is a source to generate a Voronoi diagram.
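A minimal example (values assumed; the arguments are scale, speed, and blending):

```javascript
voronoi(10, 0, 0.3).out(o0)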
shape() generates a polygon with the number of sides set by the first argument. However, it is more than just a polygon: the second argument changes the size of the shape, and, most importantly, the third argument sets the gradient of its edge. For example, shape(2) is a thick line, which can be scaled to make a thin line.
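For example (scale value assumed):

```javascript
shape(2).out(o0)             // a thick line (band)
shape(2).scale(0.01).out(o1) // scaled down to a thin line
render()
```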
shape() with a large number of sides creates a circle. By tweaking the example above, it generates a polka-dot pattern, or an almost equivalent one (the center of the image will be horizontally shifted).
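A sketch of the two variants, assuming repeat() for the tiling (repeat's third argument offsets every other row horizontally):

```javascript
shape(400, 0.5, 0.01).repeat(8, 8).out(o0)       // polka dots on a grid
shape(400, 0.5, 0.01).repeat(8, 8, 0.5).out(o1)  // alternate rows shifted
render()
```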
This tiling technique can be used to create an RGB pixel filter. In this example, func is decomposed into R, G, and B channels that are overlaid on top of each other.
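A rough sketch of the idea; `func`, the grid resolution, and all argument values are my assumptions, not the original code. Each channel is masked by a grid of narrow bars, with the red and blue grids shifted by a third of a cell:

```javascript
func = () => osc(30, 0.1, 1)
n = 50 // assumed grid resolution
func()
  .color(1, 0, 0) // keep only the red channel
  .mask(shape(4, 0.3).scale(1, 0.33, 1).repeat(n, n).scrollX(-1 / (3 * n)))
  .add(func().color(0, 1, 0)
    .mask(shape(4, 0.3).scale(1, 0.33, 1).repeat(n, n)))
  .add(func().color(0, 0, 1)
    .mask(shape(4, 0.3).scale(1, 0.33, 1).repeat(n, n).scrollX(1 / (3 * n))))
  .out(o0)
```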
Modulators are the key component in Hydra. Let’s look at this example:
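The example discussed below can be reproduced as follows; render() shows all output buffers at once, which matches the top-left/top-right/bottom layout described next:

```javascript
osc(40, 0, 1).out(o0)                       // the modulated function
noise(3, 0).out(o1)                         // the modulating function
osc(40, 0, 1).modulate(noise(3, 0)).out(o2) // the result
render()
```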
The modulated function (top left):
The modulating function (top right):
The result (bottom)
We can make a few observations. First, the color of the original image (the modulated image, osc(40,0,1)) is preserved. Second, the oscillator is distorted to resemble the pattern of the modulating texture, noise(3,0). Modulators can be seen from two different perspectives. On the one hand, a modulator literally modulates (or distorts) the chained function (osc in this example); in this section, we cover this aspect to explore the distortion. On the other hand, it can be seen as a way to paint the modulating function (noise in this example). For example, noise itself is grayscale, but by using it as the argument of a modulator, the noise pattern is painted with, for example, an oscillator or a gradient.
Here is pseudocode of A.modulate(B, amount) producing a new texture ANew. This might be helpful if you are already familiar with coding environments such as Processing and openFrameworks.
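The following plain-JavaScript sketch mirrors what Hydra's GLSL does: every output pixel of ANew samples A at a coordinate offset by the red and green channels of B. This is an illustration on arrays, not Hydra's actual implementation:

```javascript
// A, B: arrays of pixels in row-major order; B pixels carry {r, g} channels
function modulate(A, B, amount, width, height) {
  const ANew = [];
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const b = B[y * width + x]; // color of the modulating texture
      // offset the sampling coordinate by the modulator's red/green channels
      const sx = Math.min(width - 1, Math.max(0, Math.round(x + amount * b.r)));
      const sy = Math.min(height - 1, Math.max(0, Math.round(y + amount * b.g)));
      ANew.push(A[sy * width + sx]);
    }
  }
  return ANew;
}
```

(Hydra clamps differently via texture wrapping; clamping to the edge here just keeps the sketch simple.)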
A modulator with a feedback loop keeps pushing pixels around based on their brightness. In this example, a noise texture is modulated by itself in a feedback loop. As a result, bright pixels are pushed further and further, creating a smooth, 3D-like effect.
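An assumed minimal version of such a patch (the modulation amount is arbitrary):

```javascript
noise(3, 0)
  .modulate(o0, 0.3) // push pixels by the previous frame's brightness
  .out(o0)
```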
This example uses the same technique on a Voronoi diagram. Similar to above, the resulting image has a fake 3D look.
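The same structure with a Voronoi source (values assumed):

```javascript
voronoi(10, 0, 0)
  .modulate(o0, 0.3)
  .out(o0)
```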
Modulators can be chained to create complex patterns. In the examples above, pixels are pushed based on their brightness, but always in the same direction. By normalizing an image from [0, 1] to, for example, [-1, 1], pixels can be pushed in two opposite directions. This can be achieved by color(2,2).add(solid(-1,-1)) (notice that only red and green are used because the blue channel is ignored by a modulator).
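A sketch with a 0-to-1 source (voronoi) remapped to -1-to-1 before modulating; the argument values are assumptions:

```javascript
osc(40, 0, 1)
  .modulate(
    voronoi(5, 0).color(2, 2).add(solid(-1, -1)), // remap [0,1] to [-1,1]
    0.3)
  .out(o0)
```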
The same technique can be applied to another texture. In this example, a square grid is used, but the second and third arguments of shape() are changed to add a gradient, which helps in modulating an image.
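For example (grid size and shape arguments assumed):

```javascript
osc(40, 0, 1)
  .modulate(
    shape(4, 0.5, 0.5)            // gradient-edged square
      .repeat(4, 4)               // tiled into a grid
      .color(2, 2).add(solid(-1, -1)),
    0.2)
  .out(o0)
```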
modulateScale is a variant of modulate. The original modulate translates the texture coordinate by (r, g), the color of the modulating texture; modulateScale instead scales the pixel position by (r, g). Simply applying modulateScale can create huge distortion, which is pleasant as it is, but you can extend your repertoire by understanding its behavior. For example, modulating a high-frequency oscillator by a low-frequency oscillator creates the following distortion. Note that modulateScrollX achieves a similar effect; nevertheless, scrolling involves texture wrapping, which creates a discontinuity, unlike scaling.
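A sketch of the high/low-frequency pairing (values assumed):

```javascript
osc(60, 0, 0)              // high-frequency stripes
  .modulateScale(osc(4, 0), 0.5) // scaled by a low-frequency oscillator
  .out(o0)
```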
kaleid can be added to create a ripple or breathing effect towards or from the center.
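For example, folding the low-frequency oscillator into rings with kaleid (values assumed):

```javascript
osc(60, 0, 0)
  .modulateScale(osc(4, 0.2).kaleid(200), 0.5) // ripple from the center
  .out(o0)
```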
This breathing or ripple texture can be further used for modulating another texture.
Scaling and difference can also create a periodic texture.
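A minimal feedback sketch of this idea (values assumed): take the difference with a slightly scaled copy of the output buffer.

```javascript
shape(4, 0.7)
  .diff(src(o0).scale(0.9)) // feedback: diff with a shrunken previous frame
  .out(o0)
```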
This technique can also be applied to a complex texture.
The effect can be enhanced by thresh and by setting the third argument of voronoi to 0 to obtain sharp edges. However, a naive implementation ends up as complete noise.
To achieve the desired effect, apply a square mask (before trying the next example, run solid().out(o0) to clear the buffer).
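A sketch of the masked version (values assumed):

```javascript
// run solid().out(o0) first to clear the buffer
voronoi(10, 0, 0)
  .thresh(0.5, 0)
  .mask(shape(4, 0.7)) // without this mask, the feedback degenerates into noise
  .diff(src(o0).scale(0.9))
  .out(o0)
```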
diff can be replaced by add(oX, -1) to avoid oscillation. The difference between add and diff is discussed in the Blending section.
These examples can be used together with rotation.
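For example, rotating the feedback buffer as well (values assumed):

```javascript
shape(4, 0.7)
  .diff(src(o0).scale(0.9).rotate(0.1))
  .out(o0)
```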
Or, instead of scale, scrolling functions such as scrollY can be used with a feedback loop.
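For example (scroll amount assumed):

```javascript
shape(4, 0.5)
  .diff(src(o0).scrollY(0.01)) // the previous frame drifts upward each frame
  .out(o0)
```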
gradient() is one of the sources to generate a gradient texture. The first argument determines the speed of the color change.
With the third argument of osc(), an oscillator generates a colored texture.
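For example (values assumed):

```javascript
gradient(0.3).out(o0) // gradient whose colors change over time
osc(30, 0, 1).out(o1) // nonzero offset colors the oscillator
render()
```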
Although not documented, hue is a useful function that shifts the hue in the HSV (hue, saturation, value) color space. The saturation and brightness of the color are preserved; only the hue is affected.
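For example (shift amount assumed):

```javascript
osc(30, 0, 1).hue(0.3).out(o0) // shift hue; saturation and value unchanged
```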
colorama() shifts all of the H, S, and V values. It is implemented roughly as follows:
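This is a paraphrase of hydra-synth's built-in shader code; _rgbToHsv and _hsvToRgb are Hydra's internal conversion helpers:

```glsl
vec4 colorama(vec4 c0, float amount) {
  vec3 c = _rgbToHsv(c0.rgb); // to HSV
  c += vec3(amount);          // shift H, S and V together
  c = _hsvToRgb(c);           // back to RGB
  c = fract(c);               // wrap each channel into [0, 1)
  return vec4(c, c0.a);
}
```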
Therefore, the resulting image is rather unpredictable (for explanation, the top part of the image shows the original oscillator and the bottom shows the colorama-ed result).
This unpredictability has the following cause. In the GLSL snippet above, the HSV values are first increased by amount, and after converting back to RGB, the fract value of each channel is returned. Since fract returns the fractional part of a value (equivalent to x - floor(x)), channel values outside [0, 1] wrap around, which makes the result hard to predict. A trick to make the colorama effect less harsh is to set a negative value as the argument:
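For example (shift amount assumed):

```javascript
osc(30, 0, 1).colorama(-0.1).out(o0) // a small negative shift is less harsh
```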
luma() masks an image based on its luminosity. It is similar to thresh(); however, the color of the bright part of the image is preserved. The first argument sets the threshold, and the second the tolerance (with a bigger tolerance, the boundary becomes blurrier). luma() returns an image with transparency, so the result can be overlaid onto another image.
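For example, layering the luma output over a solid color (values assumed):

```javascript
// the dark part becomes transparent, letting the red background show through
solid(1, 0, 0, 1)
  .layer(osc(30, 0, 1).luma(0.5, 0.1))
  .out(o0)
```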
With the second argument of luma, a shadow-like effect can be created. First, turn the texture to grayscale with saturate(0), then use luma(0.2,0.2) to create blurred boundaries, and finally color(0,0,0,1) to convert the grayscale into an alpha mask with black color. In the example, the foreground texture f() is defined as a function for convenience, to avoid duplication between shadow generation and foreground rendering. The shadow texture is overlaid on the background texture osc(200,0,1), and then the foreground texture f() is overlaid on the shadow texture.
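A sketch of the described pipeline; the definition of f() and the shadow offsets are my assumptions:

```javascript
f = () => shape(3, 0.3, 0.01).rotate(0.2)
osc(200, 0, 1)                         // background
  .layer(f().saturate(0).luma(0.2, 0.2)
    .color(0, 0, 0, 1)                 // black shadow, alpha preserved
    .scrollX(0.02).scrollY(-0.02))     // offset the shadow slightly
  .layer(f())                          // foreground on top
  .out(o0)
```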
The above examples give “video synthesizer” like colors. But what if you want to use colors from a palette, for example, specified by RGB hexadecimal numbers? In the next example, a grayscale texture is re-colored by a palette taken from coolors.co.
While the example code is long, in a nutshell the input grayscale texture defined by func is separated into 5 layers based on intensity, and each layer is recolored with the hexadecimal number specified in the coolors URL. The GIF animation below shows each recolored layer for explanation. At the end, these layers are overlaid on top of each other to produce the final texture (above).
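A shortened sketch of the technique with only 3 intensity bands (the book's example uses 5); func and the palette values here are assumptions, not the original code:

```javascript
func = () => osc(30, 0, 0) // grayscale input
c1 = [0.9, 0.3, 0.2], c2 = [0.2, 0.7, 0.5], c3 = [0.2, 0.3, 0.6]
solid(...c1)
  .mask(func().thresh(0.66, 0))                            // brightest band
  .add(solid(...c2)
    .mask(func().thresh(0.33, 0).diff(func().thresh(0.66, 0)))) // middle band
  .add(solid(...c3).mask(func().thresh(0.33, 0).invert())) // darkest band
  .out(o0)
```

Each thresh() produces a binary mask, the diff of two thresholds isolates an intensity band, and mask() uses that band as transparency for a solid palette color.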
A feedback loop can be used to create unexpected color effects. For example, based on an example from Scaling, a periodic color texture can be generated.
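A sketch of the idea, reusing the scaling feedback with a colored source (values assumed):

```javascript
osc(40, 0, 1)              // colored source instead of a grayscale shape
  .diff(src(o0).scale(0.9)) // feedback from the Scaling example
  .out(o0)
```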
Arithmetic is not the most exciting topic; nevertheless, you might encounter an undesired blending effect and wonder how to fix it. The output range is not the same for all sources.
In this example, func's negative values are clipped by luma and the result is overlaid on a red solid texture. If func is normalized from 0 to 1, the resulting texture is the same as func, as it is not affected by luma. However, if func is normalized from -1 to 1, the negative values are clipped and the red texture appears. Sources such as osc, shape, and voronoi produce the former (0 to 1), while noise produces the latter (-1 to 1), as seen in the image below.
noise can be normalized to 0-1 by the following method:
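For example, halve the values and then add 0.5:

```javascript
// map noise from [-1, 1] to [0, 1]
noise(10, 0)
  .color(0.5, 0.5, 0.5)      // multiply each channel by 0.5
  .add(solid(0.5, 0.5, 0.5)) // then shift up by 0.5
  .out(o0)
```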
This example shows the difference between add and diff. add(oX, -1) might seem identical to diff(oX); however, while add simply adds the texture (the first argument) multiplied by a scalar (the second argument), diff takes the difference of the two textures and returns absolute values. Note that diff only takes one argument, and the resulting alpha value (transparency) is the maximum of the two alpha values. In this example, a gray solid texture is subtracted from an oscillator using the two different functions. Notice the difference: diff (top) returns absolute values and is therefore continuous, while add (bottom) keeps negative values, which appear black.
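A sketch of the comparison (values assumed):

```javascript
osc(40, 0, 1).diff(solid(0.5, 0.5, 0.5)).out(o0)    // absolute difference: continuous
osc(40, 0, 1).add(solid(0.5, 0.5, 0.5), -1).out(o1) // signed: negatives clip to black
render()
```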
Another confusing pair of blending functions is mult and mask. In the Hydra interface, the results might appear the same; however, they treat the alpha channel differently. First, mult simply multiplies the color values of the two textures. Each channel, R, G, B, and A, is treated independently. Therefore, the alpha channel of the resulting image in the example below remains 1 (note that both shape calls return opaque textures), and the texture underneath cannot be seen. mask, in contrast, only uses the luminance of the mask texture. The returned texture is not only the multiplication of the masked texture by the luminance of the mask; its alpha channel is also overwritten by that luminance. Therefore, the returned texture can be overlaid on another texture with layer(). With mult, a similar effect can be obtained by using luma to modify the alpha channel. In this example, the resulting image is the same; however, with a grayscale mask texture, the result depends on the arguments of luma.
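A sketch of the comparison (values assumed):

```javascript
// mult keeps the original alpha (opaque); mask rewrites alpha from luminance
osc(30, 0, 1).mult(shape(4, 0.6)).out(o0)
osc(30, 0, 1).mask(shape(4, 0.6)).out(o1)
solid(1, 0, 0).layer(src(o1)).out(o2) // only the masked version can be layered
render()
```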
Low Frequency Oscillator
In audiovisual synthesis, the term low frequency oscillator (LFO) is often used. According to Wikipedia, an oscillator with a frequency below 20 Hz is usually considered an LFO; nevertheless, the definition depends on the application, and here I will not fix a frequency range (in fact, most LCDs only refresh at up to 60 Hz, so effectively anything that can be displayed on an LCD is an LFO). The important point is that in this section we look strictly at oscillators in the time domain. In the previous chapters, oscillators were explained in the spatial domain, i.e., the pixel space. If the second argument of osc is set to a non-zero value, the pattern starts to "move."
In this book, the images are static. Please try the code on the Hydra editor to watch the movement.
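For example (values assumed):

```javascript
osc(40, 0.1, 0).out(o0) // non-zero sync: the stripes appear to scroll
```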
The result appears to be scrolling stripes due to human perception. If we look at an oscillator with a smaller spatial frequency (i.e., set the first argument to a small value) and take the average over all pixels with pixelate(1,1), the color change in time becomes recognizable.
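For example (values assumed):

```javascript
// collapse the whole frame into a single color that oscillates in time
osc(1, 0.1, 0).pixelate(1, 1).out(o0)
```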
This is not a particularly interesting example. Yet, it is important to separate the characteristics in the time and spatial domains. For instance, the sine-wave oscillator in the example above can be used as a fader to mix two images:
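A sketch of the fader; the definitions of lfo and lfoInvert as uniform (1x1-pixelated) textures are my assumptions:

```javascript
lfo = () => osc(1, 1, 0).pixelate(1, 1) // uniform texture oscillating in time
lfoInvert = () => lfo().invert()
shape(3).color(1, 0, 0).mult(lfo())              // red triangle fades in
  .add(shape(4).color(0, 0, 1).mult(lfoInvert())) // blue square fades out
  .out(o0)
```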
A similar effect can be achieved by a lambda function:
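For example (time is Hydra's global clock, in seconds):

```javascript
shape(3).color(1, 0, 0)
  .blend(shape(4).color(0, 0, 1), () => Math.sin(time) * 0.5 + 0.5)
  .out(o0)
```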
Visually, both examples crossfade the two shapes: a red triangle and a blue square. The key is to understand the difference between the two. In the first code, the two shapes are multiplied by lfo and lfoInvert, where lfoInvert is the inverted texture of lfo. This can be thought of as an analogy to a Photoshop layer mask with uniform transparency. In the second code, a lambda function using Math.sin is passed as the second argument of blend. This is similar to setting the global opacity of a layer in Photoshop. The latter is more concise and easier to understand; however, it is spatially less flexible because a single transparency value is applied to the blending of all pixels. The former can be modified to add spatial oscillation, i.e., a layer mask.
Beyond image blending, LFOs can be used for several other operations. An example is pixelate. To change the arguments of pixelate over time, one might use a lambda function:
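For example (values assumed):

```javascript
// pixelation resolution driven by a lambda LFO
lfo = () => Math.sin(time) * 20 + 24
osc(30, 0, 1).pixelate(lfo, lfo).out(o0)
```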
Note that the lfo function itself is passed as the argument, not the result of the lfo() function call. When lfo() is passed, it is evaluated only once and you will not see any change in the image. A similar texture can be generated using modulatePixelate with an oscillator as the modulating texture:
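A sketch of this variant; the argument values, and using a 1x1-pixelated oscillator as the LFO texture, are my assumptions:

```javascript
osc(30, 0, 1)
  .modulatePixelate(osc(1, 1, 0).pixelate(1, 1), 40, 4)
  .out(o0)
// replace pixelate(1, 1) with e.g. pixelate(4, 1) to pixelate
// each column of the texture differently
```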
Again, the difference between the two examples is the flexibility in the spatial domain. By increasing the number of segments of pixelate in the latter example, you can apply a different pixelation operation to each segment of the texture.
The downside of the osc.pixelate LFO compared to a lambda LFO is that arithmetic operations are cumbersome: to add a value, one needs to add a solid texture, and to multiply by a value, one needs a color multiplication. Such an LFO also has fewer mathematical functions. Nevertheless, discretization can be achieved with posterize, which is similar to flooring the value in a lambda LFO.
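A sketch of these operations on a texture LFO (all values are arbitrary):

```javascript
lfoTex = () => osc(1, 1, 0).pixelate(1, 1)
lfoTex().add(solid(0.2, 0.2, 0.2)).out(o0) // add a constant (X = 0.2)
lfoTex().color(0.5, 0.5, 0.5).out(o1)      // multiply by a constant (Y = 0.5)
lfoTex().posterize(4, 1).out(o2)           // discretize into 4 steps
render()
```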
A screenshot is omitted because a static image would appear similar to the previous example. The same technique can be used in conjunction with other modulation functions, such as modulateScale.