You can use them for any kind of color or geometric transformation at all. Like, if you wanted hue shifts, or to go black and white, or to add some kind of ripple effect (water, or a screen transition, or anything else).
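For instance, going black and white is just a few lines of fragment shader. A minimal sketch (the weights are the standard Rec. 601 luma weights; any weighted average of the channels would do):

```glsl
#version 120
void main(void){
    // Weighted average of the color channels (Rec. 601 luma weights).
    float gray = dot(gl_Color.rgb, vec3(0.299, 0.587, 0.114));
    // Write the same value to red, green and blue; keep the alpha.
    gl_FragColor = vec4(gray, gray, gray, gl_Color.a);
}
```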
In those screenshots I posted, they are being used for color masks, defining the screen coordinates, and coloring primitives. If you think of the parameters to, say, Rectangle(), the shaders define what those values mean. You could change just one line, and the x coord would mean the y. Or the red color channel. Or rotation, or whether or not to do alpha blending, or whatever. The default shaders I wrote just imitate the fixed-function pipeline and let you use raster coordinates.
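To make the "x could mean y" point concrete, here's a toy vertex shader sketch (using the old GLSL 1.20 built-in attributes, not anything from my actual defaults) that swaps the incoming x and y before the usual transform, so whatever the caller passes as x actually moves the vertex vertically:

```glsl
#version 120
void main(void){
    // Hypothetical example: reinterpret the incoming vertex by
    // swapping its x and y components...
    vec4 pos = gl_Vertex;
    // ...then apply the usual fixed-function-style transform.
    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos.y, pos.x, pos.z, pos.w);
    gl_FrontColor = gl_Color; // pass the vertex color through unchanged
}
```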
Consider an only slightly gimmicky use: take the default fragment shader. If, say, you wanted to lower the lighting and make it look like night, you could use a different shader that looks like this:
#version 120
void main(void){
    gl_FragColor = gl_Color * vec4(0.5, 0.5, 0.8, 1.0);
}
If you imagine a color as a vector of four floating-point numbers, each representing the intensity of a channel (1.0 is like 255, 0.0 is like 0 in 32-bit color), this would cut the red and green channels in half during drawing (and dim blue only slightly), so it would look bluish. The point being, it would be pretty slow to post-process these kinds of effects (even the trivial example above!) in software. And also much less elegant.
And besides, if you want to use OpenGL that isn't ten years old, you need to use shaders. I'm setting it up so that if you don't want to learn GLSL, you don't have to. The default shaders work the way drawing did previously, so you can just pretend they aren't there and you will be fine. But you can write your own shaders and switch between them if you want to.
EDIT:
I've added hardware image cloning, with an alternative software method that will be used when it can't be done in hardware. This makes image cloning much faster and also much more elegant (only a single function call, no buffers for pixels in system memory). The extension it uses is very common and really should exist for your GPU if you have OpenGL 3 or greater (it's guaranteed to exist if your card supports OpenGL 4.3 or greater).