Thumper: a WebGL demo

Thumper is a shader driven by frequency modulation, rendered to a single quad. I put this together using iq’s Shader Toy and used a track from Akira Kiteshi called “Boom N Pow”. Below, I describe briefly how it works.

There are four basic components that create the effect:

1) A procedural texture
2) Radial blur
3) Plane deformation
4) Frequency modulation

Smooshy Squares

The procedural texture (the function “cube” in the shader) is generated by taking the [-1, 1] coordinate of the current fragment and modding its x and y components to determine whether either falls in an active column or row. If so, the pixel value is white. This creates white horizontal and vertical bars, leaving a grid of black squares. The basic formula is:

float value = 0.0;

// the fragment is "on" (white) if it falls in an active column...
if (floor(mod(p.x * 10.0, 2.0)) == 0.0)
    value = 1.0;

// ...or an active row
if (floor(mod(p.y * 10.0, 2.0)) == 0.0)
    value = 1.0;

vec4 color = vec4(value, value, value, 1.0);

Multiplying by 10.0 and modding by 2.0 means we end up turning on and off in large blocks; the value 10 controls the size of the grid squares. The floor() call is necessary because we’re dealing with floats; an epsilon value could be used to make the transition smoother, but I wanted hard edges. The x and y values are then manipulated over time (t0) to make the squares stretch and compress. The effect is stable in p.y, but the frequency of p.x is modulated (t0*cos(sin(t0*3.)*t0*.01)) to vary the horizontal compression over time.
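In simplified form (this isn’t the exact code from the shader, and the way the modulation term is folded in is just one way to do it), cube() looks something like this:

vec3 cube(vec2 p, float t0)
{
    // stable in y, but the x frequency drifts over time
    float fx = p.x * (10.0 + t0 * cos(sin(t0 * 3.0) * t0 * 0.01));
    float fy = p.y * 10.0;

    float value = 0.0;
    if (floor(mod(fx, 2.0)) == 0.0) value = 1.0;   // active column
    if (floor(mod(fy, 2.0)) == 0.0) value = 1.0;   // active row

    return vec3(value);
}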

Adding the Bounce

The radial blur is based on iq’s old-school radial blur effect in the Shader Toy. The basic idea is to calculate a direction vector and layer the color of the current pixel by sampling in that direction over a given number of steps. If you look at iq’s example, the direction is the vector d, the accumulated color is the “total” variable, and each step’s contribution is reduced in weight with the “w” variable.

In Thumper, two direction variables (d1 & d2) are used, pointing in opposite directions. Then the mix() function is applied to interpolate between these directions according to a heavily modulated sine function:

vec2 d = mix(d1, d2, .5 + .5 * sin(sin(sin(t0) * t0) * t0));

The .5 + .5 * sin(...) form keeps the interpolation factor in the range [0, 1], which keeps the mix() call sane.
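Stripped down, the accumulation loop looks something like this (the step count, falloff and names here are illustrative, not the exact values in Thumper; deform() is covered in the next section):

vec3  total = vec3(0.0);
float wsum  = 0.0;
float w     = 1.0;               // per-step weight
vec2  s     = p;                 // start at the current fragment
for (int i = 0; i < 32; i++)
{
    total += w * deform(s, t0);  // sample along the modulated blur direction
    wsum  += w;
    w     *= 0.97;               // distant samples contribute less
    s     += d * 0.01;           // march a small step along d
}
vec3 color = total / wsum;       // normalize by the accumulated weight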

Warpy & Melty

The plane deformation (the function "deform" in the shader) is what causes the warping/bubbling effect. This is also based on iq's radial blur shader, with some added frequency modulation to spice it up a bit.

What's happening here is that the fragment point (p) is mapped into (sin, sin) space. Properties of this new vector are then used to manipulate the original point p, which is finally used to sample the procedural texture. The end result is the texture deformed by a two-dimensional wave.

If you look at the source code, the value "r" is roughly the amount of displacement in local space, which is multiplied by p again, essentially mapping back into fragment space. sin(t0) is added to the resulting value to layer on some additional movement.
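In simplified form (again, not the exact constants from the shader, just the structure), deform() looks something like this:

vec3 deform(vec2 p, float t0)
{
    // map the fragment into (sin, sin) space
    vec2 q = vec2(sin(p.x * 3.0 + t0), sin(p.y * 3.0 + t0));

    // r is roughly the amount of displacement in local space
    float r = dot(q, q);

    // map back into fragment space and layer on some extra movement
    vec2 uv = p * r + sin(t0);

    // finally, sample the procedural texture with the warped coordinates
    return cube(uv, t0);
}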

The Seizure Test

Frequency modulation is, as the name says, the act of modulating the frequency of a function. In this case the function is sin() or cos(), and I'm using the same functions again to modulate their own frequency, as in sin(sin(t)*t). The great and terrible thing about FM is that it keeps changing over time. That can be desirable for creating variation, but you ultimately need some control for long-running effects.

To get this control, I used a constant period variable, T = 20. Each function uses this variable slightly differently, but the idea is that it provides a basis for cutting off the frequency modulation. Without it, the effect has a tendency to spiral off into seizure-inducing chaos.

If you look at my usage of the limiter, it basically resets after some period, which wouldn't work well for a smooth effect. However, since this effect was intended to be spastic, the jerkiness of the reset kind of adds something nice :)
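Stripped down, the limiter amounts to wrapping the clock with the period (the real expressions are messier, but this is the gist):

const float T  = 20.0;          // fixed period bounding the modulation
float t0 = mod(time, T);        // resets every T seconds, producing the visible jerk
float fm = sin(sin(t0) * t0);   // self-modulated sine, now kept under control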

Timing & Audio

Although it may look like this is doing beat detection, it's not. The audio and video just happen to sync up thanks to some very loose planning. I know dubstep is created primarily using frequency modulation, so I hoped the end effect would sync to the music, and I think it worked out okay.

On the other hand, the timing truly *is* synced to the music. I used the HTML5 audio tag to get a notification of when the song was loaded. The shader takes a uniform parameter "time", which is driven by the current time of the audio tag. So when you scrub through the song, the visual effect stays in sync, as you would expect.
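On the shader side there's nothing audio-specific, just the clock uniform, which JavaScript fills in each frame:

uniform float time;   // seconds into the track; set each frame from
                      // the audio element's currentTime via gl.uniform1f()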

Conclusion

So there you have it: very little code is needed to produce this effect; essentially, the shader does all the work. I have to give major props to iq, since this demo really looks good because of the tricks I learned from looking at his Shader Toy code.

Posted on June 17, 2011 at 9:18 pm

Maya Cornell Box (Learning Maya)

A simple Cornell box I made in Maya, ray traced with caustics and atmospheric effects:

Posted on May 2, 2011 at 6:39 am

HTML5 Audio Synthesizer

I built this synthesizer purely in HTML and JavaScript using the Firefox audio data API. The way it works is an audio source is initialized with a callback which generates wave data. I coded up three oscillators and a volume envelope to give the sound a little more character.

Key up/down events are trapped and drive the audio when a new buffer is requested. The biggest issue I have with this demo is the huge buffering delay. Unfortunately, I couldn’t do much about it because it’s imposed by Firefox, which requires a 500ms delay between writing audio buffers.

To use it, click the link below, wait for the page to load and then press any of the top two rows of keys on your keyboard (Q-P, and a handful of number keys). You can also tweak the parameters while it’s playing, which is a fun way to torture your friends and family : )

Check it out: http://visualcore.com/htmlsynth/

Posted on April 11, 2011 at 11:47 pm

Flo: a realtime graphics demo

This is kind of old news (from November, 2010), but here is a video of the demo Shalin Shodhan and I created for the Pixar demoparty (which we won : ). The effects are created with particle systems, fluid dynamics, some shader magic, frame buffers and a constraint solver (for the hair). We built it in about two weeks; the original demo was a real-time executable (3mb):

Posted on April 11, 2011 at 11:30 pm

Compilers Review

While studying for my compilers final, I made a map of the information covered (using kdissert):

Posted on May 14, 2010 at 7:31 pm

Pyth Compiler

An x86 compiler for Pyth (a Python dialect)

The core lexer and parser are generated using Flex and Bison, and the parser outputs an AST. Static analysis is performed on the tree, reporting errors or producing a tree decorated with declaration, type, and scope information. The tree is then transformed into an intermediate language (IL) for a virtual machine. Optionally, the IL can also be compiled down to assembly language (IA32) and used to produce a standalone executable.

Download: Source (C++)

Posted on May 1, 2010 at 12:00 am

Angry Beard: Distortion Pedal

An audio distortion circuit

This was built based on the classic Angry Beard III circuit schematic, but heavily modified: I used a different op-amp and transistor, removed the high/low switch, added a dual power supply, added an extra voltage divider to bring the gain down to a reasonable level at the output, and added an on/off switch. The super-cheap op-amp I used gives it a much grungier sound, which turned out surprisingly cool.

Download: Schematic (original)

Posted on April 27, 2010 at 12:00 am

Path/Ray Tracer

A ray tracer and Monte Carlo path tracer

The core engine supports two modes of rendering: path tracing or ray tracing. The geometry of the scene and the various renderer settings are specified in a simple text file format.

Noteworthy features: tweakable multi-threading, variable anti-aliasing (2x, 4x, …), an OBJ file format parser, area lights, soft shadows, video export, Python API for controlling the renderer.

Download: Source (C++, Python)

Posted on April 26, 2010 at 12:00 am

Mandelbrot Zoom



I created this last semester for a class on simulations (which also had a heavy emphasis on fractals). A Mandelbrot fractal is generated and then analyzed for the most “interesting” area to zoom in on.

The following animated GIF shows the algorithm in action. The left side is the generated Mandelbrot fractal and the right side is a visualization of the activity-detection algorithm. The algorithm was choosing how to zoom into the fractal in real time as the image was recorded.

I’ve also posted the Matlab code if you are interested in seeing how it works.

Posted on March 22, 2010 at 11:46 am

reCAPTCHA

Apparently using WordPress makes you a huge target for spam, so unfortunately I’ve had to resort to using a CAPTCHA again.

I’ve tried some alternative non-CAPTCHA filters, but none of them seem very reliable (some very simple tests were giving false positives).

On a (non-false) positive note, since this is my spring break, I’m trying to get some of my recent projects together and post them this week :)

Posted on March 22, 2010 at 11:34 am
