Dissecting the Gorillaz Andromeda Music Video: Recreating The Stylized 3D Look With WebGL

As a graphics programmer, I‘ve always been drawn to stylized 3D animation that balances technical complexity with artistic expression. The music video for "Andromeda" by the virtual band Gorillaz is a perfect example of this.

It features a colorful cosmic setting with a variety of compelling effects – planetary atmospheres, asteroid fields, and synced explosions. But it uses these effects in a selective, stylized way rather than trying to be physically realistic.

I wanted to see if I could reproduce this distinctive look in real-time in a web browser using Three.js and WebGL. It seemed like a great opportunity to push the boundaries of what‘s possible with WebGL while diving deep into some advanced graphics techniques.

Breaking Down The Scene

The first step was to carefully study the Andromeda video and break down its key components. The main elements I identified were:

  1. Background cosmic skybox with layered, panning nebula textures
  2. Dozens of small, rocky meteors flying right to left
  3. Large Saturn-like planet with an animated, glowing surface
  4. Periodic bright flashes and explosion effects synced to the music
Breakdown of Andromeda scene components
Breaking down the key visual components of the Andromeda music video

While the overall look is certainly impressive, each individual element is relatively simple from a modeling and rendering perspective. The complexity comes from the artistic composition and synchronized timing of the effects.

I knew the key to reproducing this look in WebGL would be nailing the distinctive texturing and shading of each component while ensuring the scene could run at 60fps even on lower-powered mobile devices. I‘d need to be very deliberate about optimizing model complexity, draw calls, and shader logic.

Setting The Cosmic Stage

I began by setting up the cosmic background environment. This consists of a simple skybox geometry with several layered textures panning at slightly different rates. The textures are carefully selected nebula and star field images that tile seamlessly.

High resolution seamless nebula texture
A high resolution nebula texture carefully edited for seamless tiling

Rather than using the default texture mapping, I applied the background textures using a custom shader. This allowed me to precisely control the panning speed of each layer independently to create a parallax effect.

The shader also let me subtly tint the textures and fade them together to match the color grading of the reference video. Here‘s a simplified GLSL snippet demonstrating the core technique:

uniform sampler2D textures[4]; 
uniform vec2 panning[4];
uniform vec3 tints[4];
uniform float opacities[4];

void main() {
  vec4 color = vec4(0);
  for(int i=0; i < 4; i++) {
    vec2 uv = vUV + panning[i] * time;
    vec4 layer = texture2D(textures[i], uv);
    layer.rgb *= tints[i];
    layer.a *= opacities[i]; 
    color = mix(color, layer, layer.a);
  }
  gl_FragColor = color;
}

By using custom shaders in this way, we have complete control over how the background is rendered without adding additional draw calls or geometry. This is much more efficient than trying to achieve the same effect with multiple textured meshes.

Flying Through an Asteroid Field

The next major component is the dense field of small meteors flying across the screen. From studying the reference, I could see that the meteors are relatively low-poly rocky shapes tumbling through space on linear paths.

While it‘s possible to model and import unique meteor meshes, I found that I could achieve very convincing results procedurally with instancing. I used a simple icosahedron geometry as the base meteor mesh. At runtime, I spawn dozens of instances of this mesh with randomized position, rotation, scale, and velocity parameters.

Each frame, I update the meteors‘ transforms in JavaScript and rebuild the instance attribute buffers. The key is to only spawn and update what‘s visible to keep the CPU and GPU workload manageable. Here‘s the core update loop:

const maxMeteors = 100;
const meteors = [];

function updateMeteors(dt) {
  // Spawn new meteors to maintain total count
  while(meteors.length < maxMeteors) {
    meteors.push(spawnMeteor());
  }

  // Update meteor properties 
  for(const meteor of meteors) {
    meteor.position.x += meteor.velocity.x * dt;
    meteor.rotation.x += meteor.angularVelocity.x * dt;
    meteor.rotation.y += meteor.angularVelocity.y * dt;

    // Wrap around when offscreen
    if(meteor.position.x < -10) {
      meteor.position.x = 10;
      meteor.position.y = Math.random() * 5 - 2.5;
      meteor.velocity.x = -Math.random() - 0.1;
      meteor.velocity.y = Math.random() * 0.1 - 0.05;
      meteor.scale.setScalar(Math.random() * 0.5 + 0.1);
      meteor.angularVelocity.set(Math.random(), Math.random(), 0)
    } 
  }

  // Update instance matrices 
  for(let i=0; i < maxMeteors; i++) {
    const meteor = meteors[i];
    meteor.updateMatrix();
    mesh.setMatrixAt(i, meteor.matrix);
  }
  mesh.instanceMatrix.needsUpdate = true;
}

By using instancing, we can draw thousands of unique meteors in a single draw call, keeping the rendering overhead extremely low. This is critical for maintaining high frame rates on mobile devices.

Visualization of instanced meteor meshes
Instanced meteor meshes allow us to render thousands of asteroids in a single draw call

Shading A Mesmerizing Planet

The centerpiece of the Andromeda scene is undoubtedly the large Saturn-like planet with its subtly animated atmosphere. Accurately reproducing this mesmerizing effect in real-time required some clever shader work.

The planet‘s surface is driven by a single high-resolution texture mapped to a spherical mesh. However, simply scrolling this texture would create obvious seams at the poles.

To achieve the smooth, roiling appearance, I used a fragment shader that subtly distorts the UV coordinates based on a noise function and the fragment‘s latitude on the sphere. Here‘s the core of the shader:

uniform float time;
uniform sampler2D surfaceTexture;

void main() {
  // UV coordinates from 0-1
  vec2 uv = vUV; 

  // Latitude from -PI/2 to PI/2
  float lat = (vPosition.y / radius) * PI - PI/2.0;

  // Noise value from -1 to 1 sampled from latitude
  float noiseFreq = 2.0;
  float noiseScale = 0.1;
  float noise = snoise(vec2(lat * noiseFreq, time * 0.1)) * noiseScale;

  // Offset UV coordinates by noise scaled by latitude
  float latScale = cos(lat);
  uv.x += noise * latScale; 

  // Blend surface color with noise  
  vec4 surface = texture2D(surfaceTexture, uv);
  float noiseFactor = abs(noise) * 0.1;
  vec4 color = mix(surface, vec4(noise), noiseFactor);

  gl_FragColor = color;
}

This noise-based distortion creates the effect of turbulent flow while ensuring it loops seamlessly and varies smoothly from equator to pole. By mixing in the noise value, we also get subtle color variation that adds to the organic feel.

Animated planet surface distortion
The planet surface is subtly animated with noise-based UV distortion

For the atmospheric glow, I again reached for shaders rather than using additional meshes. I created a custom glow shader that uses the view-dependent fresnel effect to create a halo around the planet‘s silhouette.

The shader calculates the fresnel factor based on the angle between the view direction and the surface normal, and uses that to blend between the surface color and a bright glow color. Here‘s a simplified version:

uniform vec3 glowColor;
uniform float glowPower;
uniform float glowRadius;

void main() {
  vec3 viewDir = normalize(cameraPosition - vPosition);
  float fresnel = 1.0 - dot(viewDir, vNormal);
  float glowIntensity = pow(fresnel, glowPower) * glowRadius;

  vec3 color = mix(surfaceColor, glowColor, glowIntensity);

  gl_FragColor = vec4(color, 1.0);
}

By carefully tuning the glow parameters, I was able to closely match the look of the atmospheric halo from the reference. Best of all, this shader-based glow is far more efficient than particle systems or post-processing effects often used for such things.

Putting It All Together

With the key components in place, the final step was to combine them into a cohesive scene and synchronize everything to the music. This is where the artistic eye is just as important as technical skills.

I spent a lot of time carefully balancing the relative scales, rotation rates, and color palettes of the elements to create a harmonious composition. Getting the right density and distribution of meteors was crucial to sell the feeling of flying through a dense asteroid field.

For the music synchronization, I used a combination of the Web Audio API and Three.js‘ animation system. I created an AudioAnalyser node to get real-time frequency data, and converted that into intensity values I could use to drive shader parameters and trigger choreographed events.

Diagram of audio analysis and synchronization
Real-time audio analysis drives shader parameters and event triggers

The trickiest part was timing the big comet impact events on the planet‘s surface. I used tween.js to carefully sequence a series of animations:

  1. A large comet flies in from offscreen
  2. The comet impacts the planet surface, sending a shockwave rippling out
  3. The planet‘s glow intensity spikes and then dims back down
  4. Particles burst out from the impact point and dissipate

This required a lot of trial and error to get the timing feeling right, and I had to be mindful of performance implications. I used GPU particle systems and shader-based effects wherever possible to keep the impacts snappy.

Synchronized comet impact on planet surface
A choreographed comet impact event with ripple and particle effects

Throughout the development process, I kept a close eye on frame rates and regularly profiled the WebGL calls to identify bottlenecks. I made heavy use of tools like Chrome‘s Rendering tab, Spector.js‘ frame capture, and WebGL Insights.

This constant profiling helped me optimize expensive operations and keep the implementation within mobile budget constraints. Some key optimizations included:

  • Batch model and texture loads to minimize load time hitches
  • Use instancing to reduce draw calls for meteors and particles
  • Bake lighting into textures rather than using real-time lights
  • Minimize post-processing passes and use FXAA over MSAA
  • Use lower precision floats wherever possible (e.g. mediump in shaders)
WebGL performance profiling in WebGL Insights
Regular profiling with WebGL Insights helped identify and optimize bottlenecks

With these optimizations, I was able to keep the scene running smoothly on most mobile devices. The final stats were:

Metric Value
Entities 17
Draw Calls 23
Triangles 247,416
Frame Time 12.7ms
FPS 60

Conclusion

Recreating the Andromeda music video in WebGL was a fantastic learning experience. It required a deep understanding of shader programming, optimization techniques, and artistic direction.

The most challenging aspects were achieving the organic feel of the animated planet surface and carefully synchronizing the comet impacts to the audio. Profiling and optimization were also a constant focus to maintain high performance.

In the end, I‘m thrilled with the results. While it may not be pixel-perfect to the reference, I believe it captures the essence and aesthetic of the original while showcasing what‘s possible with modern web graphics.

The biggest takeaway for me was the power of creatively combining simple techniques to achieve impressive results. By relying on clever shaders and effects rather than brute force geometry and textures, we can create rich, stylized 3D scenes that run beautifully in the browser.

I hope this breakdown is helpful to other developers looking to create compelling WebGL experiences. The full source code for the project is available on GitHub – feel free to dig in, experiment, and extend it!

With smart optimizations and an artistic eye, the sky is truly the limit for 3D graphics on the web. I can‘t wait to see what the community creates next.

This article was written from the perspective of a full stack developer specializing in WebGL and web graphics. All code and statistics are simplified representations for illustrative purposes.

Similar Posts