[History of realtime raytracing (part #1)..............]
How realtime raytracing has changed the demoscene and vice versa

"As we all know, the audience voted for best productions at Juhla Pi. And, as always - audience was stupid." -- Jmagic, on why the world's first realtime raytracing demo didn't win 1st place at Juhla Pi (from: "(Illogical) PC Demoscene Quotes List" or "how the 'first' realtime raytracing demo cannot be called 'Transgression 2'")

(not so) realtime raytracing on C64
Realtime raytracing, the wet dream of the demoscene, has its roots in yellowed PC magazines full of still images of platonic solids with unreal crystal reflections. The lifecycle of raytracing in the demoscene crosses at least three phases; the first one (the "inception" phase, spanning from the mid-90s to the end of 1997) is the one we'll analyze in this post.

The word that the first realtime raytracing experiments keep bumping into is "fake" (or even "precalculated"). To build a convincing raytracer, in fact, one that you can really call "realtime", a minimum amount of processing power is required, an amount that the personal computers of those days didn't have.
Not the widely used Commodore 64 or Amiga, and not even the first generations of PCs (the i386 included).

For that reason, on those less powerful platforms sceners had to fall back on looping pre-rendered images, and the intro usually quoted as "the first one with realtime raytracing" is in fact an animation. The second intro of that series, instead, hastily separates the precalculated (duh) raytracing from the realtime one, but unfortunately the "realtime" version looks more like a slideshow of the magazine pictures mentioned above.

Attempts at raytracing again on C64
In this first creative phase there aren't many productions using realtime raytracing. It seems that showing a sphere reflecting the surrounding landscape requires higher technical skills than coding (or ripping) a rotating 3d cube. Or maybe the problem is that the closed circle of the demoscene lacks information on the subject, and for these reasons realtime raytracing remains confined to the realm of black magic (and marked as "fake").

To make things more complicated, the authors of the raytracing intros themselves muddy the waters, stating that the engine used in their productions is nothing more than "the slowest polygon engine in the world".

From the "Gamma 2" infofilu:

The 1st real rtrt intro (Transgression/MFX)
"It was in the beginning of 1995 when 216 had this pretty wild idea of a realtime raytracer. But soon it was discovered that the project was not only impossible but his programming skills weren't at that level. So he decided to do it the other way around.
Transgression 1 was created.

It wasn't anything but a regular polygon engine with a clever trick used. After filling the scene it just took samples from here and there and painted big blobs on those points and thus the outlook was so crappy that no-one was to tell whether it was traced or not."
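
Just to visualize the trick 216 describes, here is a minimal sketch of how the "take samples here and there and paint big blobs" step could look, assuming the scene has already been rasterized into an 8-bit framebuffer by a normal polygon engine; the buffer size, blob size and function names are all hypothetical, not taken from Transgression 1.

/* Hedged sketch of the "big blobs" fake-raytracing trick.
   Assumes 'screen' was already filled by a regular polygon engine. */
#include <stdlib.h>

#define W        320
#define H        200
#define BLOB     4                  /* side of each painted blob, in pixels */
#define SAMPLES  3000               /* how many scattered points to sample  */

static unsigned char screen[W * H]; /* 8-bit framebuffer, filled elsewhere  */

/* Take samples "from here and there" in the polygon-engine output and
   re-paint each sampled colour as a big square blob, so the image becomes
   coarse enough that nobody can tell whether it was traced or not. */
void blobbify(void)
{
    int i, bx, by;

    for (i = 0; i < SAMPLES; i++) {
        int x = rand() % W;
        int y = rand() % H;
        unsigned char c = screen[y * W + x];      /* sampled colour */

        for (by = 0; by < BLOB && y + by < H; by++)
            for (bx = 0; bx < BLOB && x + bx < W; bx++)
                screen[(y + by) * W + (x + bx)] = c;
    }
}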


The first few realtime raytracing demoscene productions all look alike, relying on the same cliches: reflecting spheres, shadow-casting spheres in all sizes and colours (with fog, without fog, with lights, with more lights), perfectly perpendicular cylinders, checkerboard-textured planes and so on. Not by chance, those are also the easiest primitives to trace.
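
To see why, here is a minimal sketch of a ray/sphere intersection test, which boils down to solving one quadratic per ray; the vector type and function names are hypothetical and not taken from any of the intros mentioned.

#include <math.h>

typedef struct { double x, y, z; } vec3;

static double dot(vec3 a, vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static vec3   sub(vec3 a, vec3 b) { vec3 v = { a.x - b.x, a.y - b.y, a.z - b.z }; return v; }

/* Intersect a ray (origin o, normalized direction d) with a sphere of
   center c and radius r.  Returns 1 and writes the distance of the
   nearest hit in front of the origin into *t, or returns 0 on a miss. */
int ray_sphere(vec3 o, vec3 d, vec3 c, double r, double *t)
{
    vec3   oc   = sub(o, c);
    double b    = dot(oc, d);                    /* half of the usual "b" term  */
    double disc = b * b - (dot(oc, oc) - r * r);

    if (disc < 0.0)
        return 0;                                /* no real root: the ray misses */

    double s  = sqrt(disc);
    double t0 = -b - s;                          /* nearer intersection          */
    double t1 = -b + s;                          /* farther intersection         */

    *t = (t0 > 0.0) ? t0 : t1;                   /* prefer the hit in front      */
    return *t > 0.0;
}

Planes and upright cylinders are not much harder (a dot product and another quadratic, respectively), which is why those same primitives show up everywhere in this period.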

Anyway, the limits of the machines those productions ran on are still there, and no matter how great a coder was, tracing more than a textured plane at full framerate was impossible. For that reason, the raytracing intros of around '97 are rendered at a fraction of the real screen resolution (for most of them the fraction is fixed, but there are exceptions we'll see later) and then interpolated to cover the whole screen. Raytracing thus quickly becomes associated, in people's minds, with the jagginess of every oblique surface.
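
As a rough idea of what "rendering at a fraction of the resolution and then interpolating" means in practice, here is a minimal sketch assuming a hypothetical trace_pixel() that shoots one primary ray per traced point; the resolution and the step are made up, and real intros would do this in fixed point and assembly rather than plain C.

#define W    320
#define H    200
#define STEP 4                      /* trace only 1 pixel out of STEP*STEP */

extern unsigned char trace_pixel(int x, int y);   /* hypothetical tracer */
static unsigned char screen[W * H];

/* Trace a coarse grid of pixels, then fill the holes by linear
   interpolation (border pixels past the last grid column/row are
   left untouched for brevity). */
void render_subsampled(void)
{
    int x, y;

    /* 1) trace only the coarse grid */
    for (y = 0; y < H; y += STEP)
        for (x = 0; x < W; x += STEP)
            screen[y * W + x] = trace_pixel(x, y);

    /* 2) interpolate horizontally between traced columns */
    for (y = 0; y < H; y += STEP)
        for (x = 0; x + STEP < W; x += STEP)
            for (int i = 1; i < STEP; i++)
                screen[y * W + x + i] =
                    (screen[y * W + x] * (STEP - i) +
                     screen[y * W + x + STEP] * i) / STEP;

    /* 3) interpolate vertically between the now-complete rows */
    for (y = 0; y + STEP < H; y += STEP)
        for (x = 0; x < W; x++)
            for (int i = 1; i < STEP; i++)
                screen[(y + i) * W + x] =
                    (screen[y * W + x] * (STEP - i) +
                     screen[(y + STEP) * W + x] * i) / STEP;
}

The interpolation smears every traced sample over a STEP x STEP block, which is exactly where the trademark jagginess of oblique edges comes from.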

Also, in every intro of that period the objects move (even when seen on more powerful PCs) as if they were submerged in a highly viscous liquid. Coding a raytracing intro on a low-end i486 must have been hard (but that's the demoscene: pushing the limits): running it was probably like watching a bunch of polaroids in sequence.

Bumped and raytraced trees anyone?
This phase of the history of raytracing in the demoscene ends symbolically with the highest evolution of polaroid technology: "Sink" by Pulse (the group that gave us gifts like "Famous Cyber People", "73 Million Seconds" and "Square"), a largely unknown and underrated intro that tries to go beyond the "textured plane / reflecting sphere with shadow" concept and proposes more consistent scenes, like a futuristic underwater landscape, or a Paper-esque cartoonish grass vista with (bumped) trees and a sun.

After "Sink" we won't see any raytracing from Technomancer again, but someone may be working on Pulse's nextgen engine:

"soon unreal's granny will join with her texture+phong+bump+zbuffer+shadows+ hexalinear heptapentametaoctamapping+fog+rain+particles+depth focus+hair modeller+volume rendering+ radiosity+ raytracing+ perspectivecorrect 3d system that she is coding since world war II
at the moment she's workin on the speed, now she's said to have 50spf (seconds per frame)"

posted by friol at 8/17/2009 08:47:00 PM
