history of realtime raytracing - part 3

[History of realtime raytracing (part #3)................]
We need raytracing, anyway.

"balls are the new bobs" - Insectecutor

"When art critics get together they talk about Form and Structure and Meaning.
When artists get together they talk about where you can buy cheap turpentine." - Pablo Picasso


Did you think we were dead? Yes, we were, indeed. And then your scientists invented electricity.
Merry Christmas.




We left raytracing dying out just after the release of Heaven Seven. In the early '00s the demoscene turns towards an orgy of 3d scenes, ever more crowded and detailed as graphics cards give away more raw power for free. Or maybe the demoscene is simply dazzled by the German rise of Farbrausch, who lead the way with robots, discoball-like dancing men and supermarket carts.

But everything has an end, even the insatiable thirst for power. And at the millionth polygon, the scene searches for new directions.

And how do you innovate, if not by going back to the roots of dear old raytracing?

Where it all started (again)
The first to recognize that the new graphics cards offer more than polygon-crunching power are the almost unknown Castrum Doloris, who incidentally name the new scene's manifesto "ps_2_0" (after the code directive that tells the graphics card to use pixel shaders 2.0), and once again propose the undead classic of shadow-casting spheres in a hollow cylinder. But somehow, partly because of the (coder) colours, partly because the scene's resolution makes it look like a mid-nineties intro, the pioneering attempt of Castrum Doloris gains no followers.

The scene goes on wandering among hundreds of polygons, with brief stops in the field of accelerated raytracing, as in the Tron-esque and beautiful "Fair play to the queen" by Candela, where a running robot escaping from something passes behind three metaballs with specular bumps (yes, made with pixel shaders).

Rudebox - toruses stomping
And then, suddenly, the scene has a revelation. The "revelation" is not so different from Castrum Doloris' innovative intro of two years earlier, but:

1) it doesn't have coder colours
2) it shows much much higher resolution
3) it's 4k in size (OMG, 4k)

Its name is "kinderpainter", and the author is none other than iq of rgba.

A new era is born. Everybody realizes that raytracing, and fast raytracing at that, is possible on the GPU, and suddenly everyone starts making 4k intros with raytracing. But why is raytracing so fast on modern GPUs? We asked the author of kinderpainter himself:

"Well, in fact it depends a bit. Raytracing is also damn fast on CPUs today. With multithreading, SSE coding and cache-friendly design you can get several million rays per second on a regular CPU, which allows realtime raytracing of huge 3d models with millions of polygons. You can find here a few screenshots of my own CPU realtime raytracing.

Now, democoders are in general not interested in real raytracing - we are quite behind the raytracing community. We are still tracing rudimentary spheres and planes like in the 80s, which is completely unimpressive to those who are really doing raytracing today. If you speak about raytracing as understood in the demoscene (spheres and planes), then yes, the GPU is very fast, faster than the CPU.


Muon Baryon - love the grid at the bottom
Real raytracers are usually memory-bandwidth bound, not compute bound, because huge kd-tree/BIH structures have to be traversed and few polygons accessed per ray. That's why the fastest raytracers out there run on the CPU and not the GPU.
However, in simple scenes like those in the demoscene, without kd-trees/BIH or complex data structures and geometry, the GPU really shines at brute-force rendering. That's because having no memory accesses means the ALUs of the shader units can work without stalling. And all these ALUs work in parallel, because in a raytracer every pixel of the screen is independent of the others. So for demoscene-style raytracing the GPU is really a good option. Also, the setup needed is minimal, so it's perfect for 4k productions. Lastly, as GPUs get more powerful by adding more shader units (arithmetical power), these simple raytracers will benefit more and more, just like the rendering of fractals, which is a pure computation problem.

However, the CPU is still probably the best/fastest option for real raytracing with not ten or a hundred "stupid" spheres but millions of polygons. NVidia, however, is trying to get real raytracing running fast on the GPU too, although only for a limited number of polygons (say, a million)."
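iq's distinction between heavy, memory-bound "real" raytracing and the demoscene's brute-force spheres-and-planes style is easy to make concrete. Below is a minimal, hypothetical sketch in Python (shader-style per-pixel code translated to plain scalar math, not anyone's actual intro code): one analytic sphere, one checkered floor plane, one primary ray per pixel and no acceleration structure at all. Every pixel is computed independently of the others, which is exactly the property that lets a GPU run the same loop in parallel without stalling on memory.

```python
import math

def trace(ox, oy, oz, dx, dy, dz):
    """Shade one ray against a toy scene: a unit sphere plus a floor plane.
    Returns a grey level in [0, 1]. Pure arithmetic, no data structures."""
    # Unit sphere at (0, 1, 5): solve |o + t*d - c|^2 = 1 for t
    # (quadratic with b = l.d and c = l.l - r^2, where l = o - c).
    cx, cy, cz = 0.0, 1.0, 5.0
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    b = lx*dx + ly*dy + lz*dz
    c = lx*lx + ly*ly + lz*lz - 1.0
    h = b*b - c
    if h >= 0.0:
        t = -b - math.sqrt(h)
        if t > 0.0:
            # Fake sky light: shade with the y component of the normal.
            ny = oy + t*dy - cy
            return 0.5 + 0.5*ny
    # Floor plane y = 0: hit where oy + t*dy = 0, checkerboard shading.
    if dy < 0.0:
        t = -oy / dy
        x, z = ox + t*dx, oz + t*dz
        return 0.2 if (int(math.floor(x)) + int(math.floor(z))) % 2 else 0.4
    return 0.0  # miss: sky

def render(width=32, height=16):
    """One primary ray per pixel; each pixel is fully independent."""
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            # Camera at (0, 1, 0) looking down +z, simple pinhole projection.
            u = (i + 0.5)/width*2.0 - 1.0
            v = 1.0 - (j + 0.5)/height*2.0
            n = math.sqrt(u*u + v*v + 1.0)
            row.append(trace(0.0, 1.0, 0.0, u/n, v/n, 1.0/n))
        rows.append(row)
    return rows
```

On a GPU the body of `trace` would live in a pixel shader and the two nested loops in `render` would simply disappear, replaced by the rasterizer invoking the shader once per pixel.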


After this "breakthrough", it becomes possible to reproduce in realtime the patinated covers of the early-'80s magazines from the first chapter. A cycle closes within the scene. The still image of "Photon race" now moves, amid translucency and refractions, in "Photon race 2". Every surface that once seemed made of plastic now reflects everything around it or bends the rays of light, reaching a realism never seen before at this level of detail and at these resolutions. 3d scenes are no longer penalized by the jagginess of triangles, but gain new and unexplored roundness.

And finally, raytracing
The possibilities offered by pixel shaders motivate the scene to go beyond the classic spheres and planes (contrary to what iq says).

In "Yes we can", the Obama-sponsored 4k, Quite use raytracing to create kaleidoscopic plasmas, curved and distorted surfaces that alternate with rainbows of flexuous lines, billions of dust particles generating infinite geometries.

In "Nevada", Loonies recreate cavernous lunar passages, wet and barely lit underground tunnels, but above all creased tentacles that sink their roots into a lava-like substance and stretch out towards a distant, incandescent star.

In "dollop", SQNY use isosurfaces to build an environment reminiscent of the underwater abyss, cutting seamlessly to the soft clouds of a sunny day, disturbed only imperceptibly by an electric, psychedelic shell that surrounds the observer.

Raytracing is born again.
Someone calls it "raymarching", but the line between the two techniques is thin.
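That thin line can be drawn in code. Classic raytracing solves the ray/surface intersection analytically (the quadratic for a sphere, as above), while raymarching, in the sphere-tracing form used by many of these intros, steps along the ray guided by a signed distance field and never solves anything in closed form. A minimal, hypothetical Python sketch of the marching loop, with a single toy sphere as the scene:

```python
import math

def scene_sdf(x, y, z):
    """Signed distance from a point to the scene: here just a sphere of
    radius 1 at (0, 0, 5). Real intros combine many such distance fields."""
    return math.sqrt(x*x + y*y + (z - 5.0)**2) - 1.0

def raymarch(ox, oy, oz, dx, dy, dz, max_steps=64, eps=1e-3, far=20.0):
    """Sphere tracing: advance along the ray by the distance-field value,
    which is always a safe step. The surface is wherever the distance
    drops below eps; there is no analytic intersection solve."""
    t = 0.0
    for _ in range(max_steps):
        d = scene_sdf(ox + t*dx, oy + t*dy, oz + t*dz)
        if d < eps:
            return t       # hit: distance along the ray
        t += d
        if t > far:
            break
    return None            # miss
```

Either way the result is a hit distance along a ray, and the shading, shadowing and reflection machinery built on top of it looks the same; that shared skeleton is what keeps the line between the two techniques thin.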

4k intros, a genre that was drying up as the use of 3d raised the bar further and further, come back, and in democompos the number of these little gems exceeds that of all other productions. The newly born "still images in 4k" genre grows into a robust youth, leading to masterpieces like "organix" or "slisesix".

A spark generates an explosion, if you have explosives.

6 comments:

  1. So many demos I haven't seen before, thanks for this great overview!

  2. What necessary words... super, an excellent phrase

  3. raymarching and raytracing are two completely different things! Do your homework, think about it, then decide whether to rephrase many of those sentences.

  4. On the contrary, I think they are very similar; for example:

    1) both concepts are similar: trace a ray from the origin towards the scene
    2) they are both techniques "opposed" to triangle rasterization
    3) the way you calculate reflections, shadows, etc. is the same

    Anyway, you can think of the series as "History of realtime raytracing and raymarching" if you want. The purpose of the articles is to discuss demoscene prods that show 3d worlds and are not triangle rasterization based.

  5. This comment has been removed by a blog administrator.

  6. This comment has been removed by a blog administrator.
