In comparison to rasterization, ray tracing does a perfectly good job of producing correct reflections.

However, from a psychovisual standpoint, correct reflections are the least important thing to look for. Just use a dedicated cube map per scene with environment mapping, and focus on the stuff that really matters and that everybody immediately notices: high-resolution textures, proper anisotropic filtering and "HDR" lighting.



Textures, antialiasing, and applying filters to make the rendered image look nice are all good, but I would argue that what really matters and everybody immediately notices is the realistic modeling of ambient light. Proper global illumination is often what makes the difference between a sort-of-realistic-looking scene and something that's hard to distinguish from a real image. The best known methods of doing global illumination are all based on tracing a lot of rays.

Physically correct reflection and refractions are nice to have, but those aren't the only reasons to prefer ray tracing.


Ray traced shadows are noticeably better than rasterized shadows.

In practice, ray tracing will never fully take over rasterization. At best, a standardized ray-tracing API will take over the few calculations that the ray-tracing approach does better.

I.e. mirrors, ambient occlusion, and shadow generation. Everything else (texturing, among other things) will likely remain rasterized.


Why wouldn't some form of ray tracing or path tracing eventually take over once we have the necessary hardware and software and standard APIs to do it at high resolution and high framerates? What features does polygon rasterization give us that can't be replicated any other way?


If there is a limit to FPS and resolution where we say "OK, that's enough," then the yearly advances in computing power could go toward more ray tracing instead. At one point I really thought that limit was 1080p. But later I thought it was 2K, then 4K, then 2x8K (one per eye in VR).

Now I honestly have no idea what the "endgame" is, but if it's say 8K@120 for single screens and twice that for VR, then that's two to three orders of magnitude away from what a regular gaming card does today. The problem with numbers that big is that it's hard to say what the tech landscape will look like by the time we get there.

The killer feature of rasterization is that it gives an illusion of correctness at a millionth of the computational effort. Apart from efficiency I can’t see many attractive properties.


For VR at 120 FPS you can do eye tracking and render only part of the image at 8K, keeping the rest at much lower resolution. Interlacing 8K images at 120 FPS is also viable.
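
As a sketch of that idea (foveated rendering), per-tile resolution can simply fall off with distance from the tracked gaze point; the thresholds below are invented for illustration:

    def tile_resolution_scale(tile_center, gaze, full_scale=1.0):
        # Foveated rendering: full resolution only near the tracked gaze point,
        # falling off to quarter resolution in the periphery (thresholds invented).
        dx = tile_center[0] - gaze[0]
        dy = tile_center[1] - gaze[1]
        dist = (dx * dx + dy * dy) ** 0.5      # distance in normalized screen units
        if dist < 0.1:
            return full_scale                  # fovea: native 8K
        if dist < 0.3:
            return full_scale * 0.5            # near periphery: half resolution
        return full_scale * 0.25               # far periphery: quarter resolution

    print(tile_resolution_scale((0.52, 0.5), (0.5, 0.5)))  # near the gaze point -> 1.0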

Also, resolution has increased really slowly over time; people generally don't care that much. 1080p is 20 years old at this point.


Pro games are often played at 2K@240 now; it never ends


If you can do ray tracing fast, you can do rasterization faster. And so if the ray tracing is not critical, you're wasting time computing it.


I don't think it's a waste of time. Ray tracing puts you on the path to real-time global illumination, which I think everyone is going to want if it's in reach. It also makes a lot of non-graphics tasks easier, like determining line-of-sight visibility between arbitrary points in the scene or doing collision detection.
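
As a rough illustration of the non-graphics side: a line-of-sight check is just one occlusion ray cast against whatever scene representation you already have. A minimal sketch, assuming a toy scene made of spheres (nothing engine-specific):

    import math

    def ray_sphere_hit(origin, direction, center, radius, max_t):
        # Solve |origin + t*direction - center|^2 = radius^2 for t in (0, max_t).
        oc = [o - c for o, c in zip(origin, center)]
        b = sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - c
        if disc < 0.0:
            return False
        t = -b - math.sqrt(disc)
        return 1e-4 < t < max_t

    def has_line_of_sight(p0, p1, spheres):
        # Cast one shadow-style ray from p0 to p1; any hit in between blocks visibility.
        delta = [b - a for a, b in zip(p0, p1)]
        dist = math.sqrt(sum(d * d for d in delta))
        direction = [d / dist for d in delta]
        return not any(ray_sphere_hit(p0, direction, c, r, dist) for c, r in spheres)

    print(has_line_of_sight((0, 0, 0), (10, 0, 0), [((5, 0, 0), 1.0)]))  # False: blocked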

Besides, computational power isn't something we necessarily need to conserve for its own sake. We stopped using Gouraud shading when the hardware became fast enough that we didn't need it anymore. Polygon rasterization will stick around for a long time I think, but I don't think it will always be an essential part of everyone's real-time rendering technology stack.

http://www.paulgraham.com/hundred.html

> I can already tell you what's going to happen to all those extra cycles that faster hardware is going to give us in the next hundred years. They're nearly all going to be wasted.

> I learned to program when computer power was scarce. I can remember taking all the spaces out of my Basic programs so they would fit into the memory of a 4K TRS-80. The thought of all this stupendously inefficient software burning up cycles doing the same thing over and over seems kind of gross to me. But I think my intuitions here are wrong. I'm like someone who grew up poor, and can't bear to spend money even for something important, like going to the doctor.


"Real time rendering" mostly means games, and performance is a very very easy bottleneck to hit for games. Because it is easy to make things more fun and interesting by scaling them up: more entities, more interactions, more options, etc.

So I believe you are quite wrong. It is trivial to scale up a simulation to the point where you hit a performance bottleneck. The only reason we don't scale things up today is because our computers are far too slow to handle it.

And we stopped using Gouraud shading because it is horrendously ugly. We'd still be using it if it looked half as good as rasterization does.


One of the surprising characteristics of ray tracers is that if you've got a reasonably efficient acceleration structure, they're relatively insensitive to scene complexity. Say you're rendering ten thousand triangles at a certain framerate, and then you bump the resolution of your meshes up so that you've got a hundred thousand triangles. The framerate drops a little bit, like maybe 20 percent or so, but it's nowhere near a linear slowdown.
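
The intuition is that a bounding volume hierarchy lets each ray skip whole subtrees of the scene, so the cost per ray grows roughly logarithmically rather than linearly with triangle count. A toy sketch of that traversal (the node layout here is invented purely for illustration):

    def ray_hits_aabb(origin, direction, lo, hi):
        # Standard slab test: the ray misses the box if the per-axis t-ranges never overlap.
        tmin, tmax = 0.0, float("inf")
        for o, d, a, b in zip(origin, direction, lo, hi):
            inv = 1.0 / d if d != 0.0 else 1e30   # guard against axis-aligned rays
            t0, t1 = (a - o) * inv, (b - o) * inv
            if t0 > t1:
                t0, t1 = t1, t0
            tmin, tmax = max(tmin, t0), min(tmax, t1)
        return tmin <= tmax

    def traverse(node, origin, direction, hits):
        # Prune an entire subtree as soon as the ray misses its bounding box;
        # this pruning is what keeps the cost closer to O(log n) than O(n).
        if not ray_hits_aabb(origin, direction, node["lo"], node["hi"]):
            return
        if "tris" in node:                        # leaf: only a handful of triangles to test
            hits.extend(node["tris"])
            return
        traverse(node["left"], origin, direction, hits)
        traverse(node["right"], origin, direction, hits)

    # Tiny two-leaf tree: a ray going straight up at x=2 only ever visits the left half.
    tree = {"lo": (0, 0, 0), "hi": (10, 10, 10),
            "left":  {"lo": (0, 0, 0), "hi": (4, 10, 10), "tris": ["tri_a", "tri_b"]},
            "right": {"lo": (6, 0, 0), "hi": (10, 10, 10), "tris": ["tri_c", "tri_d"]}}
    hits = []
    traverse(tree, (2.0, -1.0, 5.0), (0.0, 1.0, 0.0), hits)
    print(hits)  # ['tri_a', 'tri_b'] -- the right subtree was culled without testing its triangles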

You can always slow the rendering times way down by adding a lot of reflection and refraction, but those are things you can't really do in a general and physically-correct way in a polygon rasterizer.
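
For what it's worth, the per-hit math for reflection and refraction is cheap; the cost is the extra recursion each bounce spawns. A quick sketch of the standard direction formulas, assuming normalized vectors:

    import math

    def reflect(d, n):
        # r = d - 2 (d . n) n, with d the incoming direction and n the surface normal.
        dn = sum(a * b for a, b in zip(d, n))
        return tuple(a - 2.0 * dn * b for a, b in zip(d, n))

    def refract(d, n, eta):
        # Snell's law with eta = n1 / n2; returns None on total internal reflection.
        cos_i = -sum(a * b for a, b in zip(d, n))
        k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
        if k < 0.0:
            return None
        return tuple(eta * a + (eta * cos_i - math.sqrt(k)) * b for a, b in zip(d, n))

    print(reflect((0.707, -0.707, 0.0), (0.0, 1.0, 0.0)))  # bounces back up: (0.707, 0.707, 0.0)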


Gouraud shading and rasterization are not mutually exclusive. Rasterization is a technique for getting from polygons to pixels; Gouraud shading is a technique for assigning colours to those pixels. Even flat shading is usually done with rasterization.
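
To make the distinction concrete: rasterization decides which pixels a triangle covers, and Gouraud shading just reuses the same barycentric weights to blend the vertex colours. A rough sketch, not tied to any particular API:

    def barycentric(p, a, b, c):
        # Barycentric weights of point p with respect to screen-space triangle (a, b, c).
        den = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
        w0 = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / den
        w1 = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / den
        return w0, w1, 1.0 - w0 - w1

    def shade_pixel(p, verts, colors):
        # Rasterization uses the weights to decide coverage; Gouraud shading then
        # blends the vertex colours with the same weights. Flat shading would just
        # return one colour for every covered pixel instead.
        w = barycentric(p, *verts)
        if min(w) < 0.0:
            return None                     # pixel not covered by the triangle
        return tuple(sum(wi * col[k] for wi, col in zip(w, colors)) for k in range(3))

    tri = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
    cols = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]        # red, green, blue vertices
    print(shade_pixel((2.0, 2.0), tri, cols))        # smoothly blended colour (0.6, 0.2, 0.2)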


Achieving "photorealistic" results via rasterization requires a lot of dirty tricks and slight of hand. Supporting it requires a lot of (expensive) hard work from graphics programmers and content artists.

The content pipeline is relatively simple for ray-traced content: just define the material properties of the surfaces of your scene, and the transport properties of the media your light is flowing through, and the light bounces where it bounces.
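
As a toy illustration of how small that description can be (the field names here are made up, not any particular renderer's schema):

    from dataclasses import dataclass

    @dataclass
    class Material:
        # Surface description the ray tracer samples directly: no baked lightmaps,
        # no hand-placed reflection probes, no per-effect shader hacks.
        albedo: tuple = (0.8, 0.8, 0.8)
        roughness: float = 0.5
        metallic: float = 0.0
        ior: float = 1.5                  # index of refraction for dielectrics
        emission: tuple = (0.0, 0.0, 0.0)

    @dataclass
    class Medium:
        # Participating medium (fog, water, glass interiors): how the volume
        # absorbs and scatters light per unit distance.
        absorption: tuple = (0.0, 0.0, 0.0)
        scattering: tuple = (0.0, 0.0, 0.0)

    gold = Material(albedo=(1.0, 0.77, 0.34), roughness=0.2, metallic=1.0)
    fog = Medium(scattering=(0.05, 0.05, 0.05))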

The limiting factor for real-time graphics applications is generally the cost of content creation. Real-time ray tracing could mitigate that cost.


Not necessarily - rasterisation can suffer from "overdraw", which raytracing doesn't (although it has its own issues with acceleration structure requirements, especially where moving objects in the scene are concerned, so it's not a complete win).

One of the reasons VFX has moved to raytracing/pathtracing is that it copes with complexity a lot better in general (we're talking scenes with 500 million uninstanced triangles).

It also allows you to do much more accurate camera projections like spherical / fisheye (which could be useful for VR). Doing things like omni-directional spherical 3D projection in rasterisation requires tricks or warping in 2D, whereas it's pretty trivial to trace rays from the camera and do it completely accurately with raytracing.
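
For example, generating primary rays for an omni-directional (equirectangular) projection is a direct pixel-to-direction mapping, with no 2D warping pass afterwards. A small sketch, with the axis conventions chosen arbitrarily:

    import math

    def spherical_camera_ray(x, y, width, height):
        # Map pixel (x, y) to a direction on the full sphere: longitude covers
        # 360 degrees across the image, latitude 180 degrees top to bottom.
        lon = (x + 0.5) / width * 2.0 * math.pi - math.pi
        lat = math.pi / 2.0 - (y + 0.5) / height * math.pi
        return (math.cos(lat) * math.sin(lon),
                math.sin(lat),
                math.cos(lat) * math.cos(lon))

    print(spherical_camera_ray(960, 540, 1920, 1080))  # centre pixel looks straight ahead (~ +z)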


Ray tracing WILL DEFINITELY fully take over rasterization once hardware capable of doing it is commoditized.

If for no other reason than the huge amount of effort it will save game developers and game engine developers. Not needing to perform all those hacks to approximate what offline renderers can do. Not needing to write those features into engines at all... It will be so much simpler.


> Ray tracing WILL DEFINITELY fully take over rasterization once hardware capable of doing it is commoditized.

That's not what this demo is about, and what you wish for is decades away at best. DirectX realtime raytracing relies upon a rasterization step to estimate the light for most of the objects on the screen.

https://blogs.msdn.microsoft.com/directx/2018/03/19/announci...

--------------

> Not needing to perform all those hacks to approximate what offline renderers can do.

The Gooseberry benchmark (Blender's hardest test) takes 35 minutes to render on my 16-core / 32-thread Threadripper 1950x, while GPUs simply crash because literally one frame of animation requires 12GB of RAM (far more than what's available on commodity GPUs).

Offline renderers are about five orders of magnitude too slow for realtime, and their RAM usage is about an order of magnitude too large for conventional ~2GB or 4GB GPUs.

We're at the stage where SOME simplified raytracing effects can replace SOME aspects of the rasterization pipeline, and only if you buy 4x Tesla V100s (and that's expensive: $8k+ per Tesla V100 == $32,000 on GPUs alone). Furthermore, there were only ~1 to ~2 rays per pixel. All the ray-tracing effects were processed through a neural network to denoise them, and the rays are also averaged through history (so this technique only works on slow-moving objects).
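
The history averaging mentioned above is essentially an exponential moving average of each pixel's noisy result; a stripped-down sketch of the idea (real implementations also reproject and reject stale history, which is exactly what breaks on fast motion):

    def accumulate(history, current, alpha=0.1):
        # Blend this frame's noisy 1-2 samples-per-pixel result into the running
        # average. Small alpha = smoother result, but more ghosting on moving objects.
        return [[tuple(h * (1.0 - alpha) + c * alpha for h, c in zip(hp, cp))
                 for hp, cp in zip(hrow, crow)]
                for hrow, crow in zip(history, current)]

    # One grey pixel converging toward a noisy white signal over a few frames.
    history = [[(0.5, 0.5, 0.5)]]
    for _ in range(5):
        history = accumulate(history, [[(1.0, 1.0, 1.0)]])
    print(history[0][0])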

That's what this demo represents. A LOT of compromises, but hey, they achieved a very low-end realtime raytracer and integrated it into a standard rasterization pipeline to provide mirroring and ambient occlusion.

It's an impressive feat for sure. But we're nowhere near reproducing movie-quality / offline quality in realtime (~64 or ~1000 rays per pixel).


I didn't say we're close. I said raytracing will eventually replace rasterization. And it will.


"ust use a dedicated cube map for a scene"

Most scenes with baked lighting these days use many (sometimes hundreds or thousands of) cube maps. One per scene tends to look really bad.

Unreal Engine (without the raytracing stuff) uses something similar to this:

https://seblagarde.wordpress.com/2012/09/29/image-based-ligh...

Cubemaps are blended together and traced against simple box and sphere geometry.
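
A condensed sketch of the "traced against simple box geometry" part, i.e. a parallax-corrected cube map lookup: intersect the reflection ray with the probe's proxy box and use the vector from the probe centre to the hit point as the lookup direction. The function and naming below are mine, not Unreal's:

    def parallax_corrected_lookup(pos, refl, box_min, box_max, probe_center):
        # Intersect the reflection ray with the proxy box (slab exit distance)...
        t_exit = float("inf")
        for p, d, lo, hi in zip(pos, refl, box_min, box_max):
            if d > 0.0:
                t_exit = min(t_exit, (hi - p) / d)
            elif d < 0.0:
                t_exit = min(t_exit, (lo - p) / d)
        hit = tuple(p + t_exit * d for p, d in zip(pos, refl))
        # ...then sample the cube map along the direction from the probe centre
        # to that hit point, instead of along the raw reflection vector.
        return tuple(h - c for h, c in zip(hit, probe_center))

    # Shading point near one wall of a 10x10x10 room, probe at the room centre.
    print(parallax_corrected_lookup((1.0, 1.0, 1.0), (0.0, 0.0, 1.0),
                                    (0.0, 0.0, 0.0), (10.0, 10.0, 10.0),
                                    (5.0, 5.0, 5.0)))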


All those things are great - but nothing kills my immersion more than very very reflective things (cars, water, shiny metal) not actually being reflective.

Raytracing would bring that.



