I used to be active in computer graphics. Great to see an article by Whitted himself. A bit of trivia: I think _the_ pioneer in the field of ray-tracing is Albrecht Dürer [1] who ray traced by hand in the 16th century.
It always amazes me how things we think of as high-tech modern innovations were often known to people long ago. There have always been ingenious people, and sometimes they achieved things long before I would have thought possible. Archimedes almost invented calculus thousands of years ago, and the Greeks knew the distances to the moon and sun [1]. A little ingenuity can figure something out from simple principles; it's always a reminder that the people of history were living, breathing, thinking people as real as you and I, each with a unique viewpoint and achievements.
1. [https://en.wikipedia.org/wiki/On_the_Sizes_and_Distances_(Ar...]
That immediately reminded me of "Tim's Vermeer", have you seen it? Software guy Tim hypothesizes that Vermeer was tracing rays using an optical setup without strings, maybe vaguely similar to Dürer's. He sets out to recreate the device and Vermeer's paintings. It's a wonderful project.
This is both funny and awful at the same time. I've been in graphics forever, and his name alone demands respect. He's been such a cornerstone for so long that it never occurred to me that he might still be alive, let alone working at Nvidia Research! Nvidia Research seems to have scooped up all the legends.
It's a good historical anecdote about making progress in computer graphics, but ray tracing is largely an obsolete method, since 'path tracing', James Kajiya's method that stochastically solves the rendering equation, can do global illumination far more realistically (in principle, it can give true realism). In fact ray tracing cannot do global illumination and any ray tracing engine that does that is essentially including a hack on top of its core ray tracer. You can even look at the images shared in the article and see for yourself how realistic or unrealistic they look.
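For context, this is the rendering equation from Kajiya's 1986 paper, the integral that path tracing estimates by Monte Carlo sampling:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Here \(L_o\) is the outgoing radiance at surface point \(x\), \(L_e\) is emitted radiance, \(f_r\) is the BRDF, and the integral gathers incoming radiance \(L_i\) over the hemisphere \(\Omega\) around the surface normal \(n\). A path tracer estimates the integral by recursively sampling random incoming directions \(\omega_i\), which is where both the global illumination and the noise come from.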
Path tracing is not good for 'real-time' computer graphics at the moment, because GPUs still need to become more computationally powerful, though I think 2018 or 2019 is the year we'll see the first computer game based on path tracing. On the other hand, ray tracing is already easily doable on GPUs, since it's computationally less expensive than path tracing. But IMO no one is interested in it, since games want either maximum possible realism or, if they want "cartoonish graphics," a specific look of their own choosing, not the look imposed on them by the renderer, and ray tracing cannot give enough control to produce any look the artist has in mind (without massive hacks).
EDIT: I just realized, nVidia is promoting ray-tracing as the future of CG, and everything they're describing are actually qualities/features of path tracing and have nothing to do with ray tracing (except for this historical article). I'm seriously perplexed. Maybe nVidia's marketing department thought the term 'ray tracing' is catchier and more people know it, so they should use it everywhere. It's common knowledge in the CG research community that ray tracing cannot do global illumination.
EDIT 2: Another possible reason is that if they start mentioning path tracing, people will start asking questions about the difference between ray tracing and path tracing, and sales people don't want to get into that. It's a bit like calling LCD displays thin CRTs out of fear that if they mention LCD, people would start asking about the difference between CRT and LCD.
There are certain aspects of light which the rendering equation doesn't model, so it's insufficient for true realism (the Limitations section of the Wikipedia page is a good summary). It's pretty damn close, though.
I feel that the "ray tracing" vs. "path tracing" question is nit-picking for a very specialized audience at best. Most people simply understand ray tracing as being about tracing rays through geometry in some way (so both ray casting and path tracing are a form of ray tracing). How you combine this tracing to produce pixel values is a higher-level question. If you look at ray-tracing frameworks / APIs like OptiX or DXR, it's clear that this combination is left up to the programmer (for good reasons). What's important for the API to expose is the primitive operation of tracing a single ray, after all.
What specifically makes path tracing not a special case of ray tracing? To me, ray tracing is anything that traces rays (unlike the typical rasterization using z-buffering), and path tracing is just a poorly-named high fidelity method of using ray tracing...
True, it doesn't. I didn't read the question carefully enough. Path-tracing is indeed a more sophisticated form of ray-tracing. In a path-tracing engine you'll still be calculating intersections to surfaces by ray-casting, ray-marching etc.
As the wikipedia page for ray-tracing states, "ray tracing is a rendering technique for generating an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects". The range of complexity possible in that definition was already discussed in the above Quora article.
Yes. I wish there were more casual acceptance for “ray tracing” as essentially light computation via ray intersections, rather than the assumption that it means “naive ray tracing” or “direct lighting ray tracing”.
Right. The internet certainly has no shortage of folks willing to answer this question, but the answers almost always imply the difference is merely rhetorical or something like a matter of opinion. Same thing with ‘biased’ vs ‘unbiased’ rendering, but that one is much less surprising.
> It's a good historical anecdote about making progress in computer graphics, but ray tracing is largely an obsolete method,
A ray tracer and a path tracer are very similar and are based on the same foundation of ray-object intersection tests, render the same kinds of geometry, and use the same kinds of acceleration structures. Lighting and textures are a little different, and the process of tracing a ray into the scene is a little different, but a ray tracer is really just a simplified path tracer.
Calling it an obsolete method seems strange, since about 90% of the code in a typical ray tracer could be re-used directly to make a path tracer, and a person setting out to create a path tracer in 2018 would probably start by making a ray tracer and then enabling path tracing once they've verified that ray tracing is working. Path tracing is more like a feature you enable than a completely different method.
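To make the "path tracing is a feature you enable" point concrete, here is a hypothetical minimal renderer (the one-sphere scene, the 0.8 albedo, and all names are my own invention, not from any of the systems discussed): the ray-object intersection routine is shared verbatim, and path tracing is literally a flag that swaps deterministic direct lighting for a recursive random bounce.

```python
import math
import random

def intersect_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit, or None.
    This routine is identical in a ray tracer and a path tracer."""
    oc = [o - c for o, c in zip(origin, center)]
    b = sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - c  # assumes |direction| == 1
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def shade(origin, direction, path_tracing=False, depth=0):
    """One diffuse sphere at (0,0,-3); only the shading step differs by mode."""
    t = intersect_sphere(origin, direction, (0, 0, -3), 1.0)
    if t is None:
        return 0.2  # background radiance
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [h - c for h, c in zip(hit, (0, 0, -3))]  # unit radius
    if not path_tracing or depth >= 3:
        # Whitted-style branch: deterministic direct lighting only.
        light_dir = (0.577, 0.577, 0.577)
        return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    # Path-tracing branch: same intersection code, but bounce a random
    # ray into the normal's hemisphere and recurse (0.8 = surface albedo).
    d = [random.gauss(0, 1) for _ in range(3)]
    norm = math.sqrt(sum(x * x for x in d))
    d = [x / norm for x in d]
    if sum(a * b for a, b in zip(d, normal)) < 0:
        d = [-x for x in d]  # flip into the visible hemisphere
    return 0.8 * shade(hit, d, True, depth + 1)
```

Everything up to the `if not path_tracing` branch is the 90%-shared code: scene intersection, hit points, normals. Averaging many `shade(..., path_tracing=True)` calls per pixel is what turns this into a (noisy) global illumination estimate.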
The thing about path tracing that isn't mentioned often is that it can take anywhere from 1/60th of a second to 16 hours to render a single frame. The images people associate with path tracers are usually of the 16-hour variety. Just because a game could technically use a path tracer as its render engine doesn't mean it will provide any visual benefit over traditional realtime rendering.
This is of course true, to the extent that it can take a long time for a path-traced image to converge to a low level of noise. But there have been significant advances in filtering techniques (e.g. temporal filters) that are already very good at removing noise from images moving at or near interactive framerates. In combination with the kinds of effects you can capture with path tracing, this will definitely provide visual benefit over rasterization.
That said, I'm not convinced that path tracing and physically-based rendering methods will ever completely replace rasterization. For some applications (especially games), the kinds of non-realistic effects you want to achieve are actually very hard to model and render using techniques that aim for realism. Probably for games we will end up with hybrid rendering that uses both at the same time.
- What you said is true of any rendering method you use. If you have a complex scene, it'll take more time to render one frame.
- What I said about 2018 or 2019 is what nVidia's promotion is all about. See the demos linked at the bottom of my comment. They keep calling it 'real-time ray tracing' but it's really 'real-time path tracing'. And it's already happening.
I don’t think that’s realtime path tracing. It looks to me like they’re doing ray traced reflections and shadows and combining that with light probe global illumination. It’s cool, but not revolutionary.
Good to see that they are following the universal law of raytracing demos: a reflective sphere stuck randomly into the scene (at just under two minutes into the first video)
If you're trying to demonstrate the 'real world usefulness' of raytracing in games, the appearance of shiny ball bearings is a clear sign that you've run out of ideas.
These samples just don’t look that impressive to me. It looks like they are just milking some tweaks in each one.
[1] Looks really good but only one light source and barely even a bounce of light. Nice backlight halo, but that’s not ray-tracing.
[2] Obviously a lot going on here and some progress of some sort but if only all the reflections weren’t so blurry. What’s up with that?
[3] 99% of what makes this scene look good is the PBR texture tuning. This lighting does not look good and nothing is moving. Very few reflections and small but they do look proper.
Path tracing doesn't exist without ray tracing. Tracing a ray is one part of tracing a path, because a path is a connected chain of ray segments.
"Whitted ray tracing" or "recursive ray tracing" are sometimes used to refer to a complete renderer or complete shading technique used in the 80s that is not global illumination. That sounds like what you're thinking of. But that's not today's meaning of ray tracing. "Ray tracing" can (and these days normally does) refer to a visibility primitive that is used in many different kinds of renderers, and that's what NVIDIA is promoting, that's what graphics people mean by "ray tracing" these days.
> ray tracing is largely an obsolete method
Recursive ray tracing as a rendering technique might be out of fashion, but tracing rays is far from obsolete. Exactly the opposite, it's about to challenge rasterization.
> In fact ray tracing cannot do global illumination and any ray tracing engine that does that is essentially including a hack on top of its core ray tracer.
Not true at all. Path tracing is done using a ray tracing engine. There are several other popular techniques for GI on top of ray tracing engines as well, e.g., photon mapping. There aren't many GI renderers that don't use ray tracing these days. It's possible to avoid ray tracing, but not at all common in practice.
> I just realized, nVidia is promoting ray-tracing as the future of CG, and everything they're describing are actually qualities/features of path tracing and have nothing to do with ray tracing (except for this historical article). I'm seriously perplexed.
I think you're perplexed because you don't quite understand what "ray tracing" means. NVIDIA is promoting ray tracing as a visibility primitive that can replace rasterization. The implications of this are that 1) path tracing and other forms of global illumination using the GPU are now possible, 2) visibility queries & acceleration structures will have GPU support and can be done very fast. 3) We get things like instancing at a much lower cost than with rasterization. 4) Lots of ad-hoc special effects can be done in a simpler and more principled and complete way. I'm referring to, for example, screen space effects that games are using. AO, reflections, bounce lighting, shadows, etc. All these techniques are tricky and separate from each other. With ray tracing they can be unified. 5) Ray tracing queries can be used in games for all kinds of other things including collision detection, path planning, and audio.
Would absolutely pay up if I could get realtime raytracing for Fusion360 (Industrial/Mechanical Design software). Waiting for renders to bake only to find out that the lighting needs to be adjusted slightly can be frustratingly slow.
I don't know if they will ever integrate this, but that's what I like about path tracing.
Even on slow computers you get a photorealistic result in real time. You will have to wait a long time before all the noise is gone, but you will notice lighting problems in a second.
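The reason this works is that the Monte Carlo estimate is unbiased: even the first few samples average to the correct image, just with noise that shrinks roughly like 1/sqrt(N). A toy sketch (the pixel and its sampler are made up purely for illustration):

```python
import random

def sample_pixel():
    """Stand-in for one noisy, unbiased path-traced sample of a pixel
    whose true radiance is 0.5 (uniform on [0, 1], expected value 0.5)."""
    return random.random()

def progressive_estimate(num_samples):
    """Running average of samples: displayable after every sample,
    converging to the true radiance as samples accumulate."""
    total = 0.0
    for _ in range(num_samples):
        total += sample_pixel()
    return total / num_samples
```

After a handful of samples the estimate is already recognizably near the true value, so gross lighting mistakes show up almost immediately, while scrubbing out the last of the noise takes orders of magnitude more samples.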
Hmm, IIRC Inventor does realtime raytracing, with a first image after ~10 seconds and production-quality images after a couple of minutes (this was on an XPS 15 laptop with the GTX 960M GPU).
My guess is that it's one of the features Autodesk doesn't want to put in Fusion360 so they retain a market for the more expensive Inventor.
I don't think we can call it real-time below 6 FPS (0.17s). That's the framerate of cheap anime. And the goal should be at least 30, which is a typical video game framerate.
The only examples I have where real-time raytracing is used are demoscene productions where shapes are defined using simple mathematical primitives. (see the 4k intros "absolute territory" and "zetsubo", both from prismbeings for example).
Ten seconds is hardly real-time, that's how long Fusion takes too. You can't easily adjust the camera view when you have to wait ten seconds between adjustments.
That said, I think Fusion shows you a really low resolution image in real time, if I remember correctly.
Most renderers support near-realtime preview. If Fusion 360 doesn't, maybe write a script that watches for file changes, converts the CAD data into whatever your renderer of choice takes as input, and then displays it in preview mode?
Maya+Vray spits out a preview within seconds on my machine.
Do you have some examples you can link to of something you have rendered in a few seconds?
Fusion has some sort of simpler preview; my guess is that it's some closed-form lighting solver, and in certain contexts it can look OK, but for photorealism of a scene it's nowhere close.
I see a decent number of suggestions that people have really fast renders but I'm suspicious that we are talking about very different things.
Here is what I am typically rendering:
-Full screen on a 15" retina MBP
-Ray Traced
-In addition to the product, whole rooms/scenes that contextualize it, including 3D textured woods, books with decals that are wrapped, elements of various opacity/transmissiveness, objects with some luminance, etc.
- Some objects with very high numbers of points (there is a technical name I'm forgetting), but essentially, circles render with enough polygons to not appear as a collection of lots of small/slightly noticeable line segments
Unfortunately, there is no file. Fusion has this monstrosity where everything is stored in Autodesk servers. I don't know if you can export the files, but saving uploads them to Autodesk.
I thought KeyShot had a quick plugin for that. I would be surprised if V-Ray for GPU doesn't become available for Fusion 360 before long. I can't honestly claim to know much about Fusion 360, but I do know the CAD folks love it. Octane or Redshift is what you're really after, but they are more boutique companies, so it's probably unlikely.
[1]: https://www.researchgate.net/figure/Man-drawing-a-lute-by-Al...