14 comments:
Ah yes.. Thanks for the post! I have my fix for a little while longer! ;)
But in all seriousness, caustics rendering is one of those things that a path tracer seems to do effortlessly, but that can only be faked semi-convincingly under extremely constrained conditions, and with much effort, via rasterizers. Something as simple as tracing light as it passes through a glass on a table can add incredible depth to a scene!
I'm glad that you're still giving us updates! Otoy must have something really exciting in store!
Looking good :) How long does it take Brigade to compute this? How many samples per pixel are shown in this image?
Oh, and where does the horizontal line come from? :-)
Sean, yeah, I haven't had much time to do any serious testing with Brigade lately. A lot has been added, like instancing, motion blur, and a new kernel that can render caustics. There are still some features missing and more work is needed to optimize everything for real-time games, but we're now actually getting there, which makes me a very happy camper :)
Unidirectional path tracers are actually not good at all for caustics; you need something like BDPT or photon mapping/tracing, which, as you can see, is now implemented in Brigade. Hopefully more updates soon.
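P.S. To make the caustics point concrete, here's a toy 2D sketch (purely illustrative; nothing like Brigade's actual kernels, and every constant is made up). Photons shot from a point light refract through a glass disc and are binned where they land on the floor: the caustic concentration shows up after a single light pass, whereas a unidirectional eye path hitting the floor would almost never sample the one refracted direction leading back to a point light.

```python
# Toy 2D photon tracing: a point light above a unit glass circle,
# photons binned where they land on a floor line below.
import math, random

ETA = 1.5            # glass index of refraction (made up)
LIGHT = (0.0, 3.0)   # point light above the circle
FLOOR_Y = -2.0
BINS = [0] * 40      # histogram over floor x in [-2, 2)

def refract(d, n, eta):
    # Snell's law for unit direction d and unit normal n facing the
    # incoming side; returns None on total internal reflection.
    cos_i = -(d[0]*n[0] + d[1]*n[1])
    k = 1.0 - eta*eta*(1.0 - cos_i*cos_i)
    if k < 0.0:
        return None
    t = eta*cos_i - math.sqrt(k)
    return (eta*d[0] + t*n[0], eta*d[1] + t*n[1])

def hit_circle(o, d):
    # Smallest t > eps where o + t*d hits the unit circle, else None.
    b = o[0]*d[0] + o[1]*d[1]
    disc = b*b - (o[0]*o[0] + o[1]*o[1] - 1.0)
    if disc < 0.0:
        return None
    for t in (-b - math.sqrt(disc), -b + math.sqrt(disc)):
        if t > 1e-6:
            return t
    return None

random.seed(1)
for _ in range(100000):
    ang = random.uniform(-0.5, 0.5)              # shoot into a downward cone
    o, d = LIGHT, (math.sin(ang), -math.cos(ang))
    t = hit_circle(o, d)
    if t is not None:                            # photon enters the glass
        p = (o[0] + t*d[0], o[1] + t*d[1])
        d2 = refract(d, p, 1.0/ETA)              # outward normal of the unit circle is p
        t2 = hit_circle(p, d2) if d2 else None
        if t2 is None:
            continue
        p2 = (p[0] + t2*d2[0], p[1] + t2*d2[1])
        d3 = refract(d2, (-p2[0], -p2[1]), ETA)  # normal flipped to face the inside
        if d3 is None:
            continue                             # drop TIR photons for simplicity
        o, d = p2, d3
    if d[1] >= 0.0:
        continue
    x = o[0] + (FLOOR_Y - o[1]) / d[1] * d[0]    # propagate to the floor
    if -2.0 <= x < 2.0:
        BINS[min(39, int((x + 2.0) / 0.1))] += 1

# The bins under the glass spike well above the direct-light level:
# that spike is the caustic, found with no eye rays at all.
print(max(BINS), "photons in the brightest bin,", sum(BINS) // len(BINS), "on average")
```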
Unknown: it takes less than a second to render this image; the caustic pattern appeared instantly. The line is a multi-GPU artefact, it obviously shouldn't be there.
Awesome stuff. These are very impressive compared to the faked caustics in classic game engines.
How is the noise distributed? Do you still converge like default indirect rays?
Are you guys implementing vertex merging (bidirectional PT + photon mapping combined via multiple importance sampling)?
I'm referring to the upcoming SIGGRAPH Asia papers.
Bye
The noise looks the same, except that caustic patterns appear immediately now and are pretty well defined after only a few samples.
Anonymous: It's not vertex merging.
Very interesting! This confirms my intuition, as it can't be fast to solve caustics when rays of light can be arbitrarily bent! Of course, the same intuition told me that path tracing would be too slow to be useful in a real-time setting... :P In any case, photon mapping or bidirectional PT makes a lot more sense!
BDPT is especially interesting. I skimmed a paper some months back (though I didn't understand most of it) and took away that BDPT had convergence rates that were very competitive with other methods, and that it converged to ground truth.
Yep, BDPT is good for indoor scenes with a lot of indirect lighting; for outdoor scenes, unidirectional is faster in most cases.
Interesting...
Hey Sam, I just had an idea that I would like you to keep in mind. It's certainly off-topic, and I will try to be as succinct as possible. Here goes:
Pushing RTPT graphics to the home via the cloud is a pretty big deal and a major future use case of the technology, and it will enable games of a quality once reserved exclusively for movies. But there's something else coming of age that will transform gaming: head-mounted virtual reality.
Now, head-mounted VR is an interesting case, because latency plays a very big role in the consistency and believability of an immersive VR scene. If you move your head and the world lags behind, this can cause severe vertigo, as we are hard-wired to expect immediate changes.
Cloud streaming of games will introduce latency even over very fast connections, which poses a challenge for using the technology in a VR-type application.
But there is a solution.
To mitigate the latency effect, the render could target a Google Street View-like image. That is, the scene could be rendered onto the inside of a sphere, extending beyond the user's field of view. The client software that receives the render is then free to look around within the current frame and refresh the view far faster than the server can send new frames.
If you want to get crazy, you can also account for depth (and perhaps even overdraw) to more accurately display parallax (with a full-frame OpenGL shader) as the user's view yaws, pitches, and rolls, with only a slight bump in complexity to the client software (and the render). In this way, the subtle parallax effect can be accurately preserved in the high-quality final frame!
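To pin the geometry down, here's a hypothetical client-side sketch (the function names, the equirectangular panorama format, and the 90-degree FoV are all assumptions of mine, nothing Otoy has announced): the server streams a panorama wider than the user's FoV, and the headset re-samples it under the latest head rotation, so look-around latency is decoupled from network latency.

```python
# Hypothetical client-side reprojection: re-sample a streamed
# equirectangular panorama under the newest head rotation.
import math

def dir_to_equirect(d, width, height):
    # Map a unit view direction to pixel coords in the panorama
    # (longitude along x, latitude along y; -z is "forward").
    u = 0.5 + math.atan2(d[0], -d[2]) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, d[1]))) / math.pi
    return int(u * (width - 1)), int(v * (height - 1))

def reproject(pano, pano_w, pano_h, yaw, pitch, out_w, out_h,
              fov=math.radians(90)):
    # Build a pinhole view for the latest yaw/pitch without waiting
    # for a new server frame.
    f = 0.5 * out_w / math.tan(0.5 * fov)
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    out = [[0] * out_w for _ in range(out_h)]
    for j in range(out_h):
        for i in range(out_w):
            # Camera-space ray through this pixel (right-handed, -z forward),
            # rotated by pitch (around x) and then yaw (around y).
            x, y, z = i - 0.5 * out_w, 0.5 * out_h - j, -f
            y, z = cp * y - sp * z, sp * y + cp * z
            x, z = cy * x + sy * z, -sy * x + cy * z
            length = math.sqrt(x*x + y*y + z*z)
            px, py = dir_to_equirect((x/length, y/length, z/length),
                                     pano_w, pano_h)
            out[j][i] = pano[py][px]   # nearest-neighbour; a real client would filter
    return out

# Dummy 64x32 panorama; re-render a small view with the latest head pose.
pano = [[(x + y) % 256 for x in range(64)] for y in range(32)]
view = reproject(pano, 64, 32, yaw=0.2, pitch=-0.1, out_w=16, out_h=12)
```

Streaming a per-pixel depth buffer alongside the colour would let the same loop offset each sample by the head translation before the lookup, which is the parallax refinement I described above.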
It's just a thought. It would be lovely to be able to walk through a Brigade render in virtual reality someday. :)
Just so there's zero confusion: I take no ownership of the idea above, and I release it into the public domain, for use without warranty or responsibility for its implementation.
@Sam Lapere
Would it be possible to use BDPT for indoor scenes and then switch to unidirectional PT as you head outside? It would mean the best of both worlds, if possible ;)
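Just to illustrate what I mean, a hypothetical per-frame switch (everything here is invented by me, none of it is Brigade's API):

```python
# Invented sketch: pick the integrator each frame from hand-authored
# "indoor" bounding boxes; neither this function nor the boxes exist in Brigade.
def choose_integrator(cam_pos, indoor_boxes):
    for lo, hi in indoor_boxes:
        if all(lo[i] <= cam_pos[i] <= hi[i] for i in range(3)):
            return "bdpt"   # lots of indirect light: bidirectional wins
    return "pt"             # open sky: unidirectional converges faster

rooms = [((0.0, 0.0, 0.0), (10.0, 3.0, 8.0))]     # one made-up room volume
print(choose_integrator((5.0, 1.7, 4.0), rooms))   # -> "bdpt"
print(choose_integrator((20.0, 1.7, 4.0), rooms))  # -> "pt"
```

You'd presumably want to cross-fade between the two integrators for a few frames at the doorway so the noise character doesn't visibly pop.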
To me this looks like "two-way" path tracing... :-)
Hi Sam! Have you tried implementing a noise reduction filter in Brigade, like the Enhanced Bilateral filter (http://www.csie.ntu.edu.tw/~fuh/personal/NoiseReductionUsingEnhancedBilateralFilter.pdf)?
I tried this on a CPU path tracer and it works very well. Furthermore, this filter is a good match for a GPU implementation.
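For reference, the classic bilateral kernel that the paper's "enhanced" variant builds on looks roughly like this. A toy CPU version of mine, not Brigade code and not the paper's exact filter; the parameters are made up:

```python
import math

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    # img: 2D list of floats in [0, 1]; returns the filtered copy.
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = wsum = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny = min(max(y + dy, 0), h - 1)   # clamp at borders
                    nx = min(max(x + dx, 0), w - 1)
                    # Spatial weight falls off with distance; range weight
                    # falls off with intensity difference, so strong edges
                    # are preserved while flat noisy regions get smoothed.
                    ws = math.exp(-(dx*dx + dy*dy) / (2.0 * sigma_s * sigma_s))
                    diff = img[ny][nx] - img[y][x]
                    wr = math.exp(-(diff * diff) / (2.0 * sigma_r * sigma_r))
                    acc += ws * wr * img[ny][nx]
                    wsum += ws * wr
            out[y][x] = acc / wsum   # wsum >= the centre weight of 1
    return out
```

Every output pixel depends only on a small neighbourhood and is computed independently, which is why it maps so naturally onto one GPU thread per pixel.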
Florent Tournade
nonsense !!!