Wednesday, October 26, 2011

Update 2 on Exposure Render

There is now a stable release of Exposure Render, the open source direct volume renderer with physically based lighting on the GPU: http://code.google.com/p/exposure-render/

It now even runs on a laptop with an 8600M GT (and all the example datasets fit in GPU memory). The photoreal lighting makes this renderer stand out from other volume renderers, and it converges surprisingly fast. Some pictures (rendered for only 20 seconds on the ultra-slow 8600M GT):




Friday, October 21, 2011

OTOY partners with Autodesk to stream real-time path traced graphics from the cloud

Awesome news!!! In addition to its partnerships with Refractive Software (makers of Octane Render) and the development team behind the Brigade path tracer (which I blogged about here), OTOY has now announced a partnership with Autodesk to integrate Octane Render into the viewport of Autodesk's 3ds Max 2012 and to stream real-time path traced visuals for animations, movies and games from remote GPU clusters in the cloud to the client. Real-time GPU path tracing is big business, and the best part is that it's going to bring game graphics up to ultrarealistic levels in the not too distant future.

VentureBeat article (includes an image of Octane's subsurface scattering implementation): http://venturebeat.com/2011/10/20/otoy-teams-up-with-autodesk-to-create-cloud-rendering-platform/


Some quotes from the article and official press release:

Real Time Path Tracing in the Viewport - A New Level of Photorealism

OTOY's 3D rendering tools, including Octane Render™ and Brigade™, are the premier rendering solutions for next generation 'path traced' games and films.

Path tracing significantly reduces the cost and complexity of high quality rendering by automatically generating effects traditionally handled through manual post processing – including spectral rainbows, lens flares, unbiased motion blur and depth of field.

“A year ago, path tracing was considered too expensive to be used even in high-end Hollywood blockbusters. Today, thanks to advances in GPU rendering, OTOY is bringing real time path-tracing technology to millions of artists by leveraging GPU hardware that costs only a few hundred dollars. This is a game-changer on any budget,” said Jules Urbach, CEO of OTOY.

“Autodesk is the leader in 3D design software for film and video game production. We are incredibly excited about our partnership and proud to be bringing their industry leading tools to an ever-expanding market through our cloud solutions,” said Alissa Grainger, President of OTOY.

One of the images in the VentureBeat article looks like the Kajiya scene (which can now be rendered at more than 10 fps with the Brigade engine, including caustics) in an IBL environment:



Very cool!

Tuesday, October 18, 2011

Real-time path tracing of animated meshes on the GPU (2): OGRE


Another test in my unrelenting quest for real-time photorealistic graphics. This time I was inspired by one of the first animations rendered with unbiased Monte Carlo path tracing. The animation was made by Daniel Martinez Lara from Pepeland in 1999 and can be seen here:

http://www.youtube.com/watch?v=VNyknZ2zrsM

It’s one of the first animations rendered with Arnold, the Monte Carlo path tracing based production renderer developed by Marcos Fajardo that is currently taking Hollywood VFX by storm: it was used in, e.g., Cloudy with a Chance of Meatballs, 2012, Alice in Wonderland and the Assassin’s Creed: Brotherhood and Revelations CG trailers, and it is giving PRMan and mental ray a run for their money (and may well make them obsolete before long, mainly because of its ease of use and the huge savings in artist time). The animation shows a very lifelike clay figure coming to life. Despite the simplicity of the scene, the result looks very believable thanks to physically accurate global illumination and materials and extensive use of depth of field and camera shake.

In an attempt to reproduce that particular scene, I’ve used the animated Ogre model from a ray tracing demo developed by Javor Kalojanov, which can be found at http://javor.tech.officelive.com/tmp.aspx. The Ogre model (created by William Vaughan) consists of 50,855 triangles and was also used in the excellent paper “Two-level grids for ray tracing on GPUs” by Javor Kalojanov and Philipp Slusallek (really great people, by the way, whom I've had the pleasure to meet in person recently). The conversations I had with them inspired me to finally try triangles as primitives for my real-time path tracing experiments (instead of just spheres), which led to this Ogre demo. To my surprise, triangle meshes are not that much slower to intersect than spheres; I think this is because the cost of primitive intersection is becoming increasingly small compared to the cost of shading.
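The per-triangle test itself is cheap. A minimal CPU sketch of the standard Möller-Trumbore ray/triangle intersection (the usual choice in GPU path tracers; the vector helpers and names are my own illustration, not code from the demo):

```python
# Minimal Möller-Trumbore ray/triangle intersection (CPU sketch).
# Plain-tuple vector math; a CUDA version would use float3 arithmetic.

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def cross(a, b):
    return (a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0])

def intersect_triangle(orig, direc, v0, v1, v2, eps=1e-9):
    """Return the ray parameter t of the hit, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direc, e2)
    det = dot(e1, p)
    if abs(det) < eps:           # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv_det      # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direc, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv_det     # distance along the ray
    return t if t > eps else None
```

The whole test is a handful of dot and cross products with early exits, which is why, once shading dominates, the difference with a sphere test barely registers.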

The following videos show an animated Ogre path traced in real-time with real-time, per frame update of the acceleration structure of the Ogre’s 50k triangle mesh (watch in 480p):



http://www.youtube.com/watch?v=u0a0mPR8jdM

Path tracing is performed entirely on the GPU, in this case a GTS 450 (a low-end GPU by today’s standards). The framerate of the walk animation peaks at about 4 fps on my card but should be around 15-20 fps on a GTX 580. The image converges extremely fast (in about 1-2 seconds) to a very high quality result. The movement of the Ogre (translation, rotation, animation) is actually much more fluid without Fraps running; the overhead of the video capturing software almost halves performance.
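The part that makes the animation possible is rebuilding the acceleration structure from scratch every frame. A toy CPU sketch of a uniform-grid rebuild over a triangle soup (the two-level grid of Kalojanov and Slusallek is a much more refined version of this idea; function names and the binning scheme here are my own illustration):

```python
# Toy per-frame uniform grid rebuild over a triangle soup (CPU sketch).
# After the mesh deforms, every triangle is binned into all grid cells
# its bounding box overlaps; rays then only test the triangles stored
# in the cells they traverse.

def rebuild_grid(triangles, res, bmin, bmax):
    """triangles: list of 3-vertex tuples; returns dict cell -> triangle ids."""
    inv = [res[k] / (bmax[k] - bmin[k]) for k in range(3)]
    cells = {}
    for tid, tri in enumerate(triangles):
        # triangle AABB, converted to (clamped) integer cell indices
        lo = [min(v[k] for v in tri) for k in range(3)]
        hi = [max(v[k] for v in tri) for k in range(3)]
        lo_c = [max(0, min(res[k] - 1, int((lo[k] - bmin[k]) * inv[k])))
                for k in range(3)]
        hi_c = [max(0, min(res[k] - 1, int((hi[k] - bmin[k]) * inv[k])))
                for k in range(3)]
        for x in range(lo_c[0], hi_c[0] + 1):
            for y in range(lo_c[1], hi_c[1] + 1):
                for z in range(lo_c[2], hi_c[2] + 1):
                    cells.setdefault((x, y, z), []).append(tid)
    return cells
```

Because the build is just independent binning, it parallelizes well on the GPU, which is what keeps the per-frame rebuild of a 50k triangle mesh cheap enough for real-time use.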

The images below were each rendered at 20 samples per pixel in under 2 seconds on a GTS 450 (it would take less than 0.5 seconds on a GTX 580):

Note the very subtle color bleeding from the ground plane onto the Ogre:

With a glossy floor:

If you’re interested in trying this scene out yourself, send me an e-mail at sam [dot] lapere [at] live [dot] be. A CUDA enabled GPU is required (minimum compute capability 1.1).

I’m planning to build a (very) simple game with this tech. The possibilities are really endless. We're on the cusp of having truly photorealistic games.

Friday, October 14, 2011

Real-time path tracing of animated meshes on the GPU (1)

Another experiment: no spheres this time, but a real-time animated triangle mesh (the hand is from the Utah 3D Animation Repository, a textured mesh containing 15,855 triangles). The goal was to create a simple animated scene and achieve a look as close to photorealism as possible with completely dynamic, physically accurate global illumination in real-time using path tracing.

The following animation was rendered in real-time on a GTS 450 (stretching the compute capabilities of my GPU to the max):



Details will follow.

The image below was rendered with path tracing at 26 samples per pixel in 0.8 seconds on a GeForce GTS 450 (192 CUDA cores at factory clocks; it would render 3 to 4 times faster on a GTX 580, which has 512 CUDA cores and higher clocks):
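Those per-pixel sample counts come from simple progressive accumulation: each new sample is averaged into a running estimate, so the image keeps refining for as long as the camera holds still. A minimal sketch of the idea (the function names are mine, not taken from my renderer):

```python
# Progressive accumulation: fold each new per-pixel sample into a
# running mean. After n samples the variance has shrunk by a factor n.

def accumulate(estimate, sample, n):
    """Fold the n-th sample (1-based) into the running per-pixel mean."""
    return estimate + (sample - estimate) / n

def render_pixel(samples):
    """Average a stream of noisy radiance samples for one pixel."""
    estimate = 0.0
    for n, s in enumerate(samples, start=1):
        estimate = accumulate(estimate, s, n)
    return estimate
```

The incremental form avoids storing a separate sample counter per channel and lets the display show the partially converged image after every pass.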




Thursday, October 13, 2011

Update on Exposure Render

Two great looking new videos of Exposure Render, the real-time CUDA-based volume path tracer, have appeared on YouTube. It's quite cool to see these anatomical volume datasets being rendered with physically based shading in near real-time:

http://www.youtube.com/watch?v=cZaPIEo6PPs
http://www.youtube.com/watch?v=4D2HfJ5Cwqc


Someone has also put the video from the paper "On filtering the noise from the random parameters in Monte Carlo rendering" on YouTube, making it more convenient to find and watch:

http://www.youtube.com/watch?v=ZQ6TFZE5QEU

The results for filtered images using only 8 input samples per pixel are extremely impressive.

Tuesday, October 4, 2011

Unbiased subsurface scattering on the GPU in next version of Octane Render

Wow, the developers behind Octane Render never cease to amaze. After being the first GPU renderer to implement Population Monte Carlo (a more complex rendering method than plain path tracing, which borrows concepts from Metropolis light transport and energy redistribution path tracing to handle scenes with difficult lighting more efficiently), Octane Render is now adding unbiased subsurface scattering along with other features such as instancing. Here's the announcement post from Radiance (Octane's main developer) over at the Octane forum:
"We have fully working and VERY fast SSS ready for release in the next test version. It renders about as fast (a tiny bit slower) as a glossy specular material. And, it's unbiased/bruteforce SSS, eg no bias introducing photon grids or other precomputed approximations.
[...]
This is the prime new feature in the next test release, along with instancing support and FAST voxelisation, and another suprise feature, aswell as the soon to be publically released first of a series of new products, OctaneRender for 3DS Max."
Surprise feature? Refractive Software knows how to keep their audience hyped ;-)

Some screenshots with the new SSS method can be seen in this thread.

I think it will eventually be possible to implement every feature found in traditional CPU renderers on the GPU and make it an order of magnitude faster. For example, Radiance hinted at bidirectional path tracing + PMC:
"Bidirectional pathtracing (and PMC) should make renders like this one converge MUCH faster and bidirectional pathtracing + PMC is something we will be starting work on next, after 2.5 is out.
PMC + bidirectional will be ideal, it will be as efficient as the popular standard in CPU based unbiased engines (MLT+bidir), and this combined with the power of GPUs should really take things to a new level."
GPU rendering is going to redefine every area of rendering from movies, animation, visualization and design to games, simulation and virtual reality. Truly the most exciting time for rendering in decades. I'm very happy that this paradigm shift is in full swing and that things are evolving at nauseating speed :-)

Saturday, October 1, 2011

Real-time physically based volume rendering with CUDA

Just saw a very impressive video of "Exposure Render", an open source CUDA based volumetric path tracer which will be presented at EuroGraphics 2012:




The video shows an eerily realistic representation of a person scanned with CT (computed tomography), where anatomical layers of skin, muscle, cartilage, connective tissue and bone can be peeled away and rendered with photorealistic quality global illumination in real-time. This could be an extremely useful training tool for medical students and aspiring surgeons. The physically based photorealistic 3D renders of skull, bones and soft tissue could also benefit radiologists in diagnostic decision making.

Most medical volume rendering software uses ray casting, which is fast but provides very unrealistic looking lighting. With volumetric path tracing finally being feasible in real-time thanks to the GPU, this will hopefully change soon. I think we will see an implementation of real-time path traced sparse voxel octrees pop up sometime in the very near future.
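The core of physically based volume rendering is sampling free-flight distances through a heterogeneous density field. A sketch of Woodcock (delta) tracking, the standard unbiased technique for this (I don't know which exact variant Exposure Render uses; the names and parameters below are my own illustration):

```python
import math
import random

# Woodcock (delta) tracking: sample an unbiased free-flight distance
# through a heterogeneous medium by tracing against a constant
# majorant extinction sigma_max and probabilistically rejecting
# "null" collisions where the real density is lower.

def delta_track(density, sigma_max, rng, t_max=1e9):
    """Return the distance to the next real collision, or None if the
    ray escapes the medium before t_max. `density(t)` gives the
    extinction coefficient at distance t and must never exceed sigma_max."""
    t = 0.0
    while True:
        # tentative step, sampled from the homogeneous majorant medium
        t -= math.log(1.0 - rng.random()) / sigma_max
        if t >= t_max:
            return None                       # escaped the volume
        if rng.random() < density(t) / sigma_max:
            return t                          # real (non-null) collision
```

Because every candidate collision only needs one density lookup, this maps naturally onto a GPU kernel that fetches the CT volume through a 3D texture.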

An executable and source code (Oct 3rd) are available at http://code.google.com/p/exposure-render/