Saturday, June 22, 2019

LightTracer, the first WebGL path tracer for photorealistic rendering of complex scenes in the browser

A couple of days ago, Denis Bogolepov sent me a link to LightTracer, a browser-based path tracer which he and Danila Ulyanov have developed. I'm quite impressed and excited about LightTracer, as it is the first WebGL-based path tracer that can render relatively complex scenes (including textures), something I've been waiting to see happen for a while (I tried something similar a few years ago, but WebGL still had too many limitations back then).


What makes LightTracer particularly interesting is that it has the potential to bring photoreal interactive 3D to the web, paving the way for online e-commerce stores offering their clients a fully photorealistic preview of an article (be it jewellery, cars, wristwatches, running shoes or handbags).

Up until now, online shops have tried several ways to offer their clients "photorealistic" previews with the ability to configure a product's materials and colours: precomputed 360-degree videos, interactive 3D using WebGL rasterisation, and even server-side cloud-based ray tracing streamed to the browser (e.g. Clara.io and Lagoa Render), which requires expensive servers and is tricky to scale.

LightTracer's WebGL ray tracing offers a number of unique selling points:

- ease of use: it's entirely browser based, so nothing needs to be downloaded or installed
- intuitive: since ray tracing follows the physics of light, lights and materials behave just like in the real world, allowing non-rendering-experts to predictably light their scenes
- photorealistic lighting and materials: since Monte Carlo path tracing solves the full rendering equation without taking shortcuts (sketched just below this list), the results are truly photoreal
- speed: LightTracer's ray tracing is accelerated by the GPU via WebGL, offering very fast previews. This should get even faster once WebGL supports hardware accelerated ray tracing via Nvidia's RTX technology (and whatever AMD has in the works)
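
To make the "full rendering equation" point concrete: a path tracer estimates the outgoing radiance at each visible point with a Monte Carlo estimator. In standard textbook notation (nothing LightTracer-specific):

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (n \cdot \omega_i) \, \mathrm{d}\omega_i

    L_o(x, \omega_o) \approx L_e(x, \omega_o) + \frac{1}{N} \sum_{k=1}^{N} \frac{f_r(x, \omega_k, \omega_o) \, L_i(x, \omega_k) \, (n \cdot \omega_k)}{p(\omega_k)}

where f_r is the BRDF and p is the density used to sample the directions \omega_k. The estimator's error falls off as 1/\sqrt{N}, which is why the image starts out noisy and refines progressively the longer you let it run.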

LightTracer is still missing a few features, such as an easy-to-use subsurface scattering shader for realistic skin, hair and waxy materials, and there are plenty of optimisations possible (scene loading speed, UI improvements and presets, etc.), but I think this is the start of something big.

Saturday, April 6, 2019

Unreal Engine now has real-time ray tracing and a path tracer

Epic recently released the stable version of Unreal Engine 4.22, which comes with real-time ray tracing and a fully fledged path tracer for ground-truth reference images.

https://www.unrealengine.com/en-US/blog/real-time-ray-tracing-new-on-set-tools-unreal-engine-4-22

The path tracer is explained in more detail on this page: https://docs.unrealengine.com/en-us/Engine/Rendering/RayTracing/PathTracer

The following video is an incredible example of an architectural visualisation rendered with Unreal's real-time raytraced reflections and refractions:

It's fair to say that real-time photorealism on consumer graphics cards has finally arrived. In the last few years, fast and robust path tracers have become available for free (e.g. Embree, OptiX, RadeonRays, Cycles) or virtually for free (e.g. Arnold, Renderman). Thanks to advances in noise reduction algorithms, rendering times have dropped from multiple hours to a few seconds per frame. The rate at which game engines, with Unreal at the forefront, are taking over the offline rendering world is staggering. Offline rendering for architecture will most probably disappear in the near future, replaced by game engines with real-time ray tracing features.

Thursday, February 28, 2019

Looking for fullstack React developers

The Blue Brain Project is a Swiss neuroscience research project based in Geneva which pushes the boundaries of computational neuroscience. It has an ambitious goal: to explore strange new worlds, to seek out new lifeforms and to boldly go where no man has gone before by simulating a complete digitally reconstructed biological brain using a supercomputer.

We are currently looking for experienced fullstack React developers to help build a web application for visualising real-time raytraced neuroscientific data (rendered on a remote cloud).


The ideal candidate's profile:
  • 2+ years experience in full stack/frontend engineering
  • 2+ years designing, developing, and scaling modern web applications
  • 2+ years experience with React, JavaScript, HTML5, CSS3, and other modern web technologies

Technical requirements:

  • Deep understanding of asynchronous code and the observable pattern in JavaScript
  • Experience using the browser's dev tools for debugging, profiling, performance evaluation, etc.
  • Knowledge of code chunking strategies
  • Experience writing unit tests and component tests
  • Experience with version control systems (Git, GitHub, etc.)
  • Continuous integration and deployment using Jenkins
  • Knowledge of common UI/UX design patterns and ability to use them accordingly
  • Knowledge of Google's Material Design spec

Required skills:

  • TypeScript, JavaScript (ES6), React, Redux, NodeJS
  • REST and WebSocket APIs

Nice to have skills:

  • ThreeJS, D3, Python, C++
  • Docker, OpenShift, CI/CD, Webpack, Bash

Monday, February 18, 2019

Nvidia releases OptiX 6.0 with support for hardware accelerated ray tracing

Nvidia recently released a new version of OptiX, which finally adds support for the much-hyped RTX cores on Turing GPUs (RTX 2080, Quadro RTX 8000, etc.). These cores provide hardware acceleration for ray-BVH and ray-triangle intersection tests.
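
For the curious, opting in to the hardware path looks roughly like this with the OptiX 6.0 host API. This is a minimal sketch, not a tested program: the RT_GLOBAL_ATTRIBUTE_ENABLE_RTX toggle and the context calls are from the OptiX headers, but the error handling and the omitted scene setup are my own illustrative assumptions:

    #include <cstdio>
    #include <optix.h>
    #include <optixu/optixpp_namespace.h>

    int main()
    {
        // Request the RTX execution strategy *before* creating the context.
        // On Turing GPUs this routes BVH traversal and ray-triangle
        // intersection to the dedicated RT cores; on older cards OptiX
        // falls back to its software traversal.
        int rtxOn = 1;
        if (rtGlobalSetAttribute(RT_GLOBAL_ATTRIBUTE_ENABLE_RTX,
                                 sizeof(rtxOn), &rtxOn) != RT_SUCCESS)
            std::printf("RTX mode unavailable, using the default strategy\n");

        // Context setup is unchanged from earlier OptiX versions.
        optix::Context context = optix::Context::create();
        context->setRayTypeCount(1);
        context->setEntryPointCount(1);

        // ... load PTX programs, build the acceleration structure, launch ...

        context->destroy();
        return 0;
    }

OptiX 6.0 also introduces a GeometryTriangles node that hands triangle meshes directly to the hardware intersector, so built-in triangle intersection no longer needs a hand-written intersection program.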

First results are quite promising. One user reports a speedup between 4x and 5x when using the RTX cores (compared to not using them). Another interesting finding is that the speedup grows with scene complexity (geometry-wise, not shading-wise).

As a consequence, the Turing cards can render up to 10x faster in some scenes than the previous generation of GeForce cards, i.e. Pascal (GTX 1080), which is in fact two generations old if you take the Volta architecture into account (Volta was already a huge step up from Pascal in terms of rendering speed, so comparing Turing against Pascal rather than Volta makes the numbers look more flattering for Nvidia).

This post will be updated with more OptiX benchmark numbers as they become available.