Friday, June 1, 2018

Real-time path tracing on a 40 megapixel screen

The Blue Brain Project is a Switzerland-based computational neuroscience project which aims to demystify how the brain works by simulating a biologically accurate brain on a state-of-the-art supercomputer. The simulation runs at multiple scales, from the whole-brain level down to the tiny molecules which carry signals from one cell to another (neurotransmitters). The knowledge gathered from such an ultra-detailed simulation can be applied to advance neuroengineering and medicine.

To visualize these detailed brain simulations, we have been working on a high-performance rendering engine, aptly named "Brayns". Brayns uses ray tracing to render massively complex scenes, comprising trillions of molecules interacting in real-time, on a supercomputer. The core ray tracing intersection kernels in Brayns are based on Intel's Embree and OSPRay high-performance ray tracing libraries, which are optimised for recent Intel CPUs (such as the Skylake architecture). These CPUs are essentially a GPU in CPU disguise (their AVX-512 vector units descend from Intel's defunct Larrabee GPU project), but they can render massive scientific scenes in real-time because they can address over a terabyte of RAM. What makes these CPUs so fast at ray tracing is AVX-512: in combination with Intel's ISPC compiler, its wide vector instructions run many ray tracing calculations in parallel, giving CPU ray tracing performance which rivals that of a GPU and even beats it when the scene becomes very complex.
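To give a flavour of what these kernels look like from the application side, below is a minimal, self-contained sketch of tracing a single ray against a one-triangle scene with the Embree 3 C API. This is illustrative only, not Brayns code; the triangle and the ray are made up for the example:

// Minimal Embree 3 sketch: build a one-triangle scene and trace one ray.
// Illustrative only, not Brayns code. Embree vectorizes the BVH traversal
// and triangle tests internally (AVX-512 on Skylake-class Xeons).
#include <embree3/rtcore.h>
#include <cstdio>
#include <limits>

int main()
{
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene  scene  = rtcNewScene(device);

    // One triangle lying in the z = 1 plane.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* v = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    v[0] = -1.f; v[1] = -1.f; v[2] = 1.f;   // vertex 0
    v[3] =  1.f; v[4] = -1.f; v[5] = 1.f;   // vertex 1
    v[6] =  0.f; v[7] =  1.f; v[8] = 1.f;   // vertex 2
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);                  // builds the BVH

    // A ray from the origin straight down the +z axis.
    RTCRayHit rh;
    rh.ray.org_x = 0.f; rh.ray.org_y = 0.f; rh.ray.org_z = 0.f;
    rh.ray.dir_x = 0.f; rh.ray.dir_y = 0.f; rh.ray.dir_z = 1.f;
    rh.ray.tnear = 0.f;
    rh.ray.tfar  = std::numeric_limits<float>::infinity();
    rh.ray.mask  = 0xFFFFFFFFu;
    rh.ray.flags = 0;
    rh.ray.time  = 0.f;
    rh.hit.geomID    = RTC_INVALID_GEOMETRY_ID;
    rh.hit.instID[0] = RTC_INVALID_GEOMETRY_ID;

    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    rtcIntersect1(scene, &ctx, &rh);        // single-ray intersection query

    if (rh.hit.geomID != RTC_INVALID_GEOMETRY_ID)
        printf("hit at t = %f\n", rh.ray.tfar);   // prints t = 1 here

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}

In a real renderer the packet variants (rtcIntersect4/8/16) or ISPC kernels are used to keep the vector units full; the single-ray call above is just the simplest entry point.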

Besides using Intel's superfast ray tracing kernels, Brayns adds many custom optimisations which allow it to render a fully path traced scene in real-time. These are some of the features of Brayns:
  • hand-optimised BVH traversal and geometry intersection kernels
  • real-time path traced diffuse global illumination
  • OptiX real-time AI-accelerated denoising
  • HDR environment map lighting
  • explicit direct lighting (next event estimation)
  • quasi-Monte Carlo sampling
  • volume rendering
  • procedural geometry
  • signed distance field raymarching (see the sphere-tracing sketch after this list)
  • instancing, making it possible to visualize billions of dynamic molecules in real-time
  • stereoscopic omnidirectional 3D rendering
  • efficient loading and rendering of multi-terabyte datasets
  • linear scaling across many nodes
  • optimised for real-time distributed rendering on a cluster with a high-speed network interconnect
  • ultra-low-latency streaming to high-resolution display walls and VR CAVEs
  • modular architecture which makes it ideal for experimenting with new rendering techniques
  • optional noise- and gluten-free rendering
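To make the signed distance field raymarching in the list above concrete, here is a minimal sphere-tracing loop. The scene function (a single unit sphere) and the step and epsilon constants are illustrative choices for the example, not Brayns internals. The key idea is that an SDF returns the distance to the nearest surface, so the ray can always safely advance by that amount without stepping through geometry:

// Minimal sphere-tracing sketch for signed distance fields.
// Illustrative only, not Brayns code.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  add(Vec3 a, Vec3 b)  { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a)          { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }

// Example SDF: a sphere of radius 1 centred at the origin.
static float sdfScene(Vec3 p) { return len(p) - 1.0f; }

// March along the ray; returns true and the hit distance t on a hit.
static bool sphereTrace(Vec3 origin, Vec3 dir, float& t)
{
    t = 0.0f;
    for (int i = 0; i < 128; ++i) {          // max step count
        float d = sdfScene(add(origin, mul(dir, t)));
        if (d < 1e-4f) return true;          // close enough: surface hit
        t += d;                              // safe step: nearest surface is d away
        if (t > 100.0f) break;               // ray left the scene
    }
    return false;
}

int main()
{
    float t;
    if (sphereTrace({0.f, 0.f, -3.f}, {0.f, 0.f, 1.f}, t))
        printf("hit at t = %f\n", t);        // expect t close to 2
    return 0;
}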
Below is a screenshot of an early real-time path tracing test on a 40 megapixel curved screen powered by seven 4K projectors: 

Real-time path traced scene on an 8 m by 3 m (25 by 10 ft) semi-cylindrical display,
powered by seven 4K projectors (40 megapixels in total)
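(For scale: seven 4K projectors add up to 7 × 3840 × 2160 ≈ 58 million raw pixels, so the 40 megapixel figure suggests that roughly a third of each projector's image overlaps its neighbours for edge blending on the curved surface; that is my back-of-the-envelope estimate, not an official spec.)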

Seeing this scene projected life-size in photorealistic detail on a 180-degree stereoscopic 3D screen and interacting with it in real-time is quite a breathtaking experience. Having 3D molecules zooming past the observer will be the next milestone. I haven't felt this thrilled about path tracing in quite some time.



Technical/Medical/Scientific 3D artists wanted 


We are currently looking for technical 3D artists to join our team to produce immersive neuroscientific 3D content. If this sounds interesting to you, get in touch by emailing me at sam.lapere@live.be

3 comments:

Unknown said...

Wow Sam, that looks awesome! We're one step closer to a Holodeck. ;) Best of luck to you and your team!

Unknown said...

Why only a screenshot?

Unknown said...

Awesome work. I bet it's stunning to see this in motion :-)

What (and how many!) CPUs are you running this with?