Comments on Ray Tracey's blog: Brand new GPU path tracing research from Nvidia and AMD

Retina (2015-08-18 09:14):
From the info in the wccftech article you can derive that a high-end 16nm FinFET+ GPU would consume only 100 W, compared to a GTX 980 which consumes 165 W.
The new FinFET+ process is said to consume 70% less power than 28nm tech, which would mean a 50 W TDP; but because the transistor count doubles, it goes back up to 100 W. Would be cool to see.
What a world it would be if one day billions of smartphones are sold containing the lightfield chip of Magic Leap and a 10 W high-end GPU.
No more bulky desktop cases!
AMD FireRays looks like it would be a hell of a lot less work for me when I parallelize my path tracer.

Sam Lapere (2015-08-18 00:53):
Pascal's specifications look really interesting indeed. I think the 10x-faster-than-Maxwell number only applies when using half-precision floats (vs Maxwell's single precision).

Path tracing has indeed taken over the entire movie industry (with Disney's Hyperion, Pixar's RenderMan RIS renderer, Weta's Manuka, Arnold and Animal Logic's Glimpse) and has pretty much entirely displaced rasterization/REYES rendering (the last REYES movie was The Good Dinosaur).
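The TDP estimate in Retina's comment can be written out as a back-of-the-envelope model. The 165 W baseline, 70% power reduction and 2x transistor figures are the comment's assumptions, not vendor specs, and real power draw does not scale this linearly; this is only the comment's arithmetic made explicit.

```python
# Toy TDP model: scale a baseline card's TDP by a process power
# reduction and a transistor-count multiplier. All numbers here are
# the comment's rough assumptions, not measured specifications.
def estimated_tdp(baseline_watts, power_reduction, transistor_scale):
    """Assumes power scales linearly with transistor count."""
    return baseline_watts * (1.0 - power_reduction) * transistor_scale

same_transistor_count = estimated_tdp(165.0, 0.70, 1.0)  # ~49.5 W
doubled_transistors   = estimated_tdp(165.0, 0.70, 2.0)  # ~99 W
print(same_transistor_count, doubled_transistors)
```

This reproduces the comment's chain: 165 W, minus 70%, gives roughly 50 W; doubling the transistor budget brings it back to roughly 100 W.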
It's just a matter of time before path tracing takes over real-time rendering as well.

Thanks for the link to AMD's FireRays renderer, that calls for a new blog post :)

CPFUUU (2015-08-17 15:36):
I wonder if this 10x speedup is real in the end. At least Pascal will have 2x the raw power of Maxwell.
http://wccftech.com/nvidia-pascal-gpu-17-billion-transistors-32-gb-hbm2-vram-arrives-in-2016/

This Rex Computing pulls off a very good GFLOP/W ratio. If they scale it up to 200 W at 14nm, the chip could hit around 100 TFLOPS. That is some serious compute power.

It seems like research into filtering algorithms is increasing. Even Disney is now in the path tracing business :D
http://www.disneyresearch.com/publication/adaptive-rendering-with-linear-predictions/

AMD published some information about its new OpenCL renderer, FireRays:
http://developer.amd.com/community/blog/2015/08/14/amd-firerays-library/

Sam Lapere (2015-08-16 18:44):
Nvidia's upcoming Pascal GPU architecture has a feature called "mixed precision" which enables it to use half-precision (16-bit) floats. According to the marketing slides, this would purportedly make Pascal 10x faster than Maxwell in certain computations. This is hard to believe, since Maxwell is only 25% faster than Kepler in CUDA tasks (both architectures are built on 28nm, and while Maxwell is more power efficient, it has also traded double-precision for more single-precision performance).
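To see why half precision is a trade-off rather than free speed, one can round-trip values through IEEE 754 half floats using Python's struct module (the `'e'` format). This is only an illustration of fp16's limited precision, not of how a GPU pipeline works:

```python
import struct

def to_half(x):
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# fp16 keeps a 10-bit mantissa, so only about 3 decimal digits survive:
print(to_half(0.1))     # 0.0999755859375, not 0.1
print(to_half(2049.0))  # 2048.0: above 2048 the spacing between
                        # representable integers is already 2
```

For ray tracing this matters: fp16 is plenty for things like color accumulation, but ray origins and BVH traversal usually need more precision to avoid self-intersection artifacts.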
Being able to perform computations at half precision should be really interesting for ray tracing though.
I like the MIMD philosophy of Rex Computing, and the Unum stuff looks nifty. Adapteva seems to have stopped production of the Epiphany IV Parallella chips, I suppose there wasn't enough interest.

Retina (2015-08-16 15:01):
@CPFUUU:
As John Gustafson, former CTO of AMD, pointed out, it will. He's done tests on it.
Actually the number size can adapt. It would add more complexity to the circuit design, but in the end calculations would benefit greatly from it.
In an interview with VRFocus he was asked if AMD was interested in it. They rejected it. Maybe that's why he was only employed there for one year.
By the way, Syoyo Fujita, a Japanese ray tracing enthusiast, did some interesting tests on the 16-core Epiphany chip from Adapteva: https://www.youtube.com/watch?v=_t4p4Is0Z3E
It's 25x more efficient than an Intel Xeon. Adapteva wants to put up to 64,000 cores onto one chip in the future.
The source code for aobench is available on GitHub.
In this video Andreas Olofsson, the creator of the Epiphany chips, shares some amazing insights on the future of chip design:
https://www.youtube.com/watch?v=9qPJdF_soFQ
Did you know that putting RAM onto the processor can make a chip 10-25x faster? :)

CPFUUU (2015-08-16 12:38):
These Unum numbers could be a big hit.
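As a loose aside on what unums promise: unums are a variable-width format and much more than intervals, but the core idea (results that carry rigorous bounds instead of silently rounded values) can be mimicked with toy interval arithmetic. A minimal sketch, not a real unum implementation:

```python
# Toy interval type illustrating bounded-error arithmetic. Real unums
# additionally adapt their bit width and track exactness; this only
# shows the "answers come with guaranteed bounds" part of the pitch.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product's bounds come from the extreme corner products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(0.9, 1.1)        # a value known only to +/- 0.1
y = x * x + Interval(1, 1)    # bounds track through the arithmetic
print(y)                      # roughly [1.81, 2.21]
```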
I don't think they will reduce the number of rays directly in path tracing. It seems more like they will reduce the compute cost per ray?

Path tracing with 16 or 8 bits instead of 32. Maybe it can accelerate filtering techniques too.

Retina (2015-08-15 08:18):
Another thing that I find highly exciting is Unum numbers.
Invented by John Gustafson, it is a super accurate way of performing mathematical calculations on the computer, with definite solutions, by using 29-bit unum numbers instead of floating-point numbers.
I have a strong feeling that you should watch his talk with Rich Report.
Path tracing makes use of the Monte Carlo algorithm, which is a guess.

Isn't this the reason for the annoying noise? Unum numbers can find exact results to the problem. So no more noise?
John Gustafson mentions ray tracing at the end of his talk, at 47:50:
https://www.youtube.com/watch?v=jN9L7TpMxeA
Rex Computing is also thinking about eventually implementing unums on their chip.
There is a book about them called "The End of Error: Unum Computing". I am going to read that one, highly exciting!!!
It is written from a non-mathematician's perspective, so I'm not worried... haha...
With unum computing, valid information per second will be measured, no longer FLOPS...

Retina (2015-08-15 04:48):
I suspected something like that.
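On the noise question raised above: path tracing noise is the statistical variance of the Monte Carlo estimator, not rounding error, so a more exact number format would not remove it. A minimal sketch of the effect, estimating a simple integral (the function and sample counts are arbitrary illustration choices):

```python
import random

def mc_estimate(f, n, rng):
    """Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

# Estimate the integral of x^2 over [0, 1] (exact value: 1/3). The
# estimator's standard error shrinks as 1/sqrt(n): that is the "noise",
# and it persists even with exact arithmetic. It is also why a path
# tracer needs 4x the samples to halve the noise in an image.
rng = random.Random(42)
for n in (16, 256, 4096):
    print(n, abs(mc_estimate(lambda x: x * x, n, rng) - 1 / 3))
```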
I suspect that many things will change in the next 5 years.
If Magic Leap is really what they say it is, then we will probably witness a tectonic shift in the whole consumer electronics industry.
Games and 3D graphics will no longer be a big niche like they are today; they will become ubiquitous, like TV. They will be the #1 driver for the CPU/GPU industry.
There is this teenager, Thomas Sohmers, who has designed a new CPU (Rex Computing) that is 20x more efficient than a top Intel CPU:
http://www.theplatform.net/2015/03/12/the-little-chip-that-could-disrupt-exascale-computing/
That would mean you could put a high-end CPU into a smartphone. :D
Maybe it would also be possible to make a high-end GPU efficient enough to fit into a smartphone, which would be desirable because there is currently no high-speed (50+ Gbit/s) wireless connection available.
At a 10 W TDP and with a 50 Wh battery (built by Sakti3: a cheap, high-energy-density solid-state battery), the high-end desktop PC in a smartphone could become a reality. I think everyone is going to demand that, even the financial managers of Nvidia and AMD.
I really hope that Magic Leap will disrupt the circuit designer world for a reincarnation of the PC.

Sam Lapere (2015-08-14 14:24):
That doesn't sound right. It might have been because the CPU version of Cycles doesn't scale well beyond 8 cores, and throwing 18,000 octa-cores at it won't make it any faster.
There's no reason why that sort of massive scene wouldn't be possible in real time some day, provided GPU technology crawls out of the slump it's been in for the past 5 years.

Retina (2015-08-14 02:31):
I only ask because I found the discrepancy between the different render times quite high. The creator of the scene, George Kim, rendered 12 seconds of footage in 7 hours with one high-end GPU, whereas the supercomputer SuperMUC, which consists of 18,000 Intel Xeon E5 2680 octa-cores and reaches 3 petaflops of performance, needed 1 hour per frame.
My gaming dream is to one day find myself in a vast mountainous VR jungle, sitting in a mech, hiding from my opponents. haha
So that was the motivation for the question. I just wanted to know if it will ever be possible, or if I should stop dreaming forever.

Sam Lapere (2015-08-13 17:02):
LOD schemes would definitely work.
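A distance-based LOD pick of the sort discussed in this thread can be sketched in a few lines. The tier names and distance thresholds below are invented for illustration, not taken from any engine:

```python
# Minimal distance-based LOD selection: near geometry stays unique,
# mid-range objects are assembled from instanced parts, and far
# geometry is a single cloned instance.
def pick_lod(distance, near=50.0, far=200.0):
    if distance < near:
        return "unique_mesh"      # close up: fully unique geometry
    if distance < far:
        return "instanced_parts"  # mid-range: built from instances
    return "full_instance"        # far away: one cloned instance

print([pick_lod(d) for d in (10.0, 100.0, 500.0)])
```

The hard part, as the next comment notes, is not the pick itself but making the transition between tiers invisible.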
Weta used this in their GPU-accelerated PantaRay renderer to precompute geometry occlusion for the CG scenes in Avatar (feeding the results to RenderMan): https://research.nvidia.com/publication/pantaray-fast-ray-traced-occlusion-caching-massive-scenes
With Manuka they've now got a full path tracer, which can be optimized quite aggressively to deal with massively instanced geometry.

MrPapillon (2015-08-13 16:35):
Maybe a lot of stuff can be done on the LOD side too.

For example:
- at very near distance, you have fully unique trees;
- at halfway distance, you have a tree composed of instances;
- at far distance, a fully cloned tree instance.

The trick would probably be to make the transitions smooth, and to look for a correlation algorithm between LODs to avoid morphing between meshes that are too different.

Sam Lapere (2015-08-13 16:12):
I've seen it before on the Blender forum; it was rendered in Cycles. Crazy amounts of geometry are not a problem for ray tracers when using instancing, you don't need Unlimited Detail for that. The difficulty is making sure your rays don't get trapped in the dense foliage and keep bouncing around. There are ways to prevent that from happening, or at least mitigate it.

Retina (2015-08-13 09:24):
Hey Sam, quick question.
Do you think it will ever be possible to render such a thing in real time?
https://www.youtube.com/watch?v=pHLjZkpdnyQ
It was rendered on the supercomputer SuperMUC and contains half a million trees.
If not, would hybridizing with Unlimited Detail be an option?

Retina (2015-07-29 07:05):
Anonymous, what do you think about Intel and Micron's new 3D XPoint technology?
Sounds like some sort of resistive cell tech.
Intel reckons it will lead to even more new storage technologies.

Anonymous (2015-07-28 06:09):
@MrPapillon
These archviz demos are created with the Lightmass tool. You can't create landscapes with it; everything is static, and there are lots of errors in the building process. Quality is not on par with path tracing.

There's a reason why people hope for this as a better baking tool:
https://www.youtube.com/watch?v=RSpc3JBosF4

On the other hand, engine devs know that real-time raster graphics are at a dead end. Unreal and CryEngine are trying to implement cone and distance field tracing.

But these hybrid solutions are very complex, with a lot of limitations.

Retina (2015-07-28 05:09):
@MrPapillon: I like this demo too. It looks so realistic to us because the photogrammetric assets exactly match what we've stored in our brains.
With enough artistic effort you can do something amazing, I agree.
I am doing something in UE4 right now with scanned trees, rocks and such. Looks amazing! But as Sam already correctly pointed out, novelty wears off.
The amount of detail is also a very important aspect. The more geometric/pixel detail, the more forgiving I am of wrong shading. It distracts my brain from the errors because it is so busy recognizing objects. If the objects are also of a photogrammetric nature, the illusion can be nearly perfect.
Right now it is all about error hiding. With path tracing your objects could always be of an artificial, less realistic nature; it wouldn't matter, because the lighting is correct.
Although this would also look unreal, it would be not because the lighting is wrong but because our brain is not used to artificially created objects. Experience.
Concerning FullDive: I know it could work, the question is how well. It is simple.
Why is everyone talking about it? Because two things are missing: motor control and sensation.
Strangely enough, vision and audio turn out to be the 'easiest' part.
Interfacing with nerves might be possible in 20 years, but there is actually an easier way to do it, with far more public acceptance.
You would be buried in a medium so that you rest immovable, somehow fixed. You would wear an electronic skin that could very accurately measure pressure profiles (or stretching) as soon as you tried to move a limb. Sounds odd, but it would work. Try it yourself, holding one hand against the other.
With these commands you could then control a virtual avatar. But proprioception would be missing, and it is key to movement; you can't move without it. (There are a few people in the world with an illness where they can move but don't feel their limbs, and they are bound to a wheelchair.)
You can generate phantom movements by vibrating the muscle tendons.
Some researchers even want to reproduce those vibration-induced movements for advanced prostheses and virtual reality:
https://www.youtube.com/watch?v=zTW5rq6jmuo
https://www.youtube.com/watch?v=CAOBdeqov6E
What I want to stress is this: all the sensitive cells in our skin, tendons, joints and inner ear (rotation, acceleration) are pressure sensitive. Mechanoreceptors, which can be stimulated by force. Ultrasound. There was a time when neuroscientists wanted to control the brain with ultrasound. I say: why not control our whole tactile/proprioceptive system via ultrasound? Those receptors are 1000x more sensitive to force than electrically stimulated brain neurons are.
We need an electronic skin that could do that: https://www.youtube.com/watch?v=4oqf--GMNrA
@SamLapere: Maybe in 5 years Nvidia and AMD will do that. 'Unfortunately' they are client no. 1 at GlobalFoundries.
All about the money.
We will see how the world changes once Magic Leap starts selling DK1s next year.
They create visual worlds that are 100% neurologically true. They just don't do atoms... that is another company... XD (Rony Abovitz)

John (2015-07-27 19:09):
Dedicated hardware? Then you should definitely be longing for an ECL-based game console with no OS and no API, where you could do everything you wanted to. It is so simple, but no one cares. 'Tis small talk, but you'll see who was right.

Sam Lapere (2015-07-27 16:44):
I agree with MrPapillon that some of the Unreal Engine 4 videos look fantastic, especially the architectural ones with baked lighting, but the amount of artist hours, effort and skill needed to achieve such a result is disproportionate compared to what you can do with path tracing, even if that means you have to throw much more computational power at it. The baked lightmaps in the UE4 video also require your scene to be completely static, which is a huge drawback and shatters the illusion once you start moving things around. Once you're over the initial appeal of a perfectly photorealistic (but static) scene, there's actually very little incentive to keep going back to it. The novelty wears off very fast and it gets boring incredibly quickly.
A scene must be interactive to capture someone's attention for more than 30 seconds, and I believe a fully real-time, realistic lighting solution is paramount for immersion or "suspension of disbelief".

Btw, @Retina: I think Magic Leap buying their own chip facility is a great idea, but I would actually like to see someone create dedicated hardware that can do high-quality noise filtering in real time, since path tracing is already possible at 8 samples per pixel at HD resolution in real time.

MrPapillon (2015-07-27 15:52):
Retina, I don't agree with you. I'll give you this example: https://www.youtube.com/watch?v=DRqMbHgBIyY. This looks quite good, and if you don't find it good, the average Joe will. This is all about how creative people manage to create an illusion. This little video still looks much better than any Brigade video, even if it is in theory a much poorer lighting result. When you have your path tracing solution working, amazing artists and a whole tech/production team will manage to put 80x your effort into simple rasterization and put your first working path tracing sample to shame. With time it will certainly reverse at some point, because Monte Carlo path tracing, and even simple tracing with a few bounces, still has some advantages.

So in my opinion that's not where path tracing/ray tracing will shine. In my opinion, path tracing should help democratize quality, because it's very simple to just throw a set in there, change a few parameters and make it look good. So indie games could look good, teams with poor taste could make things look good, and so on. I think that is the real advantage of the technology. Another advantage is seamless LOD: foveated rendering, spatial LOD, or any LOD.
It's even easy to do it on a per-frame basis. That could also help with streaming: with rays it's easier to know what to load, even for indirect bounces, I guess.

About your idea of a "full dive" tech, I stand by my position as a programmer: if you have no prototype, an idea holds no clear value. Without confirmation, an idea is bound to fail most of the time. There are billions of "good ideas" that simply fail on the first attempt. Also, in our world the idea is not the end result. The end result is the product, so you have to have the idea and make it grow until it becomes a product. If you fail somewhere, your idea will have no real value; it might for another person, though. Ideas are great; making things happen is what is expected. Even less cool ideas shine simply because they passed the whole filter of production. Take VR, for example: the initial idea is simple, throw images into your eyes and capture movement. The correct result is still not fully available after tons of money has been thrown at it for years and years, and we still have to see whether nausea will get a definitive solution or not.

As a simple note about your proprioception idea: I had about 100 people test the Oculus DK1 back in the day. Everyone reacted differently, which was quite strange, as people tend to react uniformly when playing on 2D screens. So I think proprioception may have the same subjectivity issues, as proprioception itself is a quite subjective sense.

Retina (2015-07-27 02:30):
Nothing beats ray traced images.
Rasterized images will always look fakeish.
The Star Wars Battlefront demo from DICE really excited me, but after a week or so, as I watched gameplay demos without all the TA-TA Star Wars music, I realized it doesn't look much better than Battlefield 4. The only treat is the photogrammetry, but the shading is so wrong!
There's no way around ray tracing. Unfortunately it takes exponentially more power to get the last 10 percent of shaded pixels right. XD
By the way, Magic Leap bought its own semiconductor facility for 38 million to produce their lightfield chip: https://www.youtube.com/watch?v=bmHSIEx69TQ
I have THE idea for how to get Full Dive technology NOW, without nerve interfaces and such things... just to let you know... XD It involves proprioceptive illusions induced by vibrations, electronic skin, etc. Motor output is possible without muscle tranquilizers. XD
There is basically only one way to do full dive non-invasively.

MrPapillon (2015-07-26 22:08):
And we will also have to imagine what stuff we could do with 80 GPUs and triangle rasterizers.

Anonymous (2015-07-26 07:34):
It's from one of the Octane devs on the OTOY forum:
"To get the quality we showed at GTC you will need roughly 80 amazon GPUs, so it is not cheap or for everyone, and is cloud only. We can use filtering to get it to work on 2-4 GPUs."
https://www.youtube.com/watch?v=FbGm66DCWok

I don't know how well filtering would work.
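The filtering the Octane devs mention trades a few samples per pixel plus a reconstruction pass for many more samples. The crudest possible version of that idea is a plain neighbourhood average over a 1-D row of pixels; real denoisers are edge-aware and use auxiliary feature buffers (normals, albedo, depth), so this sketch only shows the basic trade of noise for blur:

```python
# Box-filter a 1-D list of pixel values: average each pixel with its
# neighbours within `radius`, clamping the window at the image edges.
def box_filter(pixels, radius=1):
    out = []
    for i in range(len(pixels)):
        lo = max(0, i - radius)
        hi = min(len(pixels), i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

noisy = [0.2, 0.9, 0.1, 0.8, 0.3]  # made-up noisy radiance values
print(box_filter(noisy))           # smoother, but detail is blurred too
```

The blur is exactly why dedicated denoising hardware is an interesting idea: a naive filter destroys edges, and the smarter filters that preserve them are expensive.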
But pure path tracing still needs a lot of teraflops of compute. Without dedicated ray tracing hardware, we may wait another decade to see this tech in games.

Anonymous (2015-07-25 18:24):
Where is it stated that Brigade used 80 GPUs for that demo?

Anonymous (2015-07-25 05:23):
Yes, their visualizer is not on par with the top path tracers.
I can only speak for myself, but I like the graphics a lot more than anything in today's games:
https://www.youtube.com/watch?v=ufbDH5qbRCE
These Caustics cards were manufactured at 90nm and still used the CPU for shading and textures, so I'm kind of curious what a high-end 14nm GPU plus Imagination tech could achieve.

Maybe fixed-function hardware lacks flexibility with regard to ongoing research. But if I look back on the last five years of GPU path tracing, I am optimistic :)
The marketing of course could be a lot better, but at least they promised some more demos for this year.