Really good for a first attempt!
The most visually distracting issue, from my point of view, is the ghosting effect, particularly at the start of the demo.
As for visual quality, there are still big improvements to be made, but I'm confident you already know that and are researching it... The point is that this is pure raytracing, without any additional work to prepare it for realtime... all I can say is "finally!"
For the next few years we'll probably see blended raster/raytracing solutions, much like renderers did 25 years ago, but looking at what you've achieved already, I think that in 5-8 years Unreal and Unity will be fully raytracing-based engines.
Thank you for sharing this, and my best wishes for Lavina!
I see it a bit as a race: who will get there first? Will raster graphics, which are realtime but rely heavily on fakes and approximations, reach maximum quality first? Or will raytracing, which is photoreal and physically correct, reach acceptable speed first? It's fantastic to see that you've joined the racetrack too. Really looking forward to seeing how this evolves. I've seen many realtime raytracing and GI demos over the years, but since this one comes from Chaosgroup it's one of the first I take seriously.
You could do the same things you do now with Unreal Engine (or Unity or any other realtime engine), but with the benefit of the precision and quality of a raytracer, and with little or at most moderate work to prepare assets for realtime.
Yeah, I don't need that huge a data set rendered in real time - just let me drop into an interior scene for VR exploration without all the prep and transfer time of reassembling it in Unity/UE. But I wonder what one bounce of GI will look like for an interior...
Where did you get that this is a Unity/UE thing? It says in the blurb it's rendered from a .vrscene.
I am enjoying seeing things like this progress, but I'm also very aware that it's a $6k card and it'll be a long time before I get my hands on one. Looking forward to seeing more demos, though!
For now, this is just an exploration of what the RTX technology can bring us. Like you, I was curious how far it can be pushed and what kind of results can be expected, and the best way to find out is to try and actually code something. I'm not quite sure where this will take us yet. Yes, we can trace 5 rays per pixel instead of 1, in realtime, and that helps, but final production quality requires thousands of rays. We wanted to see if denoising can fill this gap to some extent. Things can definitely be improved, and there is a lot of research going on in this area.
We are also obviously exploring RTX in terms of V-Ray GPU, but it will take a bit of time to adjust the code for this type of hardware acceleration.
Best regards,
Vlado
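To make the "a few rays plus denoising" point above more concrete, here is a toy sketch. This is purely my own illustration, not Chaos Group code and not the actual denoiser used in the demo: a Monte Carlo estimate with N samples per pixel has noise proportional to 1/sqrt(N), and a spatial filter can remove much of what remains, which is why a handful of rays per pixel followed by a denoiser can look far cleaner than the raw samples. The image, sample counts, and box filter below are all hypothetical stand-ins.

```cpp
// Toy illustration: few samples per pixel + a denoising pass vs. raw samples.
// Not Chaos Group / Lavina code; production denoisers are feature-guided or
// learned, this uses a plain 3x3 box filter just to show the principle.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    constexpr int W = 64, H = 64;
    constexpr int spp = 5;  // a hypothetical realtime budget: 5 rays per pixel
    std::mt19937 rng(42);
    std::normal_distribution<double> noise(0.0, 1.0);

    // Hypothetical "ground truth" image: a smooth gradient.
    auto truth = [&](int x, int y) { return (x + y) / double(W + H); };

    // Noisy estimate: per-sample noise stddev 1 -> per-pixel stddev 1/sqrt(spp).
    std::vector<double> img(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            double acc = 0.0;
            for (int s = 0; s < spp; ++s) acc += truth(x, y) + noise(rng);
            img[y * W + x] = acc / spp;
        }

    // Stand-in "denoiser": average each pixel with its 3x3 neighbourhood.
    std::vector<double> den(W * H);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            double acc = 0.0;
            int n = 0;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int xx = x + dx, yy = y + dy;
                    if (xx >= 0 && xx < W && yy >= 0 && yy < H) {
                        acc += img[yy * W + xx];
                        ++n;
                    }
                }
            den[y * W + x] = acc / n;
        }

    // RMS error against the ground truth, before and after denoising.
    auto rms = [&](const std::vector<double>& im) {
        double e = 0.0;
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                double d = im[y * W + x] - truth(x, y);
                e += d * d;
            }
        return std::sqrt(e / (W * H));
    };
    std::printf("RMS error: raw %.4f, denoised %.4f\n", rms(img), rms(den));
    return 0;
}
```

On this toy image the denoised error drops to roughly a third of the raw error, at the cost of spatial detail; real denoisers use normals, albedo, and temporal data precisely to avoid that detail loss.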
So... this is completely different from "traditional" GPU rendering, right?
A few more questions, if you don't mind...
1) Could this use legacy Max shaders on the CPU, or does it have the same limitations as any GPU renderer?
2) Is it still limited by VRAM?
3) Arnold showed a checkbox for GPU acceleration before; I guess this is something like that. Does that mean this tech is for V-Ray CPU? "also obviously exploring RTX in terms of V-Ray GPU" seems to imply that.