
Nvidia and GamesCom MEGATHREAD. RTX 2080/Ti (FIRST BENCHMARKS COMING OUT), NDA lifted.

LZoltowski
Champion
Made this thread so that we can all discuss the Nvidia announcements and anything else juicy being revealed at GamesCom 2018

EDIT 5

GeForce RTX 2080 3DMark Time Spy Benchmark Leak Approaches Titan Xp Performance


https://wccftech.com/nvidia-geforce-rtx-2080-3dmark-timespy-score-leaked-clocked-at-2ghz-and-beats-a...
https://hothardware.com/news/geforce-rtx-2080-3dmark-timespy-benchmark-leak-titan-xp-performance


EDIT 4

Nvidia 2080 first benchmarks:

Nvidia Shares RTX 2080 Test Results: 35 - 125% Faster Than GTX 1080

https://www.tomshardware.com/news/nvidia-rtx-2080-gaming-benchmarks-rasterized,37679.html

Over 50% under certain conditions, most likely across the board, without the use of AI-powered anti-aliasing.

We’re not expecting to average 50%-higher frame rates across our benchmark suite. However, enthusiasts who previously speculated that Turing wouldn’t be much faster than Pascal due to its relatively lower CUDA core count weren’t taking underlying architecture into account. There’s more going on under the hood than the specification sheet suggests.

This is what I mentioned earlier. The 2080 Ti has almost 50% more Turing CUDA cores than the 2080. Holy shit, that thing is going to be a beast.
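For anyone wanting to sanity-check those figures, here is a quick back-of-envelope sketch in Python (the core counts are the published specs for both cards; using core count as a rough throughput proxy is my own simplification):

```python
# Rough arithmetic behind the claims above. Core counts are the published
# specs; treating them as a direct throughput proxy is a simplification.

def pct_faster(new: float, old: float) -> float:
    """How much faster `new` is than `old`, as a percentage."""
    return (new / old - 1.0) * 100.0

# RTX 2080 Ti (4352 CUDA cores) vs RTX 2080 (2944 CUDA cores).
print(f"2080 Ti has {pct_faster(4352, 2944):.0f}% more cores")  # ~48%

# NVIDIA's claimed 35-125% uplift over the GTX 1080, as frame-rate ratios.
for uplift in (35, 125):
    print(f"+{uplift}% faster = {1 + uplift / 100:.2f}x the frame rate")
```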


EDIT 3
Here is how to watch:
  • UK: 5PM BST
  • Central Europe: 6PM CEST
  • East Coast US: 12PM EDT
  • West Coast US: 9AM PDT
  • Japan: 1AM JST Tuesday 21 August
You can watch the event Live here:

https://www.twitch.tv/nvidia?tt_content=text_link&tt_medium=live_embed


EDIT 2
The RTX 2080 is pretty much confirmed for the Cologne event, just 2 more days!

Nvidia has posted a trailer/teaser with some interesting clues about the next-gen cards.

https://www.youtube.com/watch?v=F7ElMOiAOBI

EDIT 1
Nvidia announces Quadro workstation cards based on the Turing architecture and RTX (real-time ray tracing).
Interesting tidbits from the Nvidia press release on the new Quadros:
  • New RT Cores to enable real-time ray tracing of objects and environments with physically accurate shadows, reflections, refractions and global illumination.
  • Hardware support for USB Type-C™ and VirtualLink™(1), a new open industry standard being developed to meet the power, display and bandwidth demands of next-generation VR headsets through a single USB-C™ connector.
  • New and enhanced technologies to improve the performance of VR applications, including Variable Rate Shading, Multi-View Rendering and VRWorks Audio.
Press release: https://nvidianews.nvidia.com/news/nvidia-unveils-quadro-rtx-worlds-first-ray-tracing-gpu?linkId=100000003236181
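
For context on the RT Cores bullet above: ray tracing ultimately means firing huge numbers of rays and testing each one against the scene geometry. The hardware accelerates BVH traversal and ray-triangle tests; the toy ray-sphere test below is just the simplest illustration of the per-ray math involved, not how the cores actually work:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    `direction` must be normalized; solves |origin + t*dir - center|^2 = r^2.
    """
    oc = [o - c for o, c in zip(origin, center)]          # origin - center
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))   # 2 * dot(dir, oc)
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c                                # a == 1 (normalized dir)
    if disc < 0:
        return None                                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired down -z hits a unit sphere centred 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```

A real renderer runs tests like this millions of times per frame, which is why doing it on dedicated hardware instead of the shader cores matters.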


Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂
847 REPLIES

Yep, I can't tell the difference at all. If it's baked, I assume any moving objects in that room would have stand-out poor shadows/lighting in comparison?

LZoltowski
Champion

RedRizla said:

    LZoltowski said:
        That looks suspiciously like the Ray trace version, RedRizla.

    Just shows what can be done without it...

So you're saying that because it looks good enough, we shouldn't move on and make it better?

RedRizla
Honored Visionary
@LZoltowski - I'm quite happy to play a game if it looks as good as Resident Evil does. All I'm saying is that he showed that room at its worst while he was doing the ray tracing demo. Maybe he should have shown this room in the demo instead. I'm aware of how good ray tracing is, and it will save devs time, but I want more for my £1,000 as a gamer.

LZoltowski
Champion

RedRizla said:

@LZoltowski - I'm quite happy to play a game if it looks as good as Resident Evil does. All I'm saying is that he showed that room at its worst while he was doing the ray tracing demo. Maybe he should have shown this room in the demo instead. I'm aware of how good ray tracing is, and it will save devs time, but I want more for my £1,000.

What about AI-powered anti-aliasing? No more jaggy/blurry edges. Imagine what a difference that will make in VR: you won't have to supersample as much to get a sharper image. There's also AI-powered upscaling, meaning you can drive higher-res VR screens without sacrificing as much performance.
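
To put rough numbers on that supersampling point (the per-eye resolution is the Rift CV1's 1080x1200; assuming shading cost scales with pixels rendered is my own simplification):

```python
# Pixel-count arithmetic behind the supersampling argument. Assumes shading
# cost scales with pixels shaded, which is only roughly true in practice.

RIFT_PER_EYE = (1080, 1200)  # Rift CV1 panel resolution per eye

def pixels(w: int, h: int, scale: float = 1.0) -> int:
    """Pixels shaded per eye at a given per-axis supersampling scale."""
    return int(w * scale) * int(h * scale)

native = pixels(*RIFT_PER_EYE)
ss = pixels(*RIFT_PER_EYE, scale=1.5)  # a typical 1.5x supersample

print(f"native:  {native:,} px/eye")
print(f"1.5x SS: {ss:,} px/eye ({ss / native:.2f}x the shading work)")
# Rendering at native res and letting an AI upscaler/AA pass clean up the
# image keeps shading at 1.0x; that is the appeal for VR.
```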

KlodsBrik
Expert Trustee
Wasn't the only time performance was really mentioned in yesterday's reveal when he talked about the Infiltrator demo getting 90fps on the 20xx series, versus 46-49fps on the 10xx series?
Still, they showed a demo in 4K capped at 60fps.
If the new cards had so much better performance, they would surely have spent a lot more time talking about that, rather than ray tracing all the way.

Don't get me wrong, the ray tracing looks stunning. I'm just not sure if it's worth the upgrade, price considered.

I bought my 1080 very cheap less than a year ago, so unless I can supersample a hell of a lot more than I can now in my favorite VR games (they all run at 90fps already on my system), I really don't see the need to upgrade.
Be good, die great !

RedRizla
Honored Visionary



LZoltowski said:

    RedRizla said:
        @LZoltowski - I'm quite happy to play a game if it looks as good as Resident Evil does. All I'm saying is that he showed that room at its worst while he was doing the ray tracing demo. Maybe he should have shown this room in the demo instead. I'm aware of how good ray tracing is, and it will save devs time, but I want more for my £1,000.

    What about AI-powered anti-aliasing? No more jaggy/blurry edges. Imagine what a difference that will make in VR: you won't have to supersample as much to get a sharper image. There's also AI-powered upscaling, meaning you can drive higher-res VR screens without sacrificing as much performance.


While it's good for VR, it just makes VR look even more expensive now. I'll make do with just a higher resolution in VR, which will help with the jaggy edges a bit.

LZoltowski
Champion

RedRizla said:

    LZoltowski said:

        RedRizla said:
            @LZoltowski - I'm quite happy to play a game if it looks as good as Resident Evil does. All I'm saying is that he showed that room at its worst while he was doing the ray tracing demo. Maybe he should have shown this room in the demo instead. I'm aware of how good ray tracing is, and it will save devs time, but I want more for my £1,000.

        What about AI-powered anti-aliasing? No more jaggy/blurry edges. Imagine what a difference that will make in VR: you won't have to supersample as much to get a sharper image. There's also AI-powered upscaling, meaning you can drive higher-res VR screens without sacrificing as much performance.

    While it's good for VR, it just makes VR look even more expensive now. I'll make do with just a higher resolution in VR, which will help with the jaggy edges a bit.


The 2070 has Tensor cores too, so in theory it's capable of the same AI upscaling and anti-aliasing, meaning supersampling could be offloaded to the Tensor cores.

KlodsBrik
Expert Trustee
@LZoltowski True, I forgot how good the AI anti-aliasing looked in the demonstrations.

RedRizla
Honored Visionary
@LZoltowski - What performance cost does ray tracing have in games? Shadow settings are currently something people usually look to change or turn off to increase performance in their games.

LZoltowski
Champion

RedRizla said:

@LZoltowski - What performance cost does ray tracing have in games? Shadow settings are currently something people usually look to turn off to increase performance in their games.


Since the RT and Tensor cores are independent of the CUDA cores, I would say negligible, no? Perhaps even better, as you're effectively offloading those calculations from the CUDA cores.
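
A toy frame-time model of that offloading argument (all the millisecond figures are invented purely for illustration; real pipelines have dependencies that limit how much actually overlaps):

```python
# If ray-tracing work runs on dedicated units instead of the CUDA cores, it
# can overlap with shading rather than adding to it. Figures are made up.

shade_ms = 9.0  # hypothetical shading time on the CUDA cores, per frame
rt_ms = 4.0     # hypothetical ray-tracing time, per frame

serial = shade_ms + rt_ms       # one set of units doing everything in sequence
overlap = max(shade_ms, rt_ms)  # ideal case: independent units run concurrently

print(f"serial:  {serial:.1f} ms/frame ({1000 / serial:.0f} fps)")
print(f"overlap: {overlap:.1f} ms/frame ({1000 / overlap:.0f} fps)")
```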