Nvidia and GamesCom MEGATHREAD: RTX 2080/Ti (FIRST BENCHMARKS COMING OUT), NDA lifted.

LZoltowski
Champion
Made this thread so that we can all discuss the Nvidia announcements and anything else juicy being revealed at GamesCom 2018.

EDIT 5

GeForce RTX 2080 3DMark Time Spy Benchmark Leak Approaches Titan Xp Performance


https://wccftech.com/nvidia-geforce-rtx-2080-3dmark-timespy-score-leaked-clocked-at-2ghz-and-beats-a...
https://hothardware.com/news/geforce-rtx-2080-3dmark-timespy-benchmark-leak-titan-xp-performance


EDIT 4

Nvidia 2080 first benchmarks:

Nvidia Shares RTX 2080 Test Results: 35 - 125% Faster Than GTX 1080

https://www.tomshardware.com/news/nvidia-rtx-2080-gaming-benchmarks-rasterized,37679.html

More than 50% under certain conditions, and most likely across the board without the use of AI-powered anti-aliasing.

We’re not expecting to average 50%-higher frame rates across our benchmark suite. However, enthusiasts who previously speculated that Turing wouldn’t be much faster than Pascal due to its relatively lower CUDA core count weren’t taking underlying architecture into account. There’s more going on under the hood than the specification sheet suggests.

This is what I mentioned earlier. Now the 2080 Ti has almost 50% more Turing CUDA cores than the 2080, holy shit, that thing is going to be a beast.
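(Going by the announced specs, 4352 CUDA cores on the 2080 Ti vs 2944 on the 2080: 4352 / 2944 ≈ 1.48, so "almost 50% more" checks out.)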


EDIT 3
Here is how to watch:
  • UK: 5PM BST
  • Central Europe: 6PM CEST
  • East Coast US: 12PM EDT
  • West Coast US: 9AM PDT
  • Japan: 1AM JST Tuesday 21 August
You can watch the event Live here:

https://www.twitch.tv/nvidia?tt_content=text_link&tt_medium=live_embed


EDIT 2
RTX 2080 at the Cologne event is pretty much confirmed. 2 more days!

Nvidia has posted a trailer/teaser that has some interesting clues about the next gen cards.

https://www.youtube.com/watch?v=F7ElMOiAOBI

EDIT 1
Nvidia announces Quadro workstation cards based on Turing technology and RTX (real-time ray tracing).
Interesting tidbits from the Nvidia press release on the new Quadros:
  • New RT Cores to enable real-time ray tracing of objects and environments with physically accurate shadows, reflections, refractions and global illumination.
  • Hardware support for USB Type-C™ and VirtualLink™(1), a new open industry standard being developed to meet the power, display and bandwidth demands of next-generation VR headsets through a single USB-C™ connector.
  • New and enhanced technologies to improve the performance of VR applications, including Variable Rate Shading, Multi-View Rendering and VRWorks Audio.
Press release: https://nvidianews.nvidia.com/news/nvidia-unveils-quadro-rtx-worlds-first-ray-tracing-gpu?linkId=100000003236181


Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂
847 REPLIES

LZoltowski
Champion

snowdog said:

The problem with that is that these bells and whistles will only be available for Nvidia cards. As a developer I'm not going to waste my time having these features in my game if AMD cards can't benefit from them.


True for pancake games (AMD is working on their own version, so this will be HairWorks all over again). In terms of Oculus Home VR games, this is the latest report on GPUs from Oculus owners:

[Image: GPU breakdown among Oculus owners]
Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂

cybereality
Grand Champion
I'm excited for new video cards, but I hope that the ray tracing stuff makes it into open cross-vendor standards. Having a game tied to one vendor, and only a small set of newest cards from that vendor, is not tenable. 
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV

bigmike20vt
Visionary

snowdog said:

The problem with that is that these bells and whistles will only be available for Nvidia cards. As a developer I'm not going to waste my time having these features in my game if AMD cards can't benefit from them.


Fair point... To be honest, if Nvidia supports DX12 properly it will be a positive step (the 10 series doesn't).

Whilst it is not to the same degree as Turing, I believe AMD has supported hardware ray tracing for some time, but it was not adopted for the reason you state.

It's also why hardware PhysX was rarely supported, despite being very cool.
Fiat Coupe, gone. 350Z gone. Dirty nappies, no sleep & practical transport incoming. Thank goodness for VR 🙂

kojack
MVP
Hehe, "world's first ray tracing gpu".
Sillicon Arts beat them to that slogan by 6 years.
Plus there were others before that, going back to the Saarland University RPU that was used for all those raytraced quake 3 videos that came out 16 years ago.

Good to see more ray tracing though. I've written numerous ray tracers, and I use a ray tracer I wrote as an optimisation exercise for my students (since it's the easiest system to learn multithreading on, plus they can get into mathematical optimisation, spatial partitioning, GPGPU, network-distributed rendering, etc.).
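It really is a friendly first multithreading project, since every pixel (or row) is an independent computation over a read-only scene. A minimal sketch of that structure in Python (trace_row is just a placeholder for real shading, not anything from actual course material):

```python
# Why a ray tracer is an easy multithreading exercise: each row of the image
# is computed independently from a read-only scene, so there's no shared
# mutable state and nothing to lock. trace_row() is a placeholder, not a
# real renderer.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 320, 240

def trace_row(y):
    # Stand-in "shading": a real tracer would fire one primary ray per pixel
    # here and return the shaded colours for row y.
    return [((x * y) % 256) / 255.0 for x in range(WIDTH)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:           # one task per row
        image = list(pool.map(trace_row, range(HEIGHT)))
    print(len(image), "rows,", len(image[0]), "pixels per row")
```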


6 gigarays is pretty good. That's 7.5 times faster than the Silicon Arts RayChip.
Although once you start doing shadows and reflections, those rays get eaten up pretty quickly.
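A rough back-of-the-envelope shows how fast that happens (illustrative numbers only, using CV1-like panel figures; the per-pixel ray counts are made up):

```python
# Back-of-the-envelope ray budget at CV1-like resolution (1080x1200 per eye)
# and 90 Hz. The per-pixel ray counts below are illustrative only.
pixels_per_frame = 1080 * 1200 * 2          # both eyes
frames_per_second = 90

primary = 1                                 # camera ray
shadow = 2                                  # one shadow ray per light, 2 lights
reflection = 1                              # single mirror bounce
reflection_shadow = 2                       # shadow rays at the reflected hit
rays_per_pixel = primary + shadow + reflection + reflection_shadow   # = 6

rays_per_second = pixels_per_frame * frames_per_second * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} gigarays/s needed")   # ~1.40 gigarays/s
# Against a 6 gigarays/s budget that's only ~4x headroom, and it disappears
# quickly with more lights, bounces or samples per pixel.
```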


Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

LZoltowski
Champion

kojack said:

Hehe, "world's first ray tracing gpu".
Sillicon Arts beat them to that slogan by 6 years.
Plus there were others before that, going back to the Saarland University RPU that was used for all those raytraced quake 3 videos that came out 16 years ago.

Good to see more ray tracing though. I've written numerous ray tracers, and I use a ray tracer I wrote as an optimisation exercise for my students (since it's the easiest system to learn multithreading on, plus they can get into mathematical optimisation, spatial partitioning, GPGPU, network-distributed rendering, etc.).


6 gigarays is pretty good. That's 7.5 times faster than the Silicon Arts RayChip.
Although once you start doing shadows and reflections, those rays get eaten up pretty quickly.



They are also utilising "DLAA" (deep learning anti-aliasing), which Nvidia describes as "a breakthrough in high-quality motion image generation: denoising, resolution scaling and video re-timing".

This is not just pure ray tracing (low sample counts are used to allow more complex scenes); combined with rasterisation + DLAA you can really make some amazing stuff.
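As a toy illustration of the low-sample-plus-denoise idea (nothing to do with Nvidia's actual DL denoiser; the noisy buffer and the box filter below are just stand-ins):

```python
# Toy version of "trace few samples, then denoise": a noisy 1 spp estimate
# cleaned up by a simple spatial filter. The real pipeline uses a trained
# deep-learning denoiser; the box filter is only a stand-in.
import numpy as np

rng = np.random.default_rng(0)
h, w = 240, 320
clean = np.tile(np.linspace(0.0, 1.0, w), (h, 1))     # stand-in "ground truth"
noisy = clean + rng.normal(0.0, 0.25, size=(h, w))    # pretend 1 spp result

def box_denoise(img, radius=3):
    """Crude spatial denoiser (box filter) standing in for a DL model."""
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

denoised = box_denoise(noisy)
print("noisy RMSE   :", np.sqrt(np.mean((noisy - clean) ** 2)))
print("denoised RMSE:", np.sqrt(np.mean((denoised - clean) ** 2)))
```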

Metro Exodus is said to use RTX:

https://www.youtube.com/watch?v=2NsdM1VS5u8

New: Unreal Engine Real-Time RTX Demos
https://www.youtube.com/watch?v=3jb3flTRykQ

https://www.youtube.com/watch?v=VkXQLLyRCA4

All running live on a single Quadro.
Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂

LZoltowski
Champion
The RTX 2080 is also going to be Turing-based (like the new Quadros), judging by the clues in the Nvidia promo (AlanaT = Alan Turing).
Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂

kojack
MVP
One of the interesting things with pure ray tracing is that you can generate rays that match the panels and lenses of a vr headset.

Normally apps render at a certain resolution, then the Oculus SDK distorts that based on the inverse of the lens distortion and sends it to the panels. So what is rendered isn't the same as what is shown; there isn't a 1:1 texel-to-pixel relationship across the whole view. To make the centre of the view 1:1, you end up rendering too much for the sides. If you make the sides 1:1, the centre is low res.
But generating rays based on the lens distortion directly could help a lot with that, because the distortion step is removed. Mixing that with foveated rendering shouldn't be too hard either.

Although that wouldn't work with hybrid rendering with scanline rasterising, since that still has to go through a distortion step.
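To make the ray-per-display-pixel idea concrete, here's a minimal sketch: each physical panel pixel is pushed through a lens model to get a view direction, and the ray is fired along that, so rendered samples map 1:1 to displayed pixels with no separate warp pass. The radial polynomial and its coefficients are invented for illustration and are not the real Oculus lens profile.

```python
# Sketch: one ray per physical panel pixel, generated through a simple lens
# model so no post-render distortion pass is needed. The polynomial and its
# coefficients are made up for illustration (not the actual lens profile).
import numpy as np

PANEL_W, PANEL_H = 1080, 1200      # one eye, CV1-like panel
K1, K2 = 0.22, 0.24                # made-up radial distortion coefficients

def panel_pixel_to_ray(px, py):
    """Return a normalised view-space ray direction for panel pixel (px, py)."""
    # Normalised panel coordinates in [-1, 1], centred on the lens axis.
    x = (px + 0.5) / PANEL_W * 2.0 - 1.0
    y = (py + 0.5) / PANEL_H * 2.0 - 1.0
    r2 = x * x + y * y
    # The lens magnifies more towards the edges (pincushion), so the ray
    # angle grows with radius.
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    d = np.array([x * scale, y * scale, -1.0])   # -Z points into the scene
    return d / np.linalg.norm(d)

# Example: rays for the centre pixel and a corner pixel.
print(panel_pixel_to_ray(PANEL_W // 2, PANEL_H // 2))
print(panel_pixel_to_ray(0, 0))
```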

Author: Oculus Monitor,  Auto Oculus Touch,  Forum Dark Mode, Phantom Touch Remover,  X-Plane Fixer
Hardware: Threadripper 1950x, MSI Gaming Trio 2080TI, Asrock X399 Taichi
Headsets: Wrap 1200VR, DK1, DK2, CV1, Rift-S, GearVR, Go, Quest, Quest 2, Reverb G2

RuneSR2
Grand Champion


LZoltowski said:

RTX 2080 at the Cologne event is pretty much confirmed. 6 more days!

In this video there are clues, which people have been finding on Twitter.

https://www.youtube.com/watch?v=F7ElMOiAOBI


You're fast 🙂 - I didn't read your posts before now, and I can see I made similar posts in another thread yesterday, sigh. Well, at least the message gets around B)

I can still remember all the hype for the GeForce3 in 2001 (I bought one back then for something like $800, the Asus Deluxe version ❤️). Back then there were amazing videos showing early Doom 3 rendered in real time, and we were all amazed. But when Doom 3 finally arrived, the GeForce3 wasn't really interesting anymore.
In short, I'm not into a lot of tech-talk about new features that may or may not become important for new VR games and apps; I just want to see the raw benchmarks in current games B)

Show-me-the-numbers!  

PS. Fun thing: you can see the amazing real-time GeForce3-rendered Doom images here, from an article I wrote in 2001. I got the images from VoodooExtreme, those were the days ❤️

http://www.hardwaretidende.dk/hard/artikel/01/02/22/7595866

Thus I'm getting numb to all the amazing things Nvidia might promise new cards can do. I'll just be very happy if the RTX 2080 is at least 50% faster in 4K gaming than my GTX 1080, and if Nvidia can deliver 180-200 W power consumption.

Oculus Rift CV1, Valve Index & PSVR2, Asus Strix OC RTX™ 3090, i9-10900K (5.3Ghz), 32GB 3200MHz, 16TB SSD
"Ask not what VR can do for you, but what you can do for VR"

Anonymous
Not applicable
Must...resist...buying one...must...resist... 😮 😄

Anonymous
Not applicable
  • NVIDIA Titan RTX – $3000 US (50% Faster Than 1080 Ti)
  • NVIDIA GeForce RTX 2080 8GB GDDR6 – $500-$700 US (50% Faster Than 1080)
  • NVIDIA GeForce RTX 2070 7GB GDDR6 – $300-$500 US (40% Faster Than 1070)
  • NVIDIA GeForce GTX 2060 5GB GDDR6 – $200-$300 US (27% Faster Than 1060)
  • NVIDIA GeForce GTX 2050 4GB GDDR5 – $100-$200 US (50% Faster Than 1050 Ti)
https://wccftech.com/nvidia-geforce-gtx-rtx-20-series-titan-rtx-geforce-rtx-2080-rumors/

I am leaning more toward the idea of waiting for that 2080 Ti if these rumors are true 🙂 Getting something between 30-50% faster than the current 1080 Ti would be massive 😄