GTX 1080 Ti and RTX 2080 Ti best price/comparison thread

Techy111
MVP
MVP

As per the title, this thread is intended to be a quick reference for current best prices on these two cards, as they're generating a lot of interest right now. If there's a stand-out great deal, please tag me and I'll update this first post to include it. Feel free to add prices for whatever region you're in, in whatever currency, but please avoid any posts not related to this specific topic. Thanks, and keep a tight hold of those credit cards.


To start off:

Scan UK prices:

EVGA GeForce GTX 1080 Ti SC Black Edition GAMING  £629.99

Gigabyte GeForce RTX 2080 Ti WINDFORCE OC    £1,049.99

EVGA RTX 2080 Ti XC will cost $1,700 (10,999 DKK)

Dutch prices for Founders Edition:

RTX 2080 Ti: €1,260
RTX 2080: €850
RTX 2070: €640

Canadian prices -

EVGA RTX 2080 XC Gaming: $1099.
EVGA RTX 2080 XC Ultra Gaming: $1199.
Other 2080 variants are in between those two prices. 

Ti versions are between $1600 and $1750.
No 2070 listed yet.
A PC with lots of gadgets inside and a thing to see in 3D that you put on your head.

538 REPLIES

RedRizla
Honored Visionary

RuneSR2 said:


RedRizla said:

I'm starting to wonder if the Geforce 7nm are going to be $2000, with the prices the way they are..


Exactly - Volta cards went even higher, but those aren't for gamers. Last evening, using the CV1, I couldn't help thinking that my OC'ed 1080 was a nice match for the quality I'm experiencing in current VR games - that is, the CV1's resolution is limited, there are god rays and, if you look for it, the SDE is quite noticeable - not to forget the grey fog (spud - lack of true OLED blacks). Unless some awesome VR game arrives soon, maybe it's better to wait for the CV2 and then consider what hardware I'll need to run it optimally. If we're going beyond 2K, I'd like a higher-res HMD with close to no SDE - and not just a new shiny turbo motor to drive the old car...



Just imagine: in a few years' time people will still want half of what they paid for these cards. That means they'll be asking something like $600 for a used GeForce 2080 Ti. I think we can safely say it's goodbye to GeForce Ti cards ever launching at $699 again...

Anonymous
Not applicable

RuneSR2 said:


RedRizla said:

I'm starting to wonder if the Geforce 7nm are going to be $2000, with the prices the way they are..


Exactly - Volta cards went even higher, but those aren't for gamers. Last evening, using the CV1, I couldn't help thinking that my OC'ed 1080 was a nice match for the quality I'm experiencing in current VR games - that is, the CV1's resolution is limited, there are god rays and, if you look for it, the SDE is quite noticeable - not to forget the grey fog (spud - lack of true OLED blacks). Unless some awesome VR game arrives soon, maybe it's better to wait for the CV2 and then consider what hardware I'll need to run it optimally. If we're going beyond 2K, I'd like a higher-res HMD with close to no SDE - and not just a new shiny turbo motor to drive the old car...



I reckon we'll both be fine with our 1080s for what I believe will be a 4K CV2 with a 140-degree FOV, thanks to eye tracking and dynamic foveated rendering.

In a weird way, though, I'm hoping we'll only see 2K, because otherwise Oculus won't have any worthwhile competition, which would be bad news for VR in general. I can't see anyone with any sense buying a competitor's headset unless they have a SERIOUS problem with Zuckerberg and Facebook.

RedRizla
Honored Visionary
@snowdog - I'll take 4K and arses to the rest of the competition..

Anonymous
Not applicable
Yup. Me too tbh lol

I think they'll need to go 4K because HTC, assuming they're still in business by then, will probably be releasing their own 4K headset in 2020.

JohnnyDioxin
Expert Trustee
I think they'll be trying to carry their current user base over to the CV2 (of course), which means they have to consider not just what they want to charge for the CV2, but how much it's going to cost people to buy the necessary hardware to run it, compared to the CV1. That could affect which resolution they go for.

I don't think it will just be "let's use the highest-res screens we can".

i5 9600k @4.5GHz; 16GB DDR4 3200; 6xSSD; RTX2080ti; Gigabyte Z390D Mobo
Rift CV1; Index; Quest; Quest 2

Anonymous
Not applicable

Brixmis said:

I think they'll be trying to carry their current user base over to the CV2 (of course), which means they have to consider not just what they want to charge for the CV2, but how much it's going to cost people to buy the necessary hardware to run it, compared to the CV1. That could affect which resolution they go for.

I don't think it will just be "let's use the highest-res screens we can".



Yup, but this is where the eye tracking and dynamic foveated rendering come into play. We'll probably see the current Recommended Spec VR Ready PC becoming the new Minimum Spec VR Ready PC.
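To put some rough numbers on the foveated-rendering point: the sketch below compares pixel budgets with back-of-envelope arithmetic. Only the CV1's 1080x1200-per-eye panel figure is a real spec; the "4K" CV2 resolution, foveal fraction, and periphery scale are pure guesses for illustration.

```python
# Rough sketch: how dynamic foveated rendering could cut the pixel
# budget of a hypothetical 4K-per-eye headset. All CV2 numbers are
# assumptions, not announced specs.

def pixels(w, h, eyes=2):
    """Total panel pixels across both eyes."""
    return w * h * eyes

cv1 = pixels(1080, 1200)        # CV1: 1080x1200 per eye (real spec)
cv2_guess = pixels(1920, 2160)  # assumed "4K" split as 1920x2160 per eye

def foveated_cost(total_pixels, foveal_fraction=0.1, periphery_scale=0.25):
    # Render a small foveal region at full resolution and the
    # periphery at a fraction of full shading cost.
    foveal = total_pixels * foveal_fraction
    periphery = total_pixels * (1 - foveal_fraction) * periphery_scale
    return foveal + periphery

reduced = foveated_cost(cv2_guess)
print(f"CV1 pixels/frame:          {cv1:,}")
print(f"Assumed CV2 full render:   {cv2_guess:,}")
print(f"CV2 with foveation (est.): {reduced:,.0f} "
      f"({reduced / cv2_guess:.1%} of full)")
```

With these made-up parameters the foveated cost lands in the same ballpark as rendering the CV1 at full resolution, which is the intuition behind "our 1080s will be fine".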

LZoltowski
Champion
An Nvidia engineer said in an interview to expect 35-40% over the previous gen (for games that don't take advantage of the RT or Tensor cores), so a 2080 will be 35-40% faster than a 1080. There might be a larger gap between the 1080 Ti and the 2080 Ti, as the CUDA-core increase is bigger there.
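The CUDA-core delta mentioned above can be checked against the published core counts (the 35-40% fps figure itself is the engineer's claim, not something derived from core counts alone):

```python
# Percentage increase in CUDA cores between generations,
# using the published core counts for each card.
cards = {
    ("GTX 1080", "RTX 2080"): (2560, 2944),
    ("GTX 1080 Ti", "RTX 2080 Ti"): (3584, 4352),
}

for (old, new), (old_cores, new_cores) in cards.items():
    delta = (new_cores - old_cores) / old_cores
    print(f"{old} -> {new}: {old_cores} -> {new_cores} cores (+{delta:.1%})")
```

The Ti pairing does show the larger core-count jump (about 21% vs. 15%), consistent with the expectation of a bigger gap between the 1080 Ti and 2080 Ti.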
Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂

Anonymous
Not applicable
Yup, that's pretty standard between generations if I'm remembering correctly.

Not sure if that's enough to warrant me getting a 2080, tbh. I might wait until the 2180 is released before I upgrade again... assuming that's going to be the new naming convention. They went from the 10 series to the 20 series because of the new RTX technology, I think, so unless they add something else new we should be seeing 21-series cards next generation.

LZoltowski
Champion

snowdog said:

Yup, that's pretty standard between generations if I'm remembering correctly.

Not sure if that's enough to warrant me getting a 2080, tbh. I might wait until the 2180 is released before I upgrade again... assuming that's going to be the new naming convention. They went from the 10 series to the 20 series because of the new RTX technology, I think, so unless they add something else new we should be seeing 21-series cards next generation.

Although, yeah, that is pretty typical, the price-to-fps ratio this time around is one of the worst we've seen, if you consider just "classic" games, the ones that don't take advantage of the new tech.

And herein lies the issue: do we just want more CUDA cores forever, until we start hitting a ceiling? Or are we happy to accept that, for things to move forward, things have to change - this generation of cards is a new type of tech, so gains in conventional games will be smaller, but it opens the door to higher quality and higher performance down the road.
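The price-to-fps complaint can be illustrated with a quick calculation. The prices below are launch-style figures; the fps numbers are purely hypothetical placeholders, with the 2080 Ti set ~35% faster per the interview quoted earlier:

```python
# Dollars per frame-per-second: a crude value metric for "classic"
# (non-RTX) games. The fps figures here are made-up placeholders.
def dollars_per_fps(price, fps):
    return price / fps

gtx_1080_ti = dollars_per_fps(699, 100)   # $699 launch, assumed 100 fps
rtx_2080_ti = dollars_per_fps(1199, 135)  # $1,199 launch, ~35% faster

print(f"GTX 1080 Ti: ${gtx_1080_ti:.2f} per fps")
print(f"RTX 2080 Ti: ${rtx_2080_ti:.2f} per fps")
```

Even granting the full 35% uplift, the newer card costs noticeably more per frame in non-RTX titles, which is the point being made above.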

It's a bit like when multithreading and multi-core CPUs started to appear... it took a while for developers to start utilising the extra cores, but eventually it led to faster applications. Many games just a few years ago only ever used one CPU core, until multicore rigs came down in price and adoption became widespread. Battlefield 5 now wants six cores! 🙂
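The multi-core analogy can be sketched as a toy example: the same CPU-bound workload run serially on one core, then split into chunks that a process pool can spread across cores (a minimal Python sketch, just for illustration):

```python
# Toy illustration of the single-core vs. multi-core point: identical
# work, run serially and then split across worker processes.
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # Stand-in for a CPU-bound game subsystem (physics, AI, audio...).
    return sum(i * i for i in range(n))

def run_serial(chunks):
    # One core does everything, chunk after chunk.
    return [busy_work(n) for n in chunks]

def run_parallel(chunks, workers=4):
    # Each chunk can land on its own core.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(busy_work, chunks))

if __name__ == "__main__":
    chunks = [200_000] * 4
    assert run_serial(chunks) == run_parallel(chunks)
    print("Same results; the parallel version can use one core per chunk.")
```

The results are identical either way; the win is only that the parallel version keeps more cores busy, which is exactly what engines had to be restructured to exploit.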
Core i7-7700k @ 4.9 Ghz | 32 GB DDR4 Corsair Vengeance @ 3000Mhz | 2x 1TB Samsung Evo | 2x 4GB WD Black
ASUS MAXIMUS IX HERO | MSI AERO GTX 1080 OC @ 2000Mhz | Corsair Carbide Series 400C White (RGB FTW!) 

Be kind to one another 🙂

MowTin
Expert Trustee

LZoltowski said:

It's a bit like when multithreading and multi-core CPUs started to appear... it took a while for developers to start utilising the extra cores, but eventually it led to faster applications. Many games just a few years ago only ever used one CPU core, until multicore rigs came down in price and adoption became widespread. Battlefield 5 now wants six cores! 🙂

Good point. I remember people complaining that the new multi-core CPUs were slower in games than the single-core ones.

I also remember when DX10 came out. It killed frame rates and seemed to add very little visual improvement.
i7 9700k 3090 rtx   CV1, Rift-S, Index, G2