AMD unveils next-gen Radeon VII, launches February 7 for $699

RuneSR2
Grand Champion
Looks (potentially) awesome: 

https://www.pcgamer.com/amd-unveils-next-gen-radeon-vii-launches-february-7-for-dollar699/



Quote:

"The Radeon VII boasts 60 compute units and 3,840 stream processors running at up to 1.8GHz. It also has a generous frame buffer—16GB of high bandwidth memory (HBM2) delivering 1TB/s of memory bandwidth. It's not really a surprise that AMD is sticking with HBM2 instead of GDDR6, as that's been the case with every Vega model so far.

AMD's reference design follows Nvidia in ditching the blower format, going for a triple fan design instead. That's probably for the best, as the former blowers could get incredibly loud at higher fan speeds. Plus, AMD apparently still needs to cool a 295W chip.

AMD says its 7nm architecture delivers 25 percent faster performance than the previous model, while consuming the same amount of power."
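A quick sanity check on the quoted 1TB/s figure. The stack configuration below (four HBM2 stacks, 1024-bit interface each, 2.0 Gbit/s per pin) is my assumption about how the number is reached, not something stated in the article:

```python
# Hypothetical sketch: how 16GB of HBM2 could deliver ~1 TB/s.
# Assumed config (not from the article): 4 stacks x 1024-bit
# interfaces at 2.0 Gbit/s per pin.
stacks = 4
bus_width_per_stack = 1024        # bits per stack
pin_rate_gbps = 2.0               # Gbit/s per pin

total_bus_bits = stacks * bus_width_per_stack        # 4096-bit bus
bandwidth_gb_s = total_bus_bits * pin_rate_gbps / 8  # bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")                  # -> 1024 GB/s, i.e. ~1 TB/s
```

The same arithmetic explains why halving the memory (dropping two stacks) would also halve the bus width and the bandwidth, which comes up later in the thread.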


We'll see when more trustworthy benchmarks arrive 😉 300W on 7nm will be disappointing in my book - that's much worse performance per watt than the 2080 Ti. But the drivers are young, and if nothing else, I hope AMD can help push Nvidia's RTX prices down.
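The perf-per-watt complaint can be sketched as a back-of-envelope ratio. The 295W figure is from the article and 250W is Nvidia's published 2080 Ti board power; the relative-performance number is an assumption for illustration only, not a benchmark result:

```python
# Rough perf/W comparison. rel_perf values are illustrative
# assumptions, not measured benchmark data.
radeon_vii = {"power_w": 295, "rel_perf": 1.00}   # baseline
rtx_2080ti = {"power_w": 250, "rel_perf": 1.30}   # assumed ~30% faster

def perf_per_watt(card):
    return card["rel_perf"] / card["power_w"]

ratio = perf_per_watt(rtx_2080ti) / perf_per_watt(radeon_vii)
print(f"2080 Ti perf/W advantage: {ratio:.2f}x")
```

Under that assumption the 2080 Ti comes out roughly 1.5x ahead on perf/W, which is what the "much worse" claim amounts to; real benchmarks would replace the assumed `rel_perf` values.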

Oculus Rift CV1, Valve Index & PSVR2, Asus Strix OC RTX™ 3090, i9-10900K (5.3Ghz), 32GB 3200MHz, 16TB SSD
"Ask not what VR can do for you, but what you can do for VR"

32 REPLIES

bigmike20vt
Visionary
Different technology. Nvidia have been investing a lot more money into developing their GPU tech over recent years and are simply more advanced than AMD. AMD have less cash to invest and couldn't directly compete on two fronts, so they chose to work mostly on their CPUs and take on Intel. It seems to have worked: the Ryzen chips are great (their APUs are damn fine too, which is why they have the contracts for both the PS5 and the new Xbox).

The rumour now, however, is that they DO have a lot more cash to invest in GPU R&D, and that this new Radeon is a stopgap - much like the 590 - something to keep on the table so they stay even remotely relevant, because it takes time to go from R&D to having a viable competing card.

Also, they are not a total washout. They still have advantages, just not really for gaming. And these parts are salvaged from their really high-end chips, so they had to do something with them.

We also have to remember that the high-end gaming GPU is pretty niche. AMD provide the chips for both the MS and Sony current-gen consoles, and as I said the new Ryzen APUs look fab, so it's not like they aren't competing elsewhere in gaming - just not at the very high end.

The RX 580 with a 2-game bundle for £200, or the RX 590 with a 3-game bundle for £250, is pretty good for mid-range gamers who just want to game at 1080p.

It was not always this way. At the time, the Nvidia GTX 480 - "Fermi" as it was known - had comparatively high power usage and ran really hot. It wasn't until the architecture matured in its refreshed form that it really took off.
Fiat Coupe, gone. 350Z gone. Dirty nappies, no sleep & practical transport incoming. Thank goodness for VR 🙂

Anonymous
Not applicable
Honestly, though, they could've gone with an 8GB version to match their Vega brothers. HBM2 is costly, and I'm sure at least $75 of that price tag was for the extra 8GB. For gaming alone, it just doesn't make sense to have 16GB of VRAM on this card. Granted, I'm sure they designed this card for more uses than gaming, so I understand what they did and why - but as with most products, it needs to make sense for the market that's looking to buy it.

Right now the R7 doesn't make sense for gaming alone, and it could've been priced a bit better if they'd made two versions of the card for sale and said so in their keynote. One at $599 and one at $699 would've made a bigger splash.

At this point, calling it a gaming card alone seems a bit silly. Instead, I think they should've pitched it as more of a workstation card designed with high-memory-usage tasks in mind. That way it makes more sense, and it looks very good value against other workstation cards such as the Titan, which comes in at $1k+.

From there they could've gone into more detailed plans to release new gaming GPUs later in the year, designed for price-to-performance without the RTX stuff in mind. Unfortunately, AMD is behind in GPU land right now. Even with the jump to 7nm, their current architecture is just NOT good enough anymore and doesn't scale well. They need to revamp their cards, and that comes with big costs as well. AMD was smart to focus on one side of the business over the other, and it paid off in this case, but it's going to hurt them big on the other side. I hope the Ryzen 3 release pushes them even further forward with cash, so their next line-up of GPUs will make sense. I'm actually looking forward to building an AMD computer and doubling my performance over my current i7 3770.

RTX features aside, if you are in the market for a new GPU, I would look at the NV 2060 or 2070 (or anything in the 10-series), or the 580 and 590 from AMD. That way you get the best bang for the buck.

@bigmike20vt
They couldn't. GDDR6 is actually too slow for the card in this case, so they can't go back. Plus, any memory change would take time and money to refit for their needs. Not to say it's not possible - I think it comes down to the fact that they need to stay relevant in the market, so a change right now would've been bad. I'd say you are correct, though, that they know HBM2 is just too costly, and as a result their Navi line-up will feature GDDR6 instead later in the year. The only downside is that the 20-series cards will also be coming down in price - and NV following shortly after AMD to 7nm as well.

If anything, it looks like we could see close to a 30-60% performance increase from both AMD and NV for their 7nm line-ups. That's really good for VR, because it would allow, for example, 4K-by-4K screens using eye tracking, no problem, for gen 3 of VR. In flat gaming, 144Hz will become the new refresh rate, and 4K gaming will also see an increase of 15-30%.

RattyUK
Trustee
Sadly it will equate to about the same in GBP as in dollars - 20% VAT will eat up the £/$ exchange rate. Still an expensive and power-hungry card; I think I'll wait for the next generation (or hope Nvidia drop their base price on the 2080 Ti 🙂).
PC info: AMD Ryzen 9 5900X - Sapphire 7900XTX - 32GB DDR4 4000 - 3 NVMe + 3SATA SSD - Quest 2 & 3

falken76
Expert Consultant
So all this really says is that everyone wanting to play PC games had better expect it to become the norm to pay as much as an entire fucking game console just to get the GPU, period. Going forward, sub-$300 cards will probably become a thing of the past. I can't afford these damn things anymore. I'd certainly buy one if it fell off a truck - I don't care if they straight up told me it was stolen; if it's $250 or less, I'd buy it. These things are simply too goddamn expensive now.

falken76
Expert Consultant

snowdog said:

The next generation of RTX GPUs will be cheaper because the new tech in the current 20xx GPUs will be cheaper to manufacture. We only saw a big leap in price for the 20xx GPUs because new tech is expensive.



I think Nvidia will opt out of giving a discounted price to the consumer and will opt in to increasing their profit margins.  Mark my words....

bigmike20vt
Visionary
Hi
So on another forum I have just been told that what I suggested above is a non-starter.

Apparently, because these are essentially failed Vega 20 chips, things like the memory are already designed to be 16GB, so half of the HBM2 can't simply be taken out.
Also, for the same reason, and because it would take significant (expensive) retooling anyway to put GDDR6 on it, it's not viable to change the memory type either - not for a card that is going to have a very short shelf life as a stopgap.

Edit: and @Mradr pretty much said some of the same here too, I just noticed 😉
Fiat Coupe, gone. 350Z gone. Dirty nappies, no sleep & practical transport incoming. Thank goodness for VR 🙂

Anonymous
Not applicable

falken76 said:

So all this really says is that everyone wanting to play PC games had better expect it to become the norm to pay as much as an entire fucking game console just to get the GPU, period. Going forward, sub-$300 cards will probably become a thing of the past. I can't afford these damn things anymore. I'd certainly buy one if it fell off a truck - I don't care if they straight up told me it was stolen; if it's $250 or less, I'd buy it. These things are simply too goddamn expensive now.


I wouldn't say that. Right now, everything is just messed up in the technology world. Delays in the next shrink caused a bunch of problems, and on top of that you had the mining craze and the memory shortages. I say pricing will go back to normal for both NV and AMD by the end of this year. Well, price-wise things will have to slowly increase as innovation becomes harder too - we took advantage of the fact that things kept getting smaller and faster, and now we're hitting speed bumps. Either way, prices will fall back down, because memory demand should fall as most people are not upgrading phones every year anymore. Plus, it looks like the mining craze will not recover either - it'll still be around, but not as big as it once was. 7nm yields will also improve over time.

I say don't worry about it right now. I'm still on a Fury X card and it has done me well for a while now. I can play most games with reasonable settings - as long as that's what you are looking for, it's enough for the most part.

bigmike20vt
Visionary
As @Mradr touches on again, the upside is that upgrades are really not needed so often. My mate still games on a GTX 780, and that includes VR on an Oculus Rift.
The only reason I updated my GTX 980 was because I got a new 4K TV - 4K is a really big ask. For 1080p and the Rift, my GTX 980 would have seen out the entire Oculus Rift generation. Now we have ray tracing as well.
Sure, PCs did not cost as much to upgrade back in the day****, but back then you could play the newest games at full detail for 6-12 months if you were lucky, and then you had to be looking for the next upgrade.
Next generation, I hope we will be looking at 4K 60fps WITH ray tracing. If we get that, even if it costs £1000, I could live with it, because chances are I won't need to upgrade for 5 years after - and that is properly at the sharp end.
Again, the majority of gamers are happy at 1080p; with this in mind, hopefully the next generation of £300 GPUs will be more than capable of that for many years.
My CPU is an i7 5820K. It overclocks well, and although it is 3.5 years old now, I fully expect it to still be fine for the next 3.5 years (I may need another 16GB of RAM, that is all).
So I would add that, as well as all the stuff Mradr says, we are also stepping up to a huge new resolution (4K) - and TBH I do not see the need to go higher than that any time soon, if ever, for gaming on a screen - and implementing the holy grail technology that many gamers and developers have been fantasising about for years: real-time ray tracing. The development to make that tech costs proper money that has to come from somewhere.
But now that it's here and essentially working, I do not think it will change much for a few years - just get faster and cheaper to produce.

**** I am not even that sure how true that is. My IBM 486 DX3 75 with, I think, 4MB of RAM, a 420MB HDD, 2x CD-ROM and Sound Blaster 16, with a 15in monitor, cost over £1400. That would have been in 1994.

Taking into account inflation (real-world cost of living), that must be £3000-£4000 in today's money - and truth be told, even back then it wasn't that good; there were far better PCs out there.
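That £3000-£4000 range checks out as simple compound inflation. The 3-4% average annual rates below are assumptions for illustration, not sourced inflation data:

```python
# Back-of-envelope inflation check for the £1400 (1994) PC.
# Average annual rates of 3-4% are assumed, not sourced figures.
price_1994 = 1400
years = 2019 - 1994  # 25 years, taking "today" as the thread's date

for rate in (0.03, 0.04):
    today = price_1994 * (1 + rate) ** years
    print(f"at {rate:.0%} avg inflation: ~£{today:,.0f}")
```

At 3% that lands near £2,900 and at 4% near £3,700 - right around the poster's £3000-£4000 estimate.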
Fiat Coupe, gone. 350Z gone. Dirty nappies, no sleep & practical transport incoming. Thank goodness for VR 🙂

Anonymous
Not applicable
Yeah, right now RT needs to show up in software usage. The other problem: AMD also needs to pick up RT if it's going to be worth its salt. NV alone supporting it will not be enough for software devs to really add it to their games. This also means NV needs to be more open with their RT software if they want both the consoles and AMD to pick it up too. I know they're in it for the money - but if you want to sell a feature, you need to sell it to your competition first, so you look better at the end of the day with the higher quality of said feature.

Right now, 4K at 60FPS with RT is already a thing, though. The problem is that people have higher-refresh monitors in the 144Hz range, and even at 1080p they're unable to hit that target FPS with RT on. It's not so much a resolution problem as that the software way of doing RT isn't mature enough. That should come with time, though. The fact that we already saw such a big performance leap in BF5's last update shows there is still a ton of room for improvement. Actually, someone over on reddit said they're not even using the RT cores to their max performance, which makes me wonder if something else is going on.


https://www.youtube.com/watch?v=QKV8VdhZuW4

falken76
Expert Consultant

Mradr said:


falken76 said:

So all this really says is that everyone wanting to play PC games had better expect it to become the norm to pay as much as an entire fucking game console just to get the GPU, period. Going forward, sub-$300 cards will probably become a thing of the past. I can't afford these damn things anymore. I'd certainly buy one if it fell off a truck - I don't care if they straight up told me it was stolen; if it's $250 or less, I'd buy it. These things are simply too goddamn expensive now.


I wouldn't say that. Right now, everything is just messed up in the technology world. Delays in the next shrink caused a bunch of problems, and on top of that you had the mining craze and the memory shortages. I say pricing will go back to normal for both NV and AMD by the end of this year. Well, price-wise things will have to slowly increase as innovation becomes harder too - we took advantage of the fact that things kept getting smaller and faster, and now we're hitting speed bumps. Either way, prices will fall back down, because memory demand should fall as most people are not upgrading phones every year anymore. Plus, it looks like the mining craze will not recover either - it'll still be around, but not as big as it once was. 7nm yields will also improve over time.

I say don't worry about it right now. I'm still on a Fury X card and it has done me well for a while now. I can play most games with reasonable settings - as long as that's what you are looking for, it's enough for the most part.



I'm on a GTX 1060 6GB and I want better performance. Now I should wait another year? I already waited a couple of years, and the 1080 Ti I was going to get is still a rip-off. I was going to get a 1070 Ti, which is also a rip-off considering its age - that damn card shouldn't cost $500 or more. Now I'm considering the RTX 2070 because it's only $50 to $80 more than a 1070 Ti, but WTF happened here? Top-end cards used to be in the $300 range at one point; now the pricing is absolutely insane.