
DK2 limping home to the CV1

Luciferous Posts: 1,611
Project 2501
edited June 2014 in General
Hi,

I have a pretty old system: a quad core Q6600 @ 2.4 GHz (900 MHz or so, my computer says) with 6 GB of RAM. I have a DK2 on order and want to be able to use it, but I was hoping I could put off my PC upgrade for another year until CV1, so I can get a more powerful system when the time comes.

If I bought a 780 Ti now, do you think the CPU would be too big a bottleneck?

Thanks in advance as I am a little out of touch with hardware these days.

Comments

  • molton Posts: 388
    Hiro Protagonist
    I say if you're developing, the CPU/mobo upgrade will be worth it for the time you'll save waiting for stuff to build. I actually upgraded (back in January or so) from a 2.4 GHz quad core to a 4.2 GHz quad core with developing in mind. Less waiting is good; just keep an older laptop around or something to test lower-end hardware.
  • Gerald Posts: 1,068
    Nexus 6
    I guess that many games will be CPU limited with your setup, but I would not go out and buy a new one just yet. Test it with a couple of normal games if you want an answer now, or wait until you have your DK2.
    Some titles will suffer due to CPU limitations; some are likely to run perfectly well. At least you can be sure the GPU is not the problem :)
    check out my Mobile VR Jam 2015 title Guns N' Dragons
  • Which GPU do you have at the moment? Are you going to get CV1?

    A Q6600, especially without a decent OC, would bottleneck a 780 Ti like crazy, and even with overclocking (to 3.2 GHz, for example) it would still cause a huge bottleneck. Obviously you'd still get a big performance boost (depending on the game), but nothing close to a 780 Ti paired with an up-to-date CPU. Also, since the Q6600 is an LGA 775 CPU, you have PCIe 2.0 at best, which could also hinder the performance of a 780 Ti (not sure about this though).
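    For anyone wondering how big the PCIe gap actually is, here's a rough back-of-the-envelope sketch in Python. The per-lane rates and encoding overheads are the published spec figures, not measurements from any particular board, and real-world throughput is always somewhat lower:

```python
# Rough one-direction PCIe bandwidth estimate from published spec
# figures (transfer rate and line-encoding overhead per generation).
GEN_RATES = {
    # generation: (transfer rate in GT/s, encoding efficiency)
    "1.1": (2.5, 8 / 10),    # 8b/10b encoding
    "2.0": (5.0, 8 / 10),    # 8b/10b encoding
    "3.0": (8.0, 128 / 130), # 128b/130b encoding
}

def pcie_bandwidth_gbs(gen: str, lanes: int = 16) -> float:
    """Approximate one-direction bandwidth in GB/s for a PCIe link."""
    gt_s, eff = GEN_RATES[gen]
    # GT/s * efficiency = Gbit/s per lane; times lanes, divide by 8 for GB/s.
    return gt_s * eff * lanes / 8

for gen in ("1.1", "2.0", "3.0"):
    print(f"PCIe {gen} x16 ~ {pcie_bandwidth_gbs(gen):.1f} GB/s each way")
```

    So a 2.0 x16 slot still has around 8 GB/s each way, roughly half of 3.0, which is why it only tends to matter in bandwidth-heavy setups like SLI.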
  • ThreeEyes Posts: 2,230
    NerveGear
    I agree with Molton. Studies show over and over that for those doing anything complex enough that they have to wait on their computer for pretty much anything, it's cost effective to upgrade to at least where you aren't waiting if that is possible.

    The time spent idle, waiting on a slow computer, is the gift that keeps on giving. It saps productivity big time and it's far cheaper to pay once up front than to pay over and over and over in a death by 1000 cuts. In situations where there is management in the decision chain, they are quite often penny wise and pound foolish. I've seen it way too many times.

    But when management is you and money may be tight, you'll just have to make that call.

    Without knowing the motherboard you are using, will a 780 even plug in? On my system, the graphics card has a PCIe 3.0 interface but the mobo is only 2.0. It plugs in and works, and reviews indicate the only time 3.0 pays dividends is if you are running SLI and such, where you need to get the data over the bus fast because you are feeding two cards. But it's stuff like that you have to look out for when doing partial upgrades. Then there is the question of the power supply, etc. If you start having to replace too much stuff, maybe you want to wait a bit and just go with what you have, or bite the bullet and upgrade. Prices usually fall and specs increase over time, so it could be to your benefit overall to just wait and see what the future holds and how the applications you want to run perform.

    A lot of apps have toggleable framerate counters, so you can set the resolution and see how they do. That will give you an idea but won't tell the whole tale. If you can't get good frame rates at DK2 resolution just rendering one image, then you know you will need an upgrade, and the only question is when.
    But... but... but... I just NEED to know about the Baba! The Baba has me hypmotized! :shock:
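    To put numbers on that framerate-counter check, the frame-budget arithmetic is simple. A quick Python sketch, assuming the DK2's 75 Hz panel refresh (the fps values below are made-up examples, not measurements from any game):

```python
# Quick frame-budget check: does a measured fps leave headroom for a
# 75 Hz display? (75 Hz is the DK2 panel refresh rate; the fps values
# in the loop are illustrative examples only.)
TARGET_HZ = 75
BUDGET_MS = 1000 / TARGET_HZ  # ~13.3 ms per frame

def frame_headroom_ms(measured_fps: float) -> float:
    """Positive result = spare time per frame; negative = missed vsync."""
    return BUDGET_MS - 1000 / measured_fps

for fps in (60, 75, 90, 120):
    spare = frame_headroom_ms(fps)
    verdict = "ok" if spare >= 0 else "will judder"
    print(f"{fps:>3} fps -> {spare:+.1f} ms headroom ({verdict})")
```

    Anything under 75 fps means frames are missing the ~13.3 ms budget, so you will see judder in the headset even if the same number feels fine on a monitor.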
  • hellary Posts: 236
    Hiro Protagonist
    You'd be better off buying a new CPU/motherboard/RAM solution now and getting a mid-range graphics card to last until next year. The difference in performance between a current gen i5 or i7 CPU and a potential next gen i5 or i7 next year is going to be relatively minimal. The difference between a current gen graphics card and a potential next gen graphics card will be much greater.

    TL;DR: Processors increase in performance slowly year on year, so their worth decreases slowly. Graphics cards increase in performance quickly year on year, so their worth decreases quickly. A GPU bought now will be more of a 'waste' of money than a CPU.
  • Cgpnz Posts: 455
    This makes one wonder what could have been if Sony had not given up on the cell processor.
    It was already at 4 GHz with 7 Cell core thingies. Today they could have 30 of them, which would blow away any underloaded, unparallelisable, cache-cowed BS that today's multi-cores are.
  • bp2008 Posts: 256
    Hiro Protagonist
    I would probably wait. Yes, that is an underpowered system by today's standards, but any CPU or GPU you buy now will be at least one generation old by the time CV1 comes.

    I think any VR game worth taking seriously will be able to run properly even on systems that are a few years old, so it wouldn't exactly be a waste for you to upgrade now...
  • willste Posts: 675
    Brain Burst
    Just my two cents.

    Potentially game-changing hardware is at least 2 years out for Nvidia; not sure about Intel's Broadwell, which could turn up in 2015. But I don't think it will matter much without a really good new GPU.

    That being said, when Intel, Nvidia and AMD change architectures, hopefully in 2016-2017, we will all want to upgrade.

    See Nvidia's road map:

    http://www.anandtech.com/show/7900/nvidia-updates-gpu-roadmap-unveils-pascal-architecture-for-2016

    That Pascal GPU is the new target: probably the first single GPU to competently handle 4K gaming, given its new memory architecture goals. Also, it sounds like it will require a whole new motherboard, so even if you have Broadwell you will be upgrading again.

    I think if you buy now you may miss the DDR4 boat and the 8 series GPUs, but you will likely only be giving up 10-20% performance for 2015, and you will be in a better position to upgrade immediately in 2016 when really good GPUs may start to roll out. AMD should have something lined up by then too.

    If Nvidia goes Nvlink in 2016 that will be a mandatory upgrade for next gen performance that will likely negate any upgrade path for 2015 systems.
  • Luciferous Posts: 1,611
    Project 2501
    I have a 8800 GT SLI (or not SLI for the rift) so pretty old and PCI-E 2.0 which will also slow the whole thing down.
    Amazingly it still plays modern games without the Rift.

    I think, taking your comments on balance, I will update my PC (motherboard, CPU, RAM, plus an SSD) and get a reasonably priced mid-range card. Then when CV1 comes out, upgrade to a high-end GPU.

    Thanks for your posts much appreciated.
  • khazar Posts: 161
    Luciferous wrote:
    I have a 8800 GT SLI (or not SLI for the rift) so pretty old and PCI-E 2.0 which will also slow the whole thing down.
    Amazingly it still plays modern games without the Rift.

    If the shaders and DirectX versions support it, why wouldn't it? ^^ But I doubt most people would call it a nice gaming experience.
    DK2 Status: ARRIVED AND WORKING
    Arrival: 29.07.
  • hellary Posts: 236
    Hiro Protagonist
    Luciferous wrote:
    I have a 8800 GT SLI (or not SLI for the rift) so pretty old and PCI-E 2.0 which will also slow the whole thing down.
    Amazingly it still plays modern games without the Rift.

    Out of interest, have you tried playing Rift games in SLI, in particular with Split Frame Rendering mode (would need to fiddle with nVidia settings to force that over Alternate Frame Rendering)?
  • ThreeEyes Posts: 2,230
    NerveGear
    willste wrote:
    If Nvidia goes Nvlink in 2016 that will be a mandatory upgrade for next gen performance that will likely negate any upgrade path for 2015 systems.

    I was looking at the NVLink stuff - that's going to be a radical change, but they also look to be rolling it out on the server side before consumer desktops, so the wait could be even longer. Probably some changes to how they use PCI first in the consumer world.

    But NVLink is radical and kind of runs out of board area with the mezzanine connectors. If that moves into the consumer area, it really does look like everything will change: motherboards, power supplies, and cases.

    I like the idea of speed but it is a very radical change at least in how it is being presented now. I don't like the idea of obsoleting every bit of expensive computer equipment I already have.
  • Luciferous Posts: 1,611
    Project 2501
    hellary wrote:
    Luciferous wrote:
    I have a 8800 GT SLI (or not SLI for the rift) so pretty old and PCI-E 2.0 which will also slow the whole thing down.
    Amazingly it still plays modern games without the Rift.

    Out of interest, have you tried playing Rift games in SLI, in particular with Split Frame Rendering mode (would need to fiddle with nVidia settings to force that over Alternate Frame Rendering)?

    I have tried SLI with Rift games but not with Split Frame Rendering mode. Is it any good?

    Just seen the Road to VR piece on why you should hold off buying a new rig until next year. I have toyed with the idea of buying a second-hand one off eBay, just to get me by.
  • willste Posts: 675
    Brain Burst
    ThreeEyes wrote:
    willste wrote:
    If Nvidia goes Nvlink in 2016 that will be a mandatory upgrade for next gen performance that will likely negate any upgrade path for 2015 systems.

    I was looking at the Nvlink stuff - that's going to be a radical change but they also look to be rolling it out in the server side before consumer desktops so the wait could be even longer. Probably some changes to how they use PCI first in the consumer world.

    But Nvlink is radical and kind of runs out board area with the mezzanine connectors. If that moves into the consumer area, it really does look like everything will change. Motherboards, power supplies, and cases.

    I like the idea of speed but it is a very radical change at least in how it is being presented now. I don't like the idea of obsoleting every bit of expensive computer equipment I already have.


    I would tend to agree that this will likely get pushed out. In the roadmap just prior, they had the 8 series slated as a better GPU with an onboard unified memory interface, which sounded like a new architecture that would still have used PCIe.

    However, they scratched that and made the 8 series just a die shrink. Nvidia at least seems more committed to the Pascal/Volta direction and isn't bothering to screw around with an intermediary design. http://blogs.nvidia.com/blog/2014/03/25/gpu-roadmap-pascal/

    I guess if they go this route we may end up with Nvidia-only motherboards, if AMD doesn't want to license their tech and Nvidia doesn't make it an open design. I really wish a third party would come up with a better interface, because this direction sounds like the right one to get next gen graphics performance. This could be the next 8800 GTX.
  • Crespo80 Posts: 604
    Art3mis
    NVLink is an attempt from Nvidia to control high-end PC graphics, but it will take many years to replace the current, still-in-active-development standard (PCI-Express 4.0 is about to come in 2016 with Intel Skylake).
    Major competitors Intel and AMD are moving towards integrated CPU and GPU cores that eventually will provide a good-enough solution for 90% of the consumer PC market, so that NvLink will be used only by professionals (super-computers, workstations, servers) or on high-end multi-GPU consumer setups.
    It's clear CPUs and GPUs won't stay separate for too long, and the slow but steady decline of traditional desktop PCs will eventually make the niche market of consumer high-end mainboards and expansion cards (including GPUs) unprofitable. We will see the end of the traditional PC in maybe another 10 to 15 years; let's say 2031: the death of the PC exactly 50 years after the birth of the very first IBM PC :mrgreen:

    Why am I always OT? :lol:
  • hellary Posts: 236
    Hiro Protagonist
    Luciferous wrote:
    hellary wrote:
    Luciferous wrote:
    I have a 8800 GT SLI (or not SLI for the rift) so pretty old and PCI-E 2.0 which will also slow the whole thing down.
    Amazingly it still plays modern games without the Rift.

    Out of interest, have you tried playing Rift games in SLI, in particular with Split Frame Rendering mode (would need to fiddle with nVidia settings to force that over Alternate Frame Rendering)?

    I have tried SLI with Rift games but not with Split Frame Rendering mode. Is it any good?

    Just seen the road to VR why you should hold off buying a new Rig until next year. I have toyed with the idea of buying a second hand one off Ebay, just to get me by.

    From what I've read, Split Frame Rendering doesn't give as much of a performance boost as Alternate Frame Rendering, but it also doesn't suffer from microstutter, so I'm hoping it'll allow for SLI gains even while Rifting. I've got a 590 which I'd rather not replace until the next generation!
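    The AFR vs SFR trade-off above can be sketched with a toy latency model. This is my own simplification, not anything from Nvidia's docs: assume AFR pipelines frames across the two GPUs (adding roughly one extra refresh interval of delay), while SFR has both GPUs finish the same frame together:

```python
# Toy latency comparison of SLI rendering modes at a VR-style 75 Hz
# target. Assumption (a simplification, not a vendor figure): each
# frame in flight in the AFR pipeline adds one refresh interval of
# motion-to-photon delay, while SFR stays near single-GPU latency.
REFRESH_HZ = 75
FRAME_MS = 1000 / REFRESH_HZ  # ~13.3 ms per refresh

def afr_latency_ms(frames_in_flight: int = 2) -> float:
    """AFR: frames pipelined across GPUs, each adding one interval."""
    return FRAME_MS * frames_in_flight

def sfr_latency_ms() -> float:
    """SFR: both GPUs work on the same frame, ~one interval of delay."""
    return FRAME_MS

print(f"AFR ~ {afr_latency_ms():.1f} ms motion-to-photon")
print(f"SFR ~ {sfr_latency_ms():.1f} ms motion-to-photon")
```

    Under that assumption, AFR buys throughput at the cost of roughly an extra frame of latency, which is why SFR looks more attractive for a head-mounted display even with weaker scaling.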