NVIDIA G-Sync

tbhausen
Explorer
This is a game-changer--literally:

http://nvidianews.nvidia.com/Releases/NVIDIA-Introduces-G-SYNC-Technology-for-Gaming-Monitors-Tears-...

Oculus VR, please incorporate this into the Consumer Rift if possible 🙂
11 REPLIES

cegli
Honored Guest
This will be useful if it's open and can be implemented by AMD and other GPU manufacturers. If not, I'm afraid it will be forgotten like 3dfx Glide, PhysX, CUDA, etc. If not all cards can do it, it will be niche and eventually replaced by an open standard. Same goes for AMD's new audio processing!

cybereality
Grand Champion
Sounds really awesome, and I will certainly love to see this in person.

Was looking at getting a new 3D Vision monitor (144Hz) but I may wait to see what happens with this.
AMD Ryzen 7 1800X | MSI X370 Titanium | G.Skill 16GB DDR4 3200 | EVGA SuperNOVA 1000 | Corsair Hydro H110i | Gigabyte RX Vega 64 x2 | Samsung 960 Evo M.2 500GB | Seagate FireCuda SSHD 2TB | Phanteks ENTHOO EVOLV

mdk
Honored Guest
"cegli" wrote:
This will be useful if it's open and can be implemented by AMD and other GPU manufacturers. If not, I'm afraid it will be forgotten like 3dfx Glide, PhysX, CUDA, etc. If not all cards can do it, it will be niche and eventually replaced by an open standard. Same goes for AMD's new audio processing!


Will be forgotten like CUDA?!?!?

Dude, CUDA is widely used by many, many people. Not in gaming, but in GPU computing. I use a CUDA-based renderer for 3D visualization at work.

cegli
Honored Guest
"cybereality" wrote:
Sounds really awesome, and I will certainly love to see this in person.

Was looking at getting a new 3D Vision monitor (144Hz) but I may wait to see what happens with this.


Yes, this is a much more elegant solution than a high-refresh-rate monitor with Vsync. G-Sync is definitely the way forward. Here's the relevant Twitter exchange with John Carmack:

"John Carmack" wrote:
G-Sync won't work on any of the display panels Oculus is considering right now, but it will probably be influencing many future panels.

"Jeff Atwood ‏" wrote:
Great tech, but can it really survive as an NVIDIA only thing, versus an industry standard everyone could support?

"John Carmack" wrote:
I think everyone will wind up with it, or something similar. It is low hanging fruit.

It will be great if everyone really adopts it. Fixed refresh rates don't make any sense on computer monitors. 120Hz monitors that sync to the GPU's frame rate, plus LightBoost? Yes please!
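To make that concrete, here's a toy C sketch of the timing argument. The frame times are made up, and it ignores real buffering back-pressure, so it's illustrative only: with fixed-refresh Vsync, a frame that misses the scanout deadline slips a whole refresh interval, while with G-Sync-style variable refresh the on-screen gap simply tracks the render time.

/* Toy simulation (not real driver code): fixed-refresh Vsync vs.
 * G-Sync-style variable refresh. Render times below are made up. */
#include <stdio.h>
#include <math.h>

int main(void) {
    const double refresh = 1000.0 / 60.0;  /* 60 Hz scanout period, in ms */
    const double render_ms[] = { 14.0, 18.0, 15.0, 21.0, 16.0, 17.0 };
    const int n = (int)(sizeof render_ms / sizeof render_ms[0]);

    double ready = 0.0;                    /* when the GPU finishes each frame */
    double prev_vsync = 0.0, prev_vrr = 0.0;

    printf("%-6s %-10s %-12s %-12s\n", "frame", "ready(ms)", "vsync gap", "g-sync gap");
    for (int i = 0; i < n; i++) {
        ready += render_ms[i];
        /* Fixed refresh: the frame waits for the next vsync boundary. */
        double shown_vsync = ceil(ready / refresh) * refresh;
        /* Variable refresh: the panel scans out as soon as the frame is done. */
        double shown_vrr = ready;
        printf("%-6d %-10.1f %-12.1f %-12.1f\n", i,
               ready, shown_vsync - prev_vsync, shown_vrr - prev_vrr);
        prev_vsync = shown_vsync;
        prev_vrr = shown_vrr;
    }
    return 0;
}

With these made-up numbers, the vsync column sits at ~16.7 ms until the 21 ms frame misses its deadline and the gap doubles to ~33.3 ms, while the g-sync column just stays within a few ms of the actual render times. That hitch is exactly the stutter G-Sync is meant to eliminate.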

"mdk" wrote:

Will be forgotten like CUDA?!?!?

Dude, CUDA is widely used by many, many people. Not in gaming, but in GPU computing. I use a CUDA-based renderer for 3D visualization at work.


Yes, I know it's still used right now, but it will eventually be forgotten. Things will slowly shift to OpenCL, or whatever future open standard comes along. There was a point in history where everything used 3dfx Glide, but then Direct3D and OpenGL came out. Slowly but surely, CUDA will fade, trust me.

mdk
Honored Guest
"cegli" wrote:
Yes, I know it's still used right now, but it will eventually be forgotten. Things will slowly shift to OpenCL, or whatever future open standard comes along. There was a point in history where everything used 3dfx Glide, but then Direct3D and OpenGL came out. Slowly but surely, CUDA will fade, trust me.


If you look at it like that, then everything will eventually be forgotten once it's replaced by newer technology, including all the open standards. 😉

cegli
Honored Guest
"mdk" wrote:
Yes, I know it's still used right now, but it will be eventually forgotten. Things will slowly shift to OpenCL, or any future open standard. There was a point in history where everything used 3DFX Glide, but then Direct3D and OpenGL came out. Slowly but surely CUDA will fade, trust me.


If you look it like that then everything will be forgotten eventually when replaced with new technology including all open standards. 😉


Haha, yes, but some fade faster than others. Look how long Direct3D and OpenGL have lasted. It just makes sense: if you're starting a project now, would you rather pick the API that supports 100% of all current cards being produced, or the one that supports 16.1% (Nvidia's current GPU market penetration)?

JoeReMi
Honored Guest
16.1% sounds very low...

cegli
Honored Guest
Yes, the 16.1% is of the overall GPU market, where Intel has a 62.0% share and AMD has 21.9%. The trend will only continue as iGPUs become more powerful!

Nvidia has something like 60% of the discrete market, which would put discrete cards at only around a quarter of all GPUs shipped (16.1% / 0.60 ≈ 27%).

Frito
Explorer
Nice, but I would like to see the price tag on G-Sync-enabled monitors... :?
Backer "Have faith." -Palmer Luckey