Subnautica makes my GPU get stupid hot (!)

Blyss4226
Rising Star
Has anyone else had this problem? I love Subnautica, but somehow it literally bakes my GPU and causes it to throttle very quickly. No other game, VR or not, gets it remotely as hot. I have an R9 290 running at 1100 core and 1450 mem, and it's rock solid stable there. I've put probably 100 hours of load on it at those settings with no issues, no overheating, nothing.

I checked with GPU-Z and it shows my GPU pulling 360 (!) watts when running Subnautica. Of course, I tried turning my overclock off. It lasts a little longer (about an hour compared to 30 minutes), but it still ultimately hits around 95C (!), throttles all to hell, and drops me to like 20ish FPS.

Specs and stuff:
i7 3770k @ 4.2Ghz
R9 290  @ 1100 core, 1450 mem (as I said I tried with no OC as well)
16GB Ram
Windows 10 x64
Subnautica latest version, on Steam, on my SSD, running with "Recommended" graphics setting

My only way to play at this point is basically to take a half-hour break every hour or so to let things cool down. I repeat: I do not have to do this in ANY other game, VR or not. It's suspicious. My only idea is that it's doing some crazy shader work that puts a stupid level of load on the card.
Gaming: Intel i7 3770k @ 4.2Ghz | R9 290 | 16GB RAM | 240GB SSD | 1 TB HDD Server: AMD FX 6300 @ 4.4Ghz | GT 610 | 8GB RAM | 240GB SSD | 320GB HDD
34 REPLIES

Synthetic
Rising Star
Be careful with some of these early alpha/beta games... sometimes there's no frame-rate control and you will stress out your hardware.

Blyss4226
Rising Star
Well, removing the side panel of my PC seems to have solved the problem for now. Still, I shouldn't have to.

Rayvolution
Heroic Explorer
It's quite possible the game is poorly optimized on the rendering side; that's the way it is with many Early Access games. During Early Access it's extremely hard for developers to optimize while adding major mechanics and content, because every major change usually bogs down the system or affects other optimizations, so if you focus too much on optimizing-as-you-go the game takes three times as long to develop. Of course, there's a flipside to this too: if you never optimize, you end up with a terrible, under-performing game (*cough* Minecraft *cough*) that gets so messy you can't easily fix it.

For example, in my game (also in Early Access) I tend to find myself doing this constantly:
1. Add a cool new feature, bogging down the game. Release it as an unstable patch (Opt-in beta patch basically) that runs slower than the previous stable patch.
2. Users play it and provide feedback; I make adjustments as needed while working on the next unstable patch, which probably has even more new mechanics/content.
3. After a few times running through step 1 and 2, I eventually go into a pure-optimizing phase and just work on code for a few days, adding absolutely nothing new at all but trying to get the game to run well for most machines.
4. Release a "stable" patch to everyone, then restart the entire process and work on the next unstable patch.

Problem is, somewhere between 1 and 4 the end result is almost always a slower or same-performance game, so I find myself having to put aside a week here or there just to improve the game code and play "catch up". It's not until the tail end of development, when almost all of the major features are in, that you transition into just making the game run as fast as humanly possible, because a lot of the time you spend optimizing tends to unravel later as you do other work on related code.

Thus, in Early Access it's very common to have a game that runs just fast enough on most gaming machines to get by without anyone grabbing the pitchforks. So even though Subnautica isn't *that* graphically intense compared to something like Wolfenstein TNO, DOOM, or Fallout 4, it's probably gobbling a lot more CPU/GPU.

Not to mention playing Subnautica on the Rift pushes almost double the pixels per second compared to a standard 1080p60 display. My gut tells me they're optimizing it for 1080p60. 😄
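The back-of-the-envelope math behind that "almost double" claim, assuming the Rift CV1's 2160x1200 panel at 90 Hz (and ignoring any supersampling the VR runtime adds on top, which would push it even higher):

```python
# Pixel throughput comparison: Rift CV1 panel vs. a standard 1080p60 monitor.
rift_px_per_sec = 2160 * 1200 * 90       # ~233 million pixels/second
monitor_px_per_sec = 1920 * 1080 * 60    # ~124 million pixels/second

ratio = rift_px_per_sec / monitor_px_per_sec
print(ratio)  # 1.875 — just under double the pixel throughput
```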

Blyss4226
Rising Star
The thing is, though, it runs fine (60+ FPS 90% of the time) until my GPU gets so hot that it throttles. It's not that it's badly optimized (and it is BEAUTIFUL LOOKING); it's just that it eventually causes my GPU to clock way down because of heat.

Zambrick
Expert Protege
Change the settings on the video card for that game specifically. Don't let Subnautica run while the video settings are set to "Application controlled". For that game, set the power management to performance instead of quality. Whatever they're using to simulate depth of field kills card performance.

Blyss4226
Rising Star
I will give that a try, thanks for the tip!

Edit: @Zambrick - Where do I find that setting? I can't find it; I've looked all through CCC. (I am not and will not use the Crimson settings app... useless crap it is.)

Also just a note: I keep my GPU fan pegged at 100% when gaming because noise is not a concern for me.

Rayvolution
Heroic Explorer

Blyss4226 said:

The thing is, though, it runs fine (60+ FPS 90% of the time) until my GPU gets so hot that it throttles. It's not that it's badly optimized (and it is BEAUTIFUL LOOKING); it's just that it eventually causes my GPU to clock way down because of heat.


May seem silly, but is it possible that, since you've got a pretty beefy video card, you've never actually pushed it to its limit until now and you don't have adequate cooling?

Blyss4226
Rising Star
I have, though. I put this card through its paces the day I got it. I've kept it at 100% load for many hours at a time (good examples being Fallout 4 maxed out and downsampled from 1440p, or GTA V near maxed with 4x MSAA) - confirmed 100% load by GPU-Z.

It does get hot (~75C), but never hot enough to throttle. I have two front 120mm intakes and one top 120mm exhaust. The PSU intakes and exhausts entirely through itself, so it doesn't affect case temps. The CPU is watercooled and exhausts through a rear 120mm rad fan. And note, as I said, I keep the GPU fan pegged at 100% while gaming because noise doesn't matter (I'm always using headphones of some type).

After noticing this problem, I ran some benchmarks and tests to see if I could heat it up like that with anything else, and I could not. Not even close. The most I managed was 74C with Tomb Raider 2013 maxed with 4x supersampling.
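For pinning down exactly when the throttle point gets hit, GPU-Z's "Log to file" option writes its sensor readings out as CSV, which makes it easy to pull the peak temperature from a session afterwards. A rough sketch (the column name is an assumption — match it against the header row of your own log file):

```python
import csv
import io

# Hypothetical excerpt of a GPU-Z sensor log; real logs have many more
# columns, and the exact header text varies by GPU-Z version.
LOG = """Date,GPU Temperature [C],GPU Load [%]
2016-05-01 12:00:00,68,99
2016-05-01 12:10:00,82,99
2016-05-01 12:20:00,95,99
"""

def max_temp(log_text, column="GPU Temperature [C]"):
    """Return the highest temperature recorded in a GPU-Z style CSV log."""
    rows = csv.DictReader(io.StringIO(log_text))
    return max(float(row[column]) for row in rows)

print(max_temp(LOG))  # 95.0
```

Pointing the same parser at a log captured during a Subnautica session versus a Tomb Raider session would make the temperature gap concrete.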

Synthetic
Rising Star
Sometimes I find games with uncapped GPU usage... they will pump out as many frames as they can, even though you only need 60 or 90, and have no vsync.
As long as it doesn't go above 85-90 degrees it should be safe, especially at default clocks. I don't like pushing the hardware that hard; things like bitcoin mining and some folding put way too much pressure on the cores, and that stress test Prime95 (or one like it) puts such a load on that it has killed a fair few chips...

My folding for CERN was limited to 60-80% CPU usage as well, so this could be avoided.

These are long-period stresses, though.