
Are the power requirements of top tier videocards just going to keep increasing?


magellan

Member
Joined
Jul 20, 2002
I still remember when AGP videocards had one molex connector for power, and now, looking at my 4090 and its octopus cabling, I wonder if this is the future of videocards or worse. The air cooling solution for my 4090 is similarly outrageous: I had to cut out the HDD cage in my case just to get the 4090 to fit, and it's not even the largest 4090 manufactured today.
 
Max power consumption of the highest-end GPUs may well keep increasing.

Power consumption at a given performance level should stay generally flat or trend down.

Roughly similar performance level:
980 Ti: 250W
1070: 150W
1660Ti/Super: 125/120W
3050 8GB: 130W but we do get the additional RTX features

If we start higher up in the chain, and keep within the RTX era:
2080 Ti: 250W
3070: 220W
4060 Ti 8GB: 160W

What if we instead look across marketing tiers? Let's stick to 80 GPUs:
980: 165W
1080: 180W
2080: 215W
3080: 320W
4080: 320W

In this direction we do see a ramp-up, but it may (or may not) have flattened out. Let's see with the 5080.
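The 80-class ramp works out to these generation-over-generation increases (a quick calculation from the TDP figures quoted above):

```python
# TDP figures for the 80-class cards, as listed above (watts)
tdp = {"980": 165, "1080": 180, "2080": 215, "3080": 320, "4080": 320}

cards = list(tdp)
for prev, curr in zip(cards, cards[1:]):
    pct = 100 * (tdp[curr] - tdp[prev]) / tdp[prev]
    print(f"{prev} -> {curr}: {pct:+.0f}%")
# 980 -> 1080: +9%, 1080 -> 2080: +19%, 2080 -> 3080: +49%, 3080 -> 4080: +0%
```

The big jump is clearly Turing to Ampere; Ada held the line.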

Same but lower down the stack:
960: 120W
1060 6GB: 120W
2060: 160W
3060 12GB: 170W
4060: 115W

This time it's a bit of a roller coaster, but mainstream buyers won't have any trouble with this power level.

There is one practical limit on power consumption: how much power you can pull out of a household socket. But that mostly matters for extreme power users or someone doing beyond-normal household stuff.
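That ceiling is easy to estimate; a sketch assuming a North American 15 A / 120 V circuit, the usual 80% continuous-load guideline, and a ~90% efficient PSU (adjust the numbers for your region):

```python
# Rough ceiling on power available from one household circuit
volts = 120            # North American outlet
amps = 15              # typical breaker rating
continuous = 0.80      # load continuous circuits to 80% of the breaker
psu_efficiency = 0.90  # roughly 80 PLUS Gold territory at typical load

wall_watts = volts * amps * continuous   # 1440 W at the socket
dc_watts = wall_watts * psu_efficiency   # ~1296 W left for the components
print(f"{wall_watts:.0f} W at the wall, ~{dc_watts:.0f} W DC budget")
```

Even a 600 W GPU plus a big CPU fits under that, which is why this limit only bites multi-GPU or shared-circuit setups.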
 
My 1080 Tis go to 235 watts.
 
Have to agree with Mac. I don't see big increases coming to the mainstream, but I can see it on the flagship/halo parts. Also, I believe they are more efficient than their predecessors if you look at performance-per-watt values... that doesn't do much for our power bill or the heat in the room, but they are more efficient when you consider the work they do.
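Performance per watt is just average frame rate divided by board power; a minimal sketch with made-up numbers (the FPS and wattage figures below are hypothetical, purely for illustration):

```python
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# Hypothetical numbers for illustration only
old = perf_per_watt(60, 250)   # 0.24 fps/W
new = perf_per_watt(90, 320)   # ~0.28 fps/W
print(f"newer card does {new / old:.0%} of the older card's work per watt")
```

This is how a card can draw more total watts yet still be the more efficient part.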
 
As power gets more expensive and more brands get into the AI realm, I think we will see a push toward efficiency and lower power consumption. I think this will drive GPU power requirements down as well, since the two are so closely linked. There will always be the biggest enthusiast-style card (think 4090) with unique needs, like the new ASUS BTF, but for the most part I think we will see power draw drop eventually.
 
My 1080 Tis go to 235 watts.

My overclocked 1080 Ti (2152 MHz @ 1.2 V core, VRAM underclocked -503 MHz, or 5002 MHz indicated in Afterburner) was getting absolutely hammered by Metro: Exodus (as was my Rosewill Capstone 750W PSU):

Metro: Exodus, 8 hrs.:
GPU: 60°C, hot spot: 74.9°C, CPU: 83°C, ambient: 73°F (22.8°C), delta T: °C
Power: 546.8 W (HWiNFO64), 468 W (Afterburner)
FB usage: 93%, BUS usage: 48%, VRAM: 4804 MiB
GPU 8-pin #1 min.: 11.5 V, GPU 8-pin #2 min.: 11.52 V, GPU PCIe input voltage: 11.97 V

The voltage drops above were happening even after I connected the daisy-chained male 8-pin PCIe connectors on the two PSU cables to two separate +12V molex cables, to feed more +12V power to the two 8-pin PCIe power connectors. My 1080 Ti never throttled, thanks to the ASUS XOC VBIOS.
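For reference, the ATX spec allows the +12 V rail to sag 5%, down to 11.40 V, so those logged minimums can be sanity-checked; a quick sketch using the readings above:

```python
NOMINAL = 12.0
ATX_MIN = NOMINAL * 0.95   # ATX allows -5% on the +12 V rail -> 11.40 V

# Minimum voltages logged under Metro: Exodus load (from the readings above)
readings = {"8-pin #1": 11.50, "8-pin #2": 11.52, "PCIe slot": 11.97}

for rail, v in readings.items():
    droop = 100 * (NOMINAL - v) / NOMINAL
    status = "within spec" if v >= ATX_MIN else "OUT OF SPEC"
    print(f"{rail}: {v:.2f} V ({droop:.1f}% droop, {status})")
```

So the 8-pin rails were sagging over 4% under load: technically still in spec, but right at the edge for a 750 W unit.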

Metro Exodus Enhanced Edition with RTX effects maxed posted the following power/voltage readings with my 4090 (though it was throttling constantly):

28 hrs.:
GPU: 76°C, hot spot: 83.9°C, CPU: 87°C, ambient: 78°F (25.6°C), max boost: 2745 MHz
Power: 582 W (Afterburner); GPU 16-pin HVPWR power max.: 550 W (HWiNFO64); GPU PCIe +12V input power max.: 16.4 W
FB usage: 95%, BUS usage: 66%, VRAM: 10419 MiB, process RAM usage: 8192 MiB
GPU 16-pin HVPWR min.: 11.8 V, GPU PCIe +12V input voltage min.: 11.865 V

My 4090 was causing system resets with my Rosewill Capstone 750W PSU, so I had to hook it up to its own external PSU.
 
Nvidia 8600 GTS, vmod and a massive heatsink, 1 GHz core and only one molex connector, LESSGO.

Honestly I'm waiting for the next generation of mid-range cards to outperform my 2080, and this is half the reason. I like my cheap electric bill.

(And done are the days of paying $800+ for marginal improvements.)
 
@freakdiablo
You have to remember the dollar was worth a lot more back when the 8600 GTS came out. The price of the 4090 isn't so bad when compared to the prices of the dual-GPU SLI/Crossfire cards of 2014. The 4090 is actually cheaper in inflation-adjusted dollars than the Titan Z was, and about the same as the R9 295X2.
 
Honestly I'm waiting for the next generation of mid-range cards to outperform my 2080, and this is half the reason. I like my cheap electric bill.

The current generation already did that, with ~30W less power draw and 20% higher average FPS (just from a quick look at some tests and games). At least, that's assuming the mid range is the RTX 4070; when you add the "Super", the results are another 10% better. It would be hard for Nvidia to release something worse in the next gen.
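Taking those figures at face value (~30 W less than the 2080's 215 W quoted earlier, 20% higher FPS), the efficiency gap works out roughly like this; a back-of-envelope sketch, not benchmark data:

```python
tdp_2080 = 215            # W, quoted earlier in the thread
tdp_4070 = tdp_2080 - 30  # "~30W less" per the reply above
fps_gain = 1.20           # "20% higher average FPS"

# Relative performance per watt: (fps ratio) / (power ratio)
perf_per_watt_ratio = fps_gain * tdp_2080 / tdp_4070
print(f"4070 delivers ~{perf_per_watt_ratio:.0%} of the 2080's work per watt")
```

Roughly a 40% perf-per-watt gain over two generations at the same tier.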
 