
What videocard upgrade strategy do most peeps use?


magellan (Member, joined Jul 20, 2002)
I used to wait until I could get a videocard that was at least 100% faster, just so I'd see some real gains in perf., but that was back when videocards were still relatively cheap from a price/performance standpoint. I waited nearly 5 years to upgrade my 1080ti, and only because its performance deficits were really noticeable in Metro: Exodus and I had no RT acceleration. I'm not considering upgrading anything ATM because of financial constraints, but what are some good videocard upgrade strats people are using now that post-cryptocurrency price inflation is here? I'm looking for ideas on whether to immediately sell what you have and move on up when the next-gen videocards are released, to hold onto what you've got until you have no choice but to upgrade, or to wait until the halo-model next-gen cards come out.
 
If {not fast enough} then if {affordable} then {buy faster GPU};
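
Spelled out a little more literally, that rule could look like the sketch below; the fps target, price, and budget figures are made-up placeholders, not recommendations.

def should_upgrade(avg_fps, target_fps, new_card_price, budget):
    # Upgrade only when the current card misses the target AND the new card fits the budget.
    not_fast_enough = avg_fps < target_fps
    affordable = new_card_price <= budget
    return not_fast_enough and affordable

# Example: averaging 48 fps against a 60 fps target, with a $599 card and a $700 budget.
print(should_upgrade(avg_fps=48, target_fps=60, new_card_price=599, budget=700))  # True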

My "main" GPU history going back around 10 years or so:

280X
980 Ti
1080 Ti
(2080 Ti)
3070
4070

I can't remember exactly what monitor I had when, but I think I was at 1080p60 through the 980 Ti era. The 1080 Ti probably went along with getting a 1440p 144 Hz display. The move beyond that was when I wanted to drive a 4k TV, but that's occasional use so I didn't need the highest performance.

All were bought new except for the 2080 Ti, which was bought used after the 30 series was already released. That was during the great shortage. I really wanted a 3080, but no chance. I found the 2080 Ti at a good enough price, which kept me going until I replaced it with a new 3070. Why the sidegrade in performance? I needed the 30 series for VRR support on my TV, and I easily sold the 2080 Ti for near enough what I bought it for. The 4070 was the 3080 I wanted. As long as I don't get a 4k gaming display as my main, I don't need to upgrade beyond the 4070. I'm looking at 4k gaming displays but might wait for next-gen GPUs first.
 
@mackerel
Thanks for the reply. So for a while there you were buying top tier videocards (I also had the 980ti and 1080ti). Metro: Exodus and CP2077 still challenge my 4090 at maxed graphics settings. I can't run The Last of Us Part 1 at the highest (DSR) resolutions, and the same goes for the System Shock remake and Control.

I was looking at upgrading my 1080ti during the Ampere gen, but the cryptocurrency shortage shelved those plans indefinitely. The 2080ti wasn't impressive enough compared to my overclocked 1080ti to be worth it to me.

R U at all worried about the perf. deficit from only having PCIe 3.0 lanes on your Skylake CPU? I've read that w/my 9700k I could be giving up some 5% of the perf. of my 4090 by running it on PCIe 3.0.
 
I've read that w/my 9700k I could be giving up some 5% of the perf. of my 4090 by running it on PCIe 3.0.
There are plenty of articles covering that (3.0 x16 is about 2% slower with a 13900k). Here's one...




....between that and core count/IPC of your processor, you definitely have a glass ceiling in some titles, even at 4k.
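
For rough context on the raw link rates behind that figure (back-of-the-envelope math, not numbers from the linked article): each PCIe generation doubles the per-lane transfer rate, so an x16 slot works out to roughly 15.8 GB/s per direction on 3.0 versus about 31.5 GB/s on 4.0.

def pcie_x16_gb_per_s(gt_per_s):
    # PCIe 3.0 and newer use 128b/130b line encoding; 8 bits per byte; 16 lanes.
    return gt_per_s * (128 / 130) / 8 * 16

print(f"PCIe 3.0 x16: {pcie_x16_gb_per_s(8):.1f} GB/s")   # ~15.8 GB/s per direction
print(f"PCIe 4.0 x16: {pcie_x16_gb_per_s(16):.1f} GB/s")  # ~31.5 GB/s per direction
# Real-world throughput is a bit lower once packet and protocol overhead are counted.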

As far as how I buy: throw out the reviewer side of things, and there's no way I'd have a 4090 for 2560x1440 165. I'd likely be in the 4070 Ti+ range. My goal would be to play any of my games at ultra at 165+ fps. When that starts lacking, it's time. Everyone is going to be different.
 
There are plenty of articles covering that (3.0 x16 is about 2% slower with a 13900k). Here's one...




....between that and core count/IPC of your processor, you definitely have a glass ceiling in some titles, even at 4k.
If you ask me, 2% is nothing.
 
Thanks for the reply. So for a while there you were buying top tier videocards (I also had the 980ti and 1080ti).
Back then I was running at up to 1440p, and for games of that time the 1080 Ti was sufficient.

The 2080ti wasn't impressive enough compared to my overclocked 1080ti to be worth it to me.
Yeah, that's only about a 1-tier upgrade, so not that impactful. Putting aside the feature upgrades, anyway. A 3080 would have been 2 tiers up, but the 2080 Ti was all I could get at the time, eventually settling on a 3070. It is still an upgrade, but as I was moving into 4k use it was weak, hence getting the 4070. This satisfies me for now, but I am wondering where the 5080 will fall for a high-end 4k experience.

R U at all worried about the perf. deficit from only having PCIe 3.0 lanes on your Skylake CPU? I've read that w/my 9700k I could be giving up some 5% of the perf. of my 4090 by running it on PCIe 3.0.
It isn't noticeable outside of running benchmarks. I don't "play" benchmarks any more :D
 
Back then I was running at up to 1440p, and for games of that time the 1080 Ti was sufficient.


Yeah, that's only about a 1-tier upgrade, so not that impactful. Putting aside the feature upgrades, anyway. A 3080 would have been 2 tiers up, but the 2080 Ti was all I could get at the time, eventually settling on a 3070. It is still an upgrade, but as I was moving into 4k use it was weak, hence getting the 4070. This satisfies me for now, but I am wondering where the 5080 will fall for a high-end 4k experience.


It isn't noticeable outside of running benchmarks. I don't "play" benchmarks any more :D
How much of an OC did you get on that 1080 TI?
 
How much of an OC did you get on that 1080 TI?
I ran it stock outside of competitive benching. Below is one hwbot sub and you can click around for others. That wallpaper! Was it for a challenge here?

Edit: better clock on this one.
 
I wait until I have to start turning settings down in games before I buy a new one. If some new game comes out that I can't run at max or near-max settings, I'll buy a new card. My last upgrade was from a 970 to my current 3080, so it tends to be a long time.
 
I'm not someone that needs to run at ultra (in almost every game it's almost indiscernible from high, especially when not looking at side-by-side screenshots and nitpicking pixels). So if I can follow an optimization guide for 4K and get near my 120 Hz refresh rate, I'm good.

I'm also among those who think RT is barely noticeable outside of a couple of examples (Minecraft), so I don't spec for it.

All that said, if I see enough of a boost from my 3080 to a 5080/90 next generation I'll do it, and if it lets me run RT at my display's refresh rate I'll leave it on, but I'm not taking a framerate hit to use it.
 
I'm not someone that needs to run at ultra (in almost every game it's almost indiscernible from high, especially when not looking at side-by-side screenshots and nitpicking pixels). So if I can follow an optimization guide for 4K and get near my 120 Hz refresh rate, I'm good.

I'm also among those who think RT is barely noticeable outside of a couple of examples (Minecraft), so I don't spec for it.

All that said, if I see enough of a boost from my 3080 to a 5080/90 next generation I'll do it, and if it lets me run RT at my display's refresh rate I'll leave it on, but I'm not taking a framerate hit to use it.
I bet the prices for 5000 GPUs will be high. :bang head
 
I wait until I have to start turning settings down in games before I buy a new one. If some new game comes out that I can't run at max or near-max settings, I'll buy a new card. My last upgrade was from a 970 to my current 3080, so it tends to be a long time.
IMO I'm ok with turning down settings, to a degree. Just be aware that there is no standard for what low, medium, high, and ultra/max actually mean. Some games fall apart at medium while others can still look great at low. It's a balance of both quality and performance we're looking for, not arbitrary labels on presets.
 
Indeed.

Still, generically, ultra is the IQ goal for most users. Unless you're a competitive gamer, where turning settings down benefits you in some way (be it higher FPS or seeing people better), that's typically your goal, followed closely, if not 1A, by FPS. There, the goal is to hit a minimum of 60 FPS (some genres fare fine below that value) or whatever the refresh rate of your monitor is. Ideally, you don't want to walk into a new video card unable to run ultra settings at your refresh rate. Regardless of what small difference it may make, not being able to run ultra on day 1 is a mental hurdle, to me.
 
I'm not someone that needs to run at ultra (in almost every game it's almost indiscernible from high, especially when not looking at side-by-side screenshots and nitpicking pixels). So if I can follow an optimization guide for 4K and get near my 120 Hz refresh rate, I'm good.

I'm also among those who think RT is barely noticeable outside of a couple of examples (Minecraft), so I don't spec for it.

All that said, if I see enough of a boost from my 3080 to a 5080/90 next generation I'll do it, and if it lets me run RT at my display's refresh rate I'll leave it on, but I'm not taking a framerate hit to use it.
I used to think RT was barely noticeable as well, but in Control the diff. between RT and non-RT is instantly noticeable -- practically everything that can reflect light has reflections w/RT on (incl. windows and water). I believe even the enemies' visors feature reflections w/RT on in Control. In Metro Exodus, though, I really didn't notice the RT effects in the Enhanced Edition over the regular version.

Resident Evil 7 also has RT effects, but the game is so dark the only difference I noticed was that water reflections were more accurate:

One of the comments about RE7 RT effects in the youtube video:
'RT gave more light to the world in general. I could finally see in dark zones. I literally couldn't see the door in Main House 1F Laundry Room because of the dark XD. So Ray Tracing was way better option for me'

Maybe in CP2077 or its DLC the RT effects are more noticeable but I've never really checked, because, while I have the game, I've never played it.
 
My strategy has loosely been based on availability coupled with my trips back to the US (exchange rate FTW). I can't remember how the entire history has gone (starting with a TNT2), but the last bits have been from 8800 GTs in SLI to, I think, a GTX 460 -> GTX 570s in SLI -> OG Titans in SLI -> 1080 Ti -> 2080 Ti -> 3080 Ti.

I'll hold off for the 5000 series at this point. I'm doing 5k2x ultrawide and it works fine for now. I've alluded to looking at the Samsung G95NC, so that will require some serious horsepower, and there's no need to pick up a current-gen card when the 5000 series is essentially right around the corner. Not to mention I will need DP 2.1, which the 4000 series lacks.
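
For a sense of why DP 2.1 matters there, here is a rough uncompressed-bandwidth estimate; it assumes the G95NC's advertised 7680x2160 at 240 Hz with 10-bit color and ignores blanking and DSC, so treat it only as a ballpark.

def raw_video_gbps(width, height, refresh_hz, bits_per_pixel=30):
    # Uncompressed pixel data rate in Gbps (no blanking intervals, no DSC).
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"{raw_video_gbps(7680, 2160, 240):.0f} Gbps")  # ~119 Gbps
# DP 1.4 carries roughly 25.9 Gbps of payload, so a panel like that leans on DSC
# and/or DP 2.1's higher UHBR link rates.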
 
When I can't run a game that I want to play. There aren't many games I like playing, so I get quite a bit of life out of my card. After playing The Last of Us in HDR, I want to get a TV that can do HDR@120Hz@4K. I'm not that interested in RTX, but it's a bonus. An 8900XTX/9900XTX is probably on the cards....
 
I had the following
GTX 680 -> GTX 970 -> GTX 1080 -> GTX 1080 Ti -> RTX 2080S -> RTX 2080 Ti -> RTX 3080 Ti -> RTX 4090

My upgrade strategy? Buy low, sell high... Except for the 3080, all of those cards were bought used.
 