
Gigabyte RTX 2060 12GB or MSI RTX 2070 Super 8GB?


Droidriven

I'm trying to decide between these two cards. I can get the 2070 Super 8GB for about $30 less than the 2060 12GB. Benchmarks show the 2070 Super is slightly faster than the 2060, but I'm wondering if 8GB vs 12GB should be a factor in my decision.
 
If you only game at 1080p... 8GB will be fine for a bit.

What price are you getting these for?
 
If you only game at 1080p... 8GB will be fine for a bit.

What price are you getting these for?
I'll tell you that after I buy it. The deal is, the 2070 Super 8GB I can get is $30-35 cheaper than the 2060 12GB, but I have to drive 45 minutes away to pick it up, while the 2060 would have to be shipped. And I have a 1440p monitor, so I'd like to play at 1440p if I can.

Also, I plan to carry whichever card I get forward to my next rig on a 14th Gen DDR5 board in a year or so.
 
Yikes... tough call. 8GB is probably fine in most titles and settings. Just keep an eye on VRAM use.
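
If you want an easy way to keep an eye on it while you play, a command like this (assuming an NVIDIA card and the standard nvidia-smi tool that ships with the driver) will log used vs. total VRAM every 5 seconds:

nvidia-smi --query-gpu=memory.used,memory.total --format=csv -l 5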
 
Yikes... tough call. 8GB is probably fine in most titles and settings. Just keep an eye on VRAM use.
I don't play any titles that depend on really high frame rates. I don't do any of the first-person/MMO shooters or Cyberpunk or heavily demanding titles like that. I think the 8GB will be fine too, but I will probably use this card for at least 3-4 years, so my only concern is playing titles that come out between now and then.

I think I'll go ahead and get the 2070 Super; I can get it for $165. The 2060 12GB is $190.
 
The 2070 Super will be faster in most titles. Since it's an older generation, it won't have the same VRAM problems as the latest PCIe x8 cards. At least RTX 3000-series cards with 8GB, like the 3070, don't have those problems or the significant FPS drop that the RTX 4060 or RX 7600 show.

It's not about high FPS. It's more about some games with high graphics details that use large textures. If you lower the graphics details, then nearly all games will be fine.
 
I think the 8GB will last for a while. :)

Unless you want to play AAA titles with max details, RT, and high-quality textures at 1440p+ (more like 4K, but there are exceptions). The Gigabyte RX 7900 GRE 16GB from the last review reached 16GB+ VRAM use in benchmarks (GPU-Z screenshots with the readings are in the review).

It's easy to recommend a new high-end graphics card, but prices are still far from reasonable, and most gamers can live with a one-step-lower display resolution or details set to mid or high instead of ultra. Then 8GB is more than enough. Ray tracing in most games also isn't that spectacular, but it's still very demanding and causes a 30%+ FPS drop.
 
Unless you want to play AAA titles with max details, RT, and high-quality textures at 1440p+ (more like 4K, but there are exceptions). The Gigabyte RX 7900 GRE 16GB from the last review reached 16GB+ VRAM use in benchmarks (GPU-Z screenshots with the readings are in the review).
That's a very good point. :unsure:
 
Unless you want to play AAA titles with max details, RT, and high-quality textures at 1440p+ (more like 4K, but there are exceptions). The Gigabyte RX 7900 GRE 16GB from the last review reached 16GB+ VRAM use in benchmarks (GPU-Z screenshots with the readings are in the review).
To be fair, most people's goal is to run things on ultra (RT is a different story, IMO). He will undoubtedly need to turn down some settings in games like that sooner rather than later. 8GB is the minimum I'd have on tap today for 1440p gaming.
 
Unless you want to play AAA titles with max details, RT, and high-quality textures at 1440p+ (more like 4K, but there are exceptions). The Gigabyte RX 7900 GRE 16GB from the last review reached 16GB+ VRAM use in benchmarks (GPU-Z screenshots with the readings are in the review).

It's easy to recommend a new high-end graphics card, but prices are still far from reasonable, and most gamers can live with a one-step-lower display resolution or details set to mid or high instead of ultra. Then 8GB is more than enough. Ray tracing in most games also isn't that spectacular, but it's still very demanding and causes a 30%+ FPS drop.
Put in those terms, the 1650 4GB I'm currently using will do. It plays everything I want to play with no lag; I'm getting 40-60 FPS, everything still looks and runs well in everything I play, and I don't play anything that requires extremely snappy response timing.
 
I never had any trouble with my 8GB 2060 Super (which I'm told was not much different from a 2070 Super or a regular 2070) at 4K. I think I'm at least halfway through Cyberpunk 2077. I had a little bit of trouble towards the end... but that turned out to be my Corsair power supply being worthless for gaming.

Now I have an 8GB 4060 and I'm STILL gaming in 4K in 2024. It's not the vRAM that's going to stop you from being able to game in 4K... it's the "everything else."

So a 4GB difference is negligible.

One day I might consider a 16GB card... but 8GB vs 12GB?? Meh...
 
I never had any trouble with my 8GB 2060 Super (which I'm told was not much different from a 2070 Super or a regular 2070) at 4K. I think I'm at least halfway through Cyberpunk 2077. I had a little bit of trouble towards the end... but that turned out to be my Corsair power supply being worthless for gaming.

Now I have an 8GB 4060 and I'm STILL gaming in 4K in 2024. It's not the vRAM that's going to stop you from being able to game in 4K... it's the "everything else."

So a 4GB difference is negligible.

One day I might consider a 16GB card... but 8GB vs 12GB?? Meh...
I mean..... it depends. It's not black and white. There are plenty of titles where 8GB is eclipsed at 4K. That said, not all titles that breach your vRAM capacity respond the same way. Some titles allocate what they can but may not use it all, so a card with less capacity may still perform well. Others need everything they allocate and take a hit when passing that mark. So while 'everything else' is a big problem, vRAM is, without a doubt, a valid consideration. 8GB vs. 12GB could be the difference between playable and not in specific titles. The higher the res and the higher the settings you want, the more potential there is to go past that amount. For giggles, I took screenshots from TPU's recent game reviews that show vRAM use. I went back to 2023, skipping only CS2 b/c, well, that looks like my kid drew it and isn't made for beauty, but for tick rate.

At 4K, there's no way I'd go less than 12GB today for best results over the lifespan of the card and assuming you're shooting for 'ultra' gaming. YMMV, of course, depending on what titles you play, the settings, and the basic horsepower of the card in the first place, but me, I'm going higher than 8GB for the games I play..... if I played at 4K (or even 1440p). :)

 
I mean..... it depends. It's not black and white. There are plenty of titles where 8GB is eclipsed at 4K. That said, not all titles that breach your vRAM capacity respond the same way. Some titles allocate what they can but may not use it all, so a card with less capacity may still perform well. Others need everything they allocate and take a hit when passing that mark. So while 'everything else' is a big problem, vRAM is, without a doubt, a valid consideration. 8GB vs. 12GB could be the difference between playable and not in specific titles. The higher the res and the higher the settings you want, the more potential there is to go past that amount. For giggles, I took screenshots from TPU's recent game reviews that show vRAM use. I went back to 2023, skipping only CS2 b/c, well, that looks like my kid drew it and isn't made for beauty, but for tick rate.

At 4K, there's no way I'd go less than 12GB today for best results over the lifespan of the card and assuming you're shooting for 'ultra' gaming. YMMV of course depending on what titles you play, the settings, and the basic horsepower of the card in the first place, but me, I'm going higher than 8GB for the games I play..... if I played at 4K. :)

IMO, 12GB to 16GB is enough to play 1440p games at max or high settings, and 8GB for lower settings at 1080p.
 
I mean..... it depends. It's not black and white. There are plenty of titles where 8GB is eclipsed at 4K. That said, not all titles that breach your vRAM capacity respond the same way. Some titles allocate what they can but may not use it all, so a card with less capacity may still perform well. Others need everything they allocate and take a hit when passing that mark. So while 'everything else' is a big problem, vRAM is, without a doubt, a valid consideration. 8GB vs. 12GB could be the difference between playable and not in specific titles. The higher the res and the higher the settings you want, the more potential there is to go past that amount. For giggles, I took screenshots from TPU's recent game reviews that show vRAM use. I went back to 2023, skipping only CS2 b/c, well, that looks like my kid drew it and isn't made for beauty, but for tick rate.

At 4K, there's no way I'd go less than 12GB today for best results over the lifespan of the card and assuming you're shooting for 'ultra' gaming. YMMV, of course, depending on what titles you play, the settings, and the basic horsepower of the card in the first place, but me, I'm going higher than 8GB for the games I play..... if I played at 4K (or even 1440p). :)


That's... WEIRD. I didn't see these images AT ALL in your post... but I see them now that I'm quoting you in the reply. Let me scroll up...

Ah... Okay... "Spoiler Button"... but it's NOT a spoiler! I should report you to the... oh yeah... never mind. :D

There's a general misunderstanding in your interpretation of those TPU results (since I actually play those games): that's just how much vRAM those games use IF YOU HAVE IT. I never had that much vRAM... and I've been playing Cyberpunk, in 4K, since back when I had a 4GB GTX 960 (which, I'm proud to say, I still own).

If you DON'T have 12GB of vRAM... then all those games will just use whatever you have. So it's not like "Oh I only have 8GB of vRAM... the game won't work..."

It doesn't work that way. :)

And I mean I play Cyberpunk COMFORTABLY... 63-70 FPS... in 4K. And NOT at "Very Low" quality either. You've seen my setup.

There's a LOT of space between "Very Low" and "Ultra."

I'm not sure what would happen if I switched everything to Ultra on my new 4060... but I'm pretty sure I could still play the game. Doubtlessly I'd take a hit in performance... but it's not like it would crash or something. (I haven't had a game or system crash in over ten years now... not even when my power supply was going berserk.)

If there were a way I could export and then import my EXTREMELY COMPLICATED SETTINGS... then I'd switch it to Ultra just to see what happens. But using mostly "Medium" settings... I've had no trouble at all with Cyberpunk or Alan Wake.

On top of that... I'm pretty sure like NINETY PERCENT of gamers are running on 8GB cards. That's just the default vRAM allotment. Most pre-built systems feature 8GB cards, and I'm pretty sure RTX 4060s and whatever their AMD equivalents are outsell their 4080 counterparts by some ridiculous margin. Programmers are well aware of this. So take something brand new like RoboCop: I started it up, it auto-detected my settings... and I'm off to the races in 4K without having to do anything.

That's generally the 8GB 4K experience. Not just now... but for YEARS now.
 
Yes and no. As ED said, it's not black and white. Most games will auto-select quality settings, but some won't, and others will pick them wrong. Some will play "fine" above the allotted buffer by intelligently swapping the data in it, some will have massive performance loss and hitching/stuttering, some will downright crash, and some will simply refuse to load. It all depends on the programming/graphics engine involved; check out my thread on The Last of Us for example.

Completely agree (and have done it myself for as long as I can remember, since I've never had high-end systems) that most games have a ton of settings you can lower/tweak without seeing almost any visual difference. Cyberpunk, for example, is a game that looks good even at low settings at 1440p/4K, so this is a good way to reduce the load and improve FPS 👍🏻
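
If anyone wants to see what their own games are actually grabbing, here's a rough sketch using Python and the nvidia-ml-py package (imported as pynvml); this is just my own illustration, not something from the TPU reviews, and the per-process numbers are allocations rather than what the engine actively needs each frame (on Windows/WDDM they can come back as N/A):

import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Board VRAM in use: {mem.used / 2**20:.0f} / {mem.total / 2**20:.0f} MiB")

    # Per-process figures are what each app has allocated, which is not
    # necessarily what it actively touches every frame; None on some drivers.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = "n/a" if proc.usedGpuMemory is None else f"{proc.usedGpuMemory / 2**20:.0f} MiB"
        print(f"  pid {proc.pid}: {used}")
finally:
    pynvml.nvmlShutdown()

Run it while a game is loaded and compare the per-process line against the board total; that gap is the "allocated vs. actually needed" distinction being discussed here.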
 
There's a general misunderstanding in your interpretation of those TPU results (since I actually play those games): that's just how much vRAM those games use IF YOU HAVE IT. I never had that much vRAM... and I've been playing Cyberpunk, in 4K, since back when I had a 4GB GTX 960 (which, I'm proud to say, I still own).

If you DON'T have 12GB of vRAM... then all those games will just use whatever you have. So it's not like "Oh I only have 8GB of vRAM... the game won't work..."

It doesn't work that way. :)
100%, there is NOT a misunderstanding in my interpretation. I would, however, say there was a 100% chance you misinterpreted mine. I'd even go as far as saying we agree..... and you added details supporting those thoughts. :)

To be clear, when I said....
8GB vs. 12GB could be the difference between playable and not in specific titles.
I wasn't saying that games simply "will not work/load". By playable, I was talking about the 60 FPS threshold people generally shoot for (and again, I'm well aware that for some titles, you can get away with less and still have a solid gaming experience).

There's a LOT of space between "Very Low" and "Ultra."
Agree. That was covered in this passage...
YMMV, of course, depending on what titles you play, the settings,

On top of that... I'm pretty sure like NINETY PERCENT of gamers are running on 8GB cards. That's just the default vRAM allotment. Most pre-built systems feature 8GB cards, and I'm pretty sure RTX 4060s and whatever their AMD equivalents are outsell their 4080 counterparts by some ridiculous margin. Programmers are well aware of this. So take something brand new like RoboCop: I started it up, it auto-detected my settings... and I'm off to the races in 4K without having to do anything.
You're not wrong (that most users are running 8GB cards). But I think the significant part you're missing (that I've mentioned in other threads to you) is that only ~3% of Steam gamers have a 4K monitor. I'd also venture to say that most people who have a 4K monitor are not rocking entry-level graphics cards at that resolution, nor shooting for less than ultra (that is the ultimate goal, right? - if you have the horsepower, that's generally where you try to set things). Surely you aren't alone, but I can't imagine it's common for gamers to go 4K with literally the least expensive current-gen GPU. ;)

Just understand that there are experiences other people have/had (like games crashing or not running due to lack of vRAM) that are valid too. Like mine and other GPU reviewers', who have experienced those things (and more, like what Kenrou said) with budget cards. At the risk of this being said a third time... it's not black and white. ;)

Some will play "fine" above the allotted buffer by intelligently swapping the data in it, some will have massive performance loss and hitching/stuttering, some will downright crash, and some will simply refuse to load. It all depends on the programming/graphics engine involved; check out my thread on The Last of Us for example.
QFT. The reviews I took those screenshots from (in the spoiler tags) state as much too. Some games had a note below the vRAM chart saying *see how it responds in the conclusion*, which goes on to discuss what performance was like when surpassing those values. It varied. ;)

Anyway, I'm pretty sure, for the most part, we agree. The details you provided, Rainless (you too, K), will surely be helpful for others to understand the complexities of the 'not black and white' issue. :)
 