- Joined: Jan 4, 2024
- Location: Indiana
Can someone explain to me how come the RAM in your computer can't somehow be used by the GPU? Also, in 2024 I think any GPU over $500 should have at least 16 GB of VRAM.
> Also in 2024 I think any GPU over $500 should at least have 16 GIGS of Vram.

Random take... fair enough, lol.
> Because the latency/speed between accessing the system RAM vs the integrated on the GPU vRAM (GDDR6x, for example) is a lot slower.

OK, now I understand. But IMO: I play at 1440p 144 Hz (1 ms GtG) and I do have a few games that come close to 12 GB. Games like Alan Wake 2 and Cyberpunk 2077 I might not be able to play on the highest settings. I have the first Alan Wake on highest settings and it runs at like 50-60 FPS. My two 1080 Ti SC2s can't play Crysis on max settings even using both cards. I know we all want the best when it comes to gaming, but I have learned over time that sometimes you just can't max out the settings.
The ones with 12GB aren't choking at any resolution they're appropriate for. So you'd have more than you'll likely ever use, and pay for it, lol. I'd be plenty happy buying a 12GB card today for 2560x1440 and most 4K games. So long as you aren't modding games, 12GB is plenty... remember most users are at 1080p (~60%), with 4K representing a mere 4% (2560x1440 is around 17%) according to Steam stats. So it's cool to wish for more, but most users just plain don't need it. And by the time they do, the GPU can be too slow for modern titles in the first place.
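To put some rough numbers on why resolution alone doesn't decide VRAM use: here's a quick sketch of the raw render-target size per frame at the resolutions mentioned above (assuming 4 bytes per pixel, RGBA8). The point is that the framebuffer itself is tiny; it's textures and assets that actually fill a 12GB card.

```python
# Raw render-target size per frame at common resolutions, assuming
# 4 bytes/pixel (RGBA8). Real games allocate several buffers plus
# textures/geometry, which is what actually consumes VRAM.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Size of one color buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per buffer")
```

Even at 4K that's only about 32 MiB per buffer, a rounding error against 12GB.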
> I have the first Alan Wake on highest settings and it runs at like 50-60 FPS.

Does that have anything to do with vRAM limits, though? If you're using 12GB it could... otherwise, it's just the game/coding.
> My two 1080 Ti SC2s can't play Crysis on max settings even using both cards.

SLI doesn't add up the memory. It's mirrored.
> ...but I have learned over time that sometimes you just can't max out the settings.

That's true, but it isn't always (I'd say rarely) attributable to a lack of vRAM. If you're running out of vRAM, you have too little card for the resolution you're trying to play at.
> Can someone explain to me how come the RAM in your computer can't somehow be used by the GPU? Also, in 2024 I think any GPU over $500 should have at least 16 GB of VRAM.

GPUs can use system RAM as spillover. The thing is, you usually don't want them to, because it is so much slower to access. Latency has been mentioned, but IMO it doesn't matter: when you're transferring GBs of data, bandwidth is king. Picking a 4070 as an example, it has just over 500 GB/s of bandwidth on-card. Say you have a dual-channel DDR5 system running at 6000 MT/s; that's just under 94 GB/s. But that isn't the only problem. The data has to go over PCIe between the GPU and RAM, and PCIe 4.0 x16 is only about 32 GB/s. Even doubling that for PCIe 5.0, when it eventually arrives on GPUs, you're still quite a bit short. NVLink provided an additional connection outside of PCIe; on GA102 it offered 56 GB/s each direction. It helps, but that's still far below what the GPU's local VRAM is capable of.
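The numbers in that post can be reproduced with simple interface-width x transfer-rate arithmetic. A back-of-envelope sketch (the 504 GB/s figure for the 4070 is its published memory bandwidth; the small gap between "96 GB/s" here and "just under 94" above is just decimal GB vs binary GiB):

```python
# Back-of-envelope bandwidth comparison: on-card VRAM vs system RAM
# vs the PCIe link the spillover traffic has to cross.

def ddr5_dual_channel_gbs(mt_s):
    # 2 channels x 64-bit (8 bytes) per transfer, decimal GB/s
    return mt_s * 8 * 2 / 1000

def pcie_gbs(lanes, gts_per_lane):
    # PCIe 3.0+ uses 128b/130b encoding: ~98.5% of the raw line rate
    # is payload. Result is GB/s per direction.
    return lanes * gts_per_lane * (128 / 130) / 8

vram_4070 = 504.2                        # GB/s, published GDDR6X bandwidth
ddr5_6000 = ddr5_dual_channel_gbs(6000)  # dual-channel DDR5-6000
pcie4_x16 = pcie_gbs(16, 16)             # PCIe 4.0: 16 GT/s per lane

print(f"4070 VRAM:        {vram_4070:.0f} GB/s")
print(f"DDR5-6000 dual:   {ddr5_6000:.0f} GB/s")
print(f"PCIe 4.0 x16:     {pcie4_x16:.1f} GB/s per direction")
```

So even before the PCIe bottleneck, system RAM offers roughly a fifth of the card's local bandwidth, and the PCIe link cuts that to about a sixteenth.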
> GPUs can use system RAM as spillover. The thing is, you usually don't want them to, because it is so much slower to access. [...]

Thank you for that great explanation.
IMO the perceived problem of VRAM quantity lies more with some gamers who have unrealistic expectations of applicable settings. While I still fall into the same trap myself using the following terms, "high" and "ultra" are not standardised across game developers. Appropriate settings can be picked for the hardware; there is no guarantee you can use the highest presets at high resolutions with any GPU, although a 4090 certainly would help. For cross-platform games, the better current-gen consoles on their "performance" setting are roughly equivalent to low-to-medium settings on PC. Anything over that is a nice bonus for PC gamers.
> I'm running 1440p 165hz (120fps+ realistically on most games), and my 8gb are more than enough. The only game so far where I've had to cherry-pick settings because of VRAM was The Last of Us, and with the latest patches even that runs at mostly ultra settings without hitting the limit. Side note though, I don't usually use RT.

I hear you, man. One thing that annoys the piss out of me about armchair reviewers is that for eons they were complaining 8GB was not enough in 202x... while those guys were complaining, I was playing at 4K with my 3070 Ti. And a bunch of those same guys crying about 12GB cards don't even own the damned hardware, lol.
[Image: VLBus video card. The area in red is onboard RAM; the area in yellow is add-on RAM.]

How about we go back to when you could buy RAM to add on to your video card? I know the RAM was a lot slower back then, but still. Matrox Millennium II with 4MB, plus a 4MB RAM add-on card. Even on older ISA video cards you could just pop the chips in; ISA cards had socketed RAM spaces for adding more. The Matrox card had pin-like sockets, think like hooking up your LED/HDD LED/power button type connectors. Though if NV/AMD started doing that, RAM add-on cards are going to cost. Then NV/AMD will pass along the cost of the board redesign and the money lost on you not buying the higher-end card with more RAM. One way or the other they are going to make their money off you, be it a $500 8GB card or a $500 12GB card.
> That is not good!!!

Don't worry about that, programs/apps have priority over system RAM.