
RTX 40x0 talk

lol, I am becoming so impatient right now, I'm about to get a 3070!
 
If I told you, I'd have to strip you naked and do nefarious things.

... wait. I'd have to kill you. Sorry... getting confused in my old age. :p

LOL! So that's a yes!

I'm so old that I can't tell the difference between nefarious things and a case of the vapors. Vapors is old-people talk for gas. hahahah

Dude, I hear you. Been tempted to buy a 3090. But that's what Nvidia wants you to do. Hoooold out and you won't be sorry. I would be kicking myself if I bought a 3090, and then when the 4090 is released I would be butt hurt for months.
 
If I told you, I'd have to strip you naked and do nefarious things.
don't threaten him with a good time

for sure, I really am starting to look at the 4070. I think they are gimping the 4060 this time around with the memory bandwidth, forcing people to go a step up. Now I forget, don't the Ti models come out like 6 months to a year later? I really would hate to have to wait...

on another note, I might get an Intel A380 card; hopefully they have Smite worked out in the drivers. They updated the game to run on DX11, so I wouldn't have to wait for anything. Clearly this would not be my main card; it would go in my 5th-gen backup PC. lol, or I might just wait for their other ones to come out, then choose on price.
 
if someone could, can you check whether the L2 spec is correct for the 4090/4080/4070 cards?
 
Yeah, as far as I'm aware everything is still rumors and the numbers are changing.
 
I was just asking because I doubt NV would drastically change the L2 cache. I am also assuming what AMD calls the Infinity Cache is its L2, in which case the top card has 96MB. That would match the 4090 specs currently listed on TechPowerUp. As far as I know, you wouldn't be able to confirm how much L2 it has unless NV said. I don't know GPU-Z well enough to say whether it actually polls the L2 size or just matches a device ID against a list built into the program to say, hey, you have XX MB of L2.
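For what it's worth, both approaches described above can be sketched in a few lines. This is just a sketch, assuming a Linux box with the CUDA runtime installed; the lookup-table entries are made-up placeholders, not GPU-Z's actual database (I have no idea which method GPU-Z really uses):

```python
import ctypes

# cudaDevAttrL2CacheSize from the CUDA runtime's device-attribute enum
CUDA_DEV_ATTR_L2_CACHE_SIZE = 38

def l2_size_from_driver(device=0):
    """Ask the CUDA runtime directly -- an actual hardware poll."""
    cudart = ctypes.CDLL("libcudart.so")  # needs the CUDA runtime installed
    value = ctypes.c_int()
    err = cudart.cudaDeviceGetAttribute(
        ctypes.byref(value), CUDA_DEV_ATTR_L2_CACHE_SIZE, device)
    if err != 0:
        raise RuntimeError(f"cudaDeviceGetAttribute failed with code {err}")
    return value.value  # size in bytes

# The other approach: match the PCI device ID against a built-in table.
# These IDs and sizes are illustrative placeholders, not verified values.
L2_TABLE = {
    0x2684: 96 * 1024 * 1024,  # hypothetical 4090 entry
    0x2204: 6 * 1024 * 1024,   # hypothetical 3090 entry
}

def l2_size_from_table(pci_device_id):
    """No hardware poll at all -- just a string lookup by ID."""
    return L2_TABLE.get(pci_device_id)  # None if the card isn't in the list
```

The difference matters for the rumor question: the table method would happily report whatever number the tool's author typed in, whether or not the silicon agrees.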

I still want to know how much this larger L2 pool is going to help gaming performance. With compute workloads I can see it making a big difference.
 
Why do you want to know that specific value? What benefit does it have without knowing the other bits? I've heard clock speeds, CUDA cores, VRAM, but I don't recall anyone asking specifically about L2 on a GPU.

I still want to know how much this larger L2 pool is going to help gaming performance. With compute workloads I can see it making a big difference.
This came in after, lol...

How can you tell??? So many other things were updated and improved in the new arch that it seems impossible to isolate what the L2 cache did for it...?
 
just kind of feel like NV is going, hey, we got 96MB of L2 just like AMD now! Really, how much of that is going to help, or if it doesn't, how much did it increase the TDP of the card for something we didn't need? While the lower cards do have less L2 on them, it still follows suit vs. the 3090 with 6MB. The new 4000s' L2 compared to the 3000s' seems overkill.

if the 4060 is gimped that much by the memory bandwidth, then there is really no need for it to have 48MB of L2 vs. the 3060's 3MB.
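Just to put numbers on that comparison, a quick back-of-the-envelope using the figures floating around this thread (the 4000-series sizes were still rumors at the time):

```python
# L2 cache sizes in MB; the 4000-series figures are rumors from this thread
l2_mb = {
    "RTX 3060": 3, "RTX 3090": 6,    # Ampere, per this thread
    "RTX 4060": 48, "RTX 4090": 96,  # Ada, rumored
}

# Both rumored jumps work out to the same 16x increase
for new, old in [("RTX 4060", "RTX 3060"), ("RTX 4090", "RTX 3090")]:
    ratio = l2_mb[new] / l2_mb[old]
    print(f"{new}: {l2_mb[new]} MB vs {old}: {l2_mb[old]} MB -> {ratio:.0f}x")
```

So if the rumors hold, the whole stack gets the same proportional bump, low end included, gimped bus or not.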

lol, I lost track of what I'm saying and what my point was; imma just sit quietly and wait for reviews.
 
Yeah, I think it's a bit too early to dissect unconfirmed specs. While I'm sure the amount of cache has been set for quite a while, I'd imagine clock speeds and memory speeds are still being tweaked.

My take... judge the cake as a whole, because who cares what kind of flour is used as it doesn't really make or break the cake. It's just one ingredient among many to make a great cake.

I guess I'm not bothered by matching L2 for a pissing contest... I just want to know the FPS results of the whole. :chair: :escape::rock:
 
I downgraded my gaming PC (again), and for what I need, I can't really see the difference. Right now I play on an RTX3060, mainly MMORPGs at 2560x1440 and 100-165FPS, with mixed medium-to-very-high settings. I just like it mixed, as everything is sharper and to me looks better than max details with too many reflections and blurred movement.
If I get anything from the RTX4000 series, it will be for tests, just to run something new. However, I'm more interested in the new AMD cards, and if both were in stores right now, I would probably get AMD.
 
I was tempted to buy the RTX 3060 (12GB) instead of the RTX 3060 Ti (8GB) because it had more VRAM and was cheaper. Reviewers said the extra VRAM in the RTX 3060 was wasted because the card was not fast enough to use it. In the games I play, I don't see my RTX 3060 Ti even using all of its 8GB. How much of the VRAM in your RTX 3060 do you find your games using?
 

In the games that I play, the card uses no more than 6GB. Right now I have HWiNFO64 running in the background for almost a week, and it shows 4.6GB max at some point.
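If anyone wants the same kind of peak logging without HWiNFO, `nvidia-smi` (which ships with the Nvidia driver) can be polled from a script. A rough sketch, assuming an Nvidia card and driver are present; `parse_mib` and `log_peak` are just names I made up:

```python
import subprocess
import time

def parse_mib(nvsmi_output):
    """With --format=csv,noheader,nounits the first line is a bare MiB number."""
    return int(nvsmi_output.strip().splitlines()[0])

def vram_used_mib():
    """One-shot VRAM reading for GPU 0 via nvidia-smi's query interface."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"], text=True)
    return parse_mib(out)

def log_peak(samples=60, interval=1.0):
    """Poll once per interval and keep the max, like leaving HWiNFO running."""
    peak = 0
    for _ in range(samples):
        peak = max(peak, vram_used_mib())
        time.sleep(interval)
    return peak
```

Run `log_peak()` in the background during a gaming session and it returns the highest VRAM use it saw, in MiB.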
I got the RTX3060 only because I needed something lower power for my ITX box, and it couldn't be any longer than the EVGA XC. I also wanted something good enough for games but cheaper, and the next step up, the Ti, cost ~$150 more back then. It wasn't meant to be used in my gaming PC, but recently someone asked if I had some stuff for sale, and in the end I sold my whole gaming PC without a graphics card... but the graphics card that's left is an RX6800XT, and it won't fit in the ITX case. I'm using it for tests and reviews right now, but it will probably go back into a gaming PC when I rebuild one in some time, once something new comes out. I just don't want to spend money on the last gen when the new one will be in stores soon. Recently, hardware gets old so fast, and in case of any upgrade, I have a hard time selling anything at a reasonable price.

The RTX3060Ti is much faster, and most games won't use more than 6-6.5GB even at 4K and higher details, which is hard on stronger cards. So you did well going with the Ti. Nvidia has improved texture compression, so it uses less memory than it would have some generations ago.

I run an Alienware 34" 3440x1440@120Hz. I'm good. :)

Recently I got an MSI MPG ARTYMIS 323CQR, a 32" 2560x1440 165Hz/1ms monitor, because there was a 50% price cut plus an MSI online store discount. I'm not a fan of curved displays, but I figured at this price I would get it. It was about $280. It's pretty good, so I can't complain. The next step would probably be something 4K, but everything I like costs way too much.
 