
Ivy Bridge Rumors

We can't say; the date is covered under NDA. There are a lot of rumors around indicating dates, and those rumors are just above in this thread.
 
If I had to go out on a limb and guess, I would guess 1-3 minutes after the NDA drops, which may or may not be the 23rd of this month at 12:01 AM Pacific time.

I don't actually know; the NDA I signed (for a different site, covering the Z77 chipset but also listing IB results) was an older one and had a later launch date.
 
Is it the 23rd yet? Sad thing is I'll probably wait a month to see what happens with the market, both in prices and the release of IB-based mobos. I guess anything would be an upgrade from my 925, plus I'm not really ready to rebuild my WC loop after only a few weeks of use. :)
 
That's quite a statement. I'll be interested to see what Mr. Kill-A-Watt has to say in a week or so.
 
TDP is a worst-case figure: roughly the power dissipated at the highest VID, used to approximate the cooling a chip needs.

Both my i7-940 and i7-950 have a TDP of 130W. The DES chip on my GB board reportedly reads dissipated power accurately, and at stock settings both were between 80 and 90 watts under Prime load. But after dialing in the 1.35V max-VID vcore from the white paper, both the 940 and 950 were in the 122-128W range at load. Same with my old E8400/E8500: stock power usage was significantly below TDP.

So even if Ivy's TDP is 95W, at typical stock settings it should still be using 77W or lower. In fact, if you find an Ivy that runs at 77W at typical stock settings, that suggests 77W isn't the proper TDP for that chip.
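
If you want the back-of-the-envelope version: assuming the usual rule of thumb that dynamic power scales with voltage squared (and clock), a stock reading well under TDP that climbs toward TDP at max VID is exactly what you'd expect. The voltages and stock wattages below are illustrative guesses; only the 130W/95W/77W TDP figures come from the discussion above.

Code:
# Rough sketch of the TDP-vs-stock-power argument, assuming the common
# dynamic-power approximation P ~ C * V^2 * f. Every figure below except
# the TDPs mentioned in the thread is an illustrative placeholder.

def scale_power(p_stock_w, v_stock, v_max, f_stock_ghz=None, f_max_ghz=None):
    """Estimate load power at a higher VID (and optionally clock) from a stock reading."""
    scale = (v_max / v_stock) ** 2
    if f_stock_ghz and f_max_ghz:
        scale *= f_max_ghz / f_stock_ghz
    return p_stock_w * scale

# i7-940/950-style case: ~85 W measured at stock, pushed to the 1.35 V max VID.
print(scale_power(85, v_stock=1.20, v_max=1.35))   # ~108 W, heading toward the 130 W TDP

# Ivy-style case: a part loafing along at ~60 W stock still fits under a 77 W TDP
# once the worst-case VID is applied.
print(scale_power(60, v_stock=1.05, v_max=1.15))   # ~72 W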
 
That's quite a statement. I'll be interested to see what Mr. Kill-A-Watt has to say in a week or so.

Yeah, when I get my retail 3770K after they are released (yes, I'm buying one), I plan to put the old Kill-A-Watt on the system with both SB and IB while testing and comparing temps at various overclocks.
 
Saw this on another site. A lot of conjecture there, although I can see why Intel would want to make sure manufacturers use robust-enough (or better-than-enough) VRM sections, given that people might want to buy into a Z77 board but opt for a 2500K depending on pricing and availability. They wouldn't want to be stuck in the position AMD was in, where some manufacturers' boards (even 990-chipset-based boards) had insufficient VRM sections or lacked VRM heatsinks, so higher-TDP chips or overclocking could be problematic.

It probably doesn't hurt their booming sales that the 2500K has a reputation for 4.2GHz+ on any board, so you'd want to make sure the motherboard lines support that instead of ending up with a Gigabyte P55 situation with spontaneously combusting voltage regulators.

Overall they've handled the Ivy Bridge launch well, as there's a ton of pent-up demand in the (albeit minor) enthusiast and system-builder segment judging by other forums, though it doesn't look like it'd be worth moving to IB from SB unless you're currently on an i3.
 
Yeah, when I get my retail 3770K after they are released (yes, I'm buying one), I plan to put the old Kill-A-Watt on the system with both SB and IB while testing and comparing temps at various overclocks.

That I would like to see, exact same system, and just switching out cpu.
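
If anyone wants to write such a swap-the-CPU comparison down in a consistent way, something like the sketch below would do. The chips and the Kill-A-Watt-at-the-wall methodology come from the posts above, but every reading and score here is a made-up placeholder.

Code:
# Hypothetical tabulation of a same-system SB-vs-IB swap measured at the wall
# with a Kill-A-Watt. All numbers are placeholders, not real results.
runs = [
    # (chip, clock_ghz, wall_watts_load, benchmark_score)
    ("2600K (SB)", 4.5, 210, 100.0),
    ("3770K (IB)", 4.5, 185, 105.0),
]

for chip, clock, watts, score in runs:
    print(f"{chip}: {clock} GHz, {watts} W at the wall, {score / watts:.2f} points per watt")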
 
Overall they've handled the Ivy Bridge launch well, as there's a ton of pent-up demand in the (albeit minor) enthusiast and system-builder segment judging by other forums, though it doesn't look like it'd be worth moving to IB from SB unless you're currently on an i3.

Or C2Q 65nm chip... :rain:
 
I'll be moving my extreme benching operations.
Whether my general use operations make the move or not I don't know yet, I need to have a chip or two in hand first.
 
I'll be moving my extreme benching operations.
Whether my general use operations make the move or not I don't know yet, I need to have a chip or two in hand first.

Yeah. Z77 and IB for me, even though I really don't want one.
 
Not really; AMD's plan is to make up for poor x86 performance by having apps coded to make use of the GCN instructions included in their latest GPUs. That should be useful for a wider range of compute tasks than the older VLIW-based architectures, and conversely, x86 can't hope to compete on the tasks that work well on GPUs.

Most likely that gets uptake on the supercomputers and workstations, and possibly not much else initially. There should be a trickle-down effect to mainstream users over a few years, though.
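
To put the "works well on GPUs" point in concrete terms, the split is roughly wide, branch-free, data-parallel math versus serial, branchy, dependent code. Here is a toy illustration; NumPy just stands in for the GPU-friendly path to show the shape of the work, it is not actually running on GCN.

Code:
# Conceptual only: the kind of work GPU compute (GCN or otherwise) eats up
# versus the kind that stays on the x86 cores. NumPy is a stand-in here.
import numpy as np

data = np.random.rand(1_000_000)

# GPU-friendly: the same simple operation applied across a huge array.
gpu_style = np.sqrt(data) * 2.0 + 1.0

# CPU-friendly: serial and branchy, each step depending on the last.
total = 0.0
for x in data[:1000]:
    total = total * 0.5 + (x if x > 0.5 else -x)

print(gpu_style.mean(), total)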
 
Not really; the plan is to make up for poor x86 performance by having apps coded to make use of the GCN instructions included in their latest GPUs. That should be useful for a wider range of compute tasks than the older VLIW-based architectures, and conversely, x86 can't hope to compete on the tasks that work well on GPUs.

Most likely that gets uptake on the supercomputers and workstations, and possibly not much else initially. There should be a trickle-down effect to mainstream users over a few years, though.

Wait, what? AMD does GCN in GPUs, not Intel.
 
Yes, I should have quoted the user. That post was worrying about AMD being left too far behind... In x86 they're pretty well out of the picture already; they're on a different game plan.

If they can't beat Intel at their own game, then change the game. That's what it appears they're trying to do.

On the plus side, Intel's current heat issue plus the increased IPC mean that they should stay roughly where they were with Sandy Bridge until they work out the kinks in their process. Good for competition in the short term.
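
The "roughly where Sandy Bridge was" point is just IPC-times-clock arithmetic. The clocks and percentage below are illustrative guesses, not benchmark results.

Code:
# Illustrative arithmetic only: throughput ~ IPC * clock, so a modest IPC gain
# can be cancelled out if heat limits the achievable overclock. Placeholders.
sb_clock_ghz = 4.8      # hypothetical Sandy Bridge overclock
ib_clock_ghz = 4.5      # hypothetical heat-limited Ivy Bridge overclock
ipc_gain = 1.05         # hypothetical ~5% IPC improvement

relative_perf = ipc_gain * ib_clock_ghz / sb_clock_ghz
print(f"IB vs SB at these clocks: {relative_perf:.2f}x")   # ~0.98x, roughly a wash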
 