
Intel Foundry update - April 2024


mackerel
Member, joined Mar 7, 2008
https://morethanmoore.substack.com/p/intel-foundry-realigning-the-money

Intel held an investor-focused webinar on 2 April, and there were some interesting bits of tech-related info in it.

18A will ramp in 2025, with significant revenue counted from 2026. This is similar to TSMC, where a node is introduced and it can be ballpark a year before it hits High Volume Manufacturing (HVM).

[Slide: Intel process nodes vs competition at time of offering]
This slide shows where Intel sees its process vs the competition at the time of offering, not a like-for-like node comparison. If you wonder where 4 and 20A are, they're not listed since they're primarily used internally; the nodes listed here are the ones offered for anyone to use.

7 is rated as behind because, if you look at AMD and Nvidia, they're already using N5-class nodes for consumer products. 3 closes the gap, but by the time Intel ships volume on it, TSMC will have been shipping N3 for some time. 18A is where they think they'll match or take the lead, and 14A is where they clearly pass.

Intel's next-gen desktop CPU core dies will be made on 20A, which can be seen as an early version of 18A, so that should at least be generally competitive against Zen 5, which is expected to be on a TSMC 3nm-class node.

It is interesting to note they see costs going down as we move to the newer EUV nodes. Don't expect that to mean cheaper products, though; it'll more likely be used to restore their profitability, which isn't so strong at the moment.

Intel thinks they're at a peak on external wafers (TSMC?) and expects to bring more back internally going forwards. Currently around 30% of wafers are external.

Of course, all this depends on Intel executing to their plan. They are getting access to ASML's newest toys first, so they could gain a lead over TSMC from that. TSMC's head has previously played down Intel's claimed future performance.
 
As I see it, this slide shows Intel's path within the HPC (high-performance computing) market.
I suspect the consumer market might just become low-powered/efficient terminals connected (via fibre optic cable) to large corporate servers providing high-powered compute in the cloud to us plebes.
Wouldn't surprise me, as advances in cloud computing speed have allowed for online gaming, gambling, rendering, etc.
We may experiment with AI at home on our workstations to learn about LLMs, but the real AI that will be available to us is done at a corporate/cloud level using massive GPU cluster servers.
Cloud computing: ya, that just may be where we are heading en masse.
 
I read it that way too, and you are so very right. The future looks bright.
 
The prediction everything will go to servers has been around for a long time, and I don't feel we're any closer than we were say 5 or 10 years ago. Network infrastructure still isn't there. Some services do make sense to go server, but there's a lot that still makes sense to do locally.
 
Yes, it has been predicted, and it's coming full steam for sure. The majority of users in any metropolitan area, and most medium and small communities, have fibre to the home. Powerful workstations and HEDT systems have become far more expensive than they were just a couple of years ago. By design, to force you to accept cloud-based computing. Did you ever think a render farm would be necessary for the average Blender user of today? Rendering my animation takes mere minutes on an online render farm when it may take hours on my HEDT system. GPU compute online today is exponentially faster and more powerful than just a year ago. We will be interfacing with GPT more and more in everything we do in the very near future, and that is why the big push in the last decade for fibre to the home. Distributed and cloud computing will be mainstream in the very near future. Thin client is imminent.
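For a rough sense of the arithmetic behind "minutes on a farm vs hours locally", here is a back-of-envelope sketch; the frame count, per-frame render time and node count are all assumed, illustrative figures rather than anything measured from the workflow above.

```python
# Back-of-envelope sketch: why a render farm can turn hours into minutes.
# Every figure below is an assumption for illustration, not benchmark data.

frames = 1800                 # assumed: 1 minute of animation at 30 fps
sec_per_frame_local = 12.0    # assumed: per-frame render time on one HEDT box
farm_nodes = 200              # assumed: nodes the farm throws at the job

local_hours = frames * sec_per_frame_local / 3600
farm_minutes = frames * sec_per_frame_local / farm_nodes / 60

print(f"Local render: ~{local_hours:.1f} h")
print(f"Render farm:  ~{farm_minutes:.1f} min (ignoring upload and queue time)")
# Local render: ~6.0 h
# Render farm:  ~1.8 min (ignoring upload and queue time)
```

A real job also pays for asset upload and queueing, so the gap narrows, but the node-count arithmetic is where the "minutes vs hours" difference comes from.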
 
We do not have fiber in our area yet.
If not already, a large portion of the western world has high-speed internet. A cloud computing business model doesn't need the minority without fibre in order to make a profitable business case.
5G, or whatever comes next, plus future compression algorithms, may provide relief/access to the outliers.
 
Yes, it has been predicted, and it's coming full steam for sure. The majority of users in any metropolitan area, and most medium and small communities, have fibre to the home. Powerful workstations and HEDT systems have become far more expensive than they were just a couple of years ago. By design, to force you to accept cloud-based computing. Did you ever think a render farm would be necessary for the average Blender user of today? Rendering my animation takes mere minutes on an online render farm when it may take hours on my HEDT system. GPU compute online today is exponentially faster and more powerful than just a year ago. We will be interfacing with GPT more and more in everything we do in the very near future, and that is why the big push in the last decade for fibre to the home. Distributed and cloud computing will be mainstream in the very near future. Thin client is imminent.
I feel we're looking at different segments of the market.

I'm mainly looking at it from a personal computing perspective. Even before HEDT got expensive, it was a tiny niche. Depending on how you define HEDT, the higher core count era we've been in since Ryzen has allowed consumer-tier systems to take over many (not all) HEDT use cases.

Outside of the gamer space, laptops are probably the most popular. Maybe you could argue some home users are effectively thin client already if you set a low enough bar. Some users could do what they need through a browser, but that won't scale up to everything.

In the near future, NPUs will give local responsiveness that cloud services can't. Cloud will for sure be used where it makes sense: services that can't scale down to a low-power laptop. We can have the best of both worlds. There will be a balance of data transfer/latency, power efficiency and performance in deciding where things get done.
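To put a rough number on that responsiveness point, here is a tiny latency-budget sketch; every millisecond figure is an assumption for illustration, not a measurement of any particular service or NPU.

```python
# Rough latency-budget sketch for "local NPU vs cloud" responsiveness.
# All millisecond figures are illustrative assumptions, not measurements.

network_rtt_ms = 40.0   # assumed: home fibre round trip to a nearby cloud region
cloud_infer_ms = 15.0   # assumed: server-side inference time on a big GPU
cloud_queue_ms = 10.0   # assumed: batching/queueing overhead at the service
local_npu_ms = 30.0     # assumed: the same small model running on a laptop NPU

cloud_total_ms = network_rtt_ms + cloud_infer_ms + cloud_queue_ms
print(f"Cloud path: ~{cloud_total_ms:.0f} ms per request (plus jitter under congestion)")
print(f"Local NPU:  ~{local_npu_ms:.0f} ms per request (no network in the loop)")
```

The cloud side can be faster per inference on big hardware, but it pays the network round trip on every request; the local NPU doesn't.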

I don't feel AI has been a driver of internet speeds at all in past years. Streaming video and gaming would probably be the biggest drivers in home use. GPT-like services aren't bandwidth sensitive, so I view them as "just another service" and/or as supplementing services that are already online.

I'm not familiar with Blender asset sizes, but I'd also put that outside typical home usage. If I pick video editing as another use case, I can't see that being practical in the cloud for anything other than the smallest/shortest trivial cases. The network will limit harder than a local compute resource would.
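As a quick worked example of why the network bites first, assume a 200 GB project and a couple of plausible uplink speeds; both the project size and the link speeds are assumptions for illustration, not figures from anyone's actual setup.

```python
# Back-of-envelope transfer-time sketch for pushing a video project to the cloud.
# Project size and uplink speeds are assumptions for illustration only.

project_gb = 200  # assumed: raw footage for a modest editing project

uplinks_mbps = {
    "20 Mb/s cable upload": 20,
    "1 Gb/s FTTH upload": 1000,
}

for name, mbps in uplinks_mbps.items():
    hours = project_gb * 8 * 1000 / mbps / 3600  # GB -> megabits, then seconds -> hours
    print(f"{name}: ~{hours:.1f} h just to upload {project_gb} GB")
# 20 Mb/s cable upload: ~22.2 h just to upload 200 GB
# 1 Gb/s FTTH upload: ~0.4 h just to upload 200 GB
```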

All considered, I don't see any significant shift from what we have today for at least the next 5 years, probably much longer.

In some business use cases, thin clients were already in use 10+ years ago. In my past work I've visited many customer sites (banks and telecoms companies) where the average user gets a thin client, with no data stored locally for security reasons. But that's a low-bandwidth, not latency-sensitive, use case.

Can you elaborate on what you’re referring to here?
If you want to do everything in the cloud, getting data to and from it quickly will be important. I don't view currently affordable consumer wired connections as sufficient for that, and mobile connections are much worse.
 
Latency will be solved. There's some truth to the network conditions you outline at present. But, as you acknowledged, technology doesn't stand still, and 5 years from now will come fast.
Can you believe COVID started 4 years ago, where’s the time go?
I'm of the opinion that Copilot and ChatGPT will advance to the point that humanity becomes dependent on them, just like how our smartphones have become an extension of ourselves.
 
I'm mildly AI-positive, and I don't get the extreme hate from some. But at the same time it is only one part of our digital lives. Some things are enhancements of what we had before; some things are new. Swinging it back to why I started this thread: AI is impacting semiconductors, if in no other way than being another use case for them. Intel's open capacity could help there. Personally I still don't see a rapid transition in how we do most things, but the future will happen regardless of what we think it could or should be.
 
If you want to do everything in the cloud, getting data to and from it quickly will be important. I don't view currently affordable consumer wired connections as sufficient for that, and mobile connections are much worse.
I feel we're looking at different segments of the market.

I think this sums it up. I can't see any consumer-based service requiring heavy lifting to be pushed outside of a residential setting. I don't do a lot with "AI", but looking at the things I do (video upscaling, image processing and the like), models are downloaded as and when required to the local machine, and in those instances they're not huge.

I can't see any path to AI being heavy enough to require residential network infrastructure to be beefed up from where it is now. Other things, yes. And that's not to say I don't think residential infra sucks... because it does... but the requirement for getting it into a better state isn't AI.

Game streaming, which is already a thing. Better quality video streaming. Those I'd say are more relevant.

And in my experience, cellular is a far better and faster option where wired infrastructure is lacking. Alternatively, one can only hope for an influx of altnets that give the AT&Ts, BTs, Virgin Medias and Comcasts a run for their money. I, for example, am now on an 8Gb symmetrical FTTH service and couldn't be happier.
 