My question is: why is it that Apple can achieve such a massive jump in the heat/performance ratio, and why couldn't Intel? Is it just that Intel cannot embrace ARM, or is it the backwards compatibility forced on Intel that makes their chips so inefficient in comparison?
Building a CPU seems like a very non-trivial task, and the current situation seems similar to Intel making its own laptops.
So Apple have been designing their own CPUs for a decade now. Over that time, they've been massively focused on performance through efficiency - because they're concerned about battery life, power dissipation etc. Whereas Intel has been making bank by designing more and more powerful processors for the data centre.
Of course Apple don't fabricate their own chips - they go to fabs like TSMC to actually manufacture the chips. Whereas Intel does that themselves. It's worth noting that TSMC is more of a direct competitor to Intel (they fab AMD chips too) and they've been executing their road map better than Intel have for a few years now.
CISC offered an interesting advantage over RISC: if you have instructions for everything, then given a long enough time frame you could optimize the heck out of each and every instruction, so the advantages of RISC would be minimal.
However, x86 is pure ugly hackery: ridiculously variable instruction lengths, a paucity of registers, and enhancements made more in the interest of marketing than of actual progress (see Linus' recent comments about AVX-512).
x86 can still be optimized, but over a long enough time you end up with what we have now: a lot of extra silicon handling a lot of edge cases, trying to eke tiny performance gains out of an ugly, horrible ISA. Even the best of what 2020 has to offer can't compete with a clean architecture like ARM that doesn't carry all that legacy cruft.
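To make the decoder point concrete, here's a toy Python sketch (not a real disassembler, and `length_of` is a hypothetical stand-in for a full prefix/opcode parser): with a fixed 4-byte encoding like AArch64 every instruction boundary is known up front, so a wide front end can decode many instructions in parallel, whereas with x86-style 1-15 byte instructions you can't find where instruction N+1 starts until you've parsed instruction N.

```python
# Toy sketch of the decode problem, not a real disassembler.

# AArch64-style fixed-width code: boundaries are known up front,
# so many instructions can be handed to parallel decoders at once.
def split_fixed(code: bytes, width: int = 4) -> list[bytes]:
    return [code[i:i + width] for i in range(0, len(code), width)]

# x86-style variable-width code (1-15 bytes per instruction): the length
# of instruction N is only known after parsing its prefixes and opcode,
# so locating instruction N+1 is inherently serial.  `length_of` is a
# hypothetical callback standing in for that parsing step.
def split_variable(code: bytes, length_of) -> list[bytes]:
    out, i = [], 0
    while i < len(code):
        n = length_of(code, i)    # must decode prefixes/opcode first
        out.append(code[i:i + n])
        i += n
    return out
```

The serial dependency in the second case is a big part of why very wide decode is cheaper to build on a fixed-length ISA.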
The fact that AMD has outperformed Intel is also telling - AMD went back to the drawing board, probably not just proverbially, and reinvented ways to get that horrible instruction set to perform better, which Intel probably wishes they'd had the foresight to start doing ten years ago.
It's also telling that Intel's security has been as bad as it has - it shows that Intel has cared about performance to the precise detriment of security. It doesn't matter how many billions of dollars of legacy you have pushing the architecture if you're constantly operating at a pronounced disadvantage.
Even now, Intel is pretending that their big/little x86 cores are something new, when ARM has been doing big.LITTLE for almost a decade.
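For anyone unfamiliar with it, big.LITTLE just means the SoC mixes high-performance and high-efficiency cores and the OS migrates work between them. As a rough illustration (a minimal sketch, assuming an ARM Linux machine whose kernel exposes the optional per-CPU capacity files in sysfs; many arm64 platforms do, others won't), you can often see the split directly:

```python
# Minimal sketch: list big vs LITTLE cores on an ARM Linux machine.
# Assumes the kernel exposes /sys/devices/system/cpu/cpu*/cpu_capacity,
# which many (not all) arm64 platforms do.
from pathlib import Path

caps = {}
for f in Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpu_capacity"):
    caps[f.parent.name] = int(f.read_text())

if caps:
    biggest = max(caps.values())
    for cpu, cap in sorted(caps.items(), key=lambda kv: int(kv[0][3:])):
        kind = "big" if cap == biggest else "LITTLE"
        print(f"{cpu}: capacity={cap} ({kind})")
else:
    print("No cpu_capacity files found on this system.")
```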
By using ARM, you're tapping into a huge industry of people who know how to make chips. Plus, as others said, TSMC is ahead of Intel and will actually be making the chips.
I have been thinking of starting a blog on the topic, considering how often this question keeps popping up. From a very high-level overview:
1. Apple is partnering with TSMC. TSMC is now the leader in leading-edge semiconductor manufacturing, a title that used to belong to Intel.
2. TSMC is now a generation ahead of Intel, meaning the thermal efficiency you see comes from using a better node. It has nothing much to do with ARM or x86.
3. Both ARM and TSMC have dramatically changed the industry: you can now buy designs/blueprints from ARM (or any other IP vendor, such as Img PowerVR) and fab them (i.e. have them manufactured) at a foundry.
4. TSMC is a pure-play foundry, meaning it does not produce its own chips to sell on the market in competition with its customers. Non-pure-play examples would be Samsung or Intel: Samsung produces its own mobile SoC, Exynos, and Intel its own x86 chips. If you were Qualcomm producing your chips in Samsung's fab, you would be directly competing with them.
5. Apple now has the volume, or economy of scale, to produce CPUs themselves. Apple makes more silicon in unit volume per year than Intel.
So the inevitable question:
Why doesn't Intel go to TSMC, then?
First, Intel earns better margins by producing chips itself. Second, moving these designs takes years, especially for Intel, which has its own design tools in house. Intel cares about margins, and you can tell from their investor meetings.
He was the one who designed the A4/A5 - the very first Apple silicon - which I distinctly remember having performance on par with other offerings at the time.
Just take an example: if Intel made a chip with the same performance at the same price as Apple's, which one would you prefer?
If AWS or Microsoft decide they need to have an in-house fab capability, I wonder if Intel will become a takeover target?
They can out-CPU Intel's 10nm.
25-30% is not what I'd call a "massive jump".
They have deliberately slowed down their tick/tock pace of development because they know they are reaching the end of the road, and will have nothing to improve upon as the physical limits of silicon as a material are approached. Other materials are possible, but very costly.
Apple wants to bring their huge iOS software library to their laptops. OSX has been declining for years as a software market, and fewer products are being made for it. In contrast, Apple has millions of excellent titles on iOS, but they all use the ARM instruction set. So this allows them to unify software availability.
The laptop computers Apple sells are a small part of their business. They know that in the future computers will be a commodity and won't be profitable, but software will be, and they want to capitalize on their iTunes App Store, which is the largest employer of independent software developers in the entire world. It is a bigger publishing entity than all of the US book publishers put together, and with a 70% royalty rate paid to authors it is also the highest-paying publisher in history; music and book publishing commonly pay around 5% royalties. Steve Jobs really disrupted that industry with a new economic model! Hooray for Jobs! We sure miss him. I bet he would be opening his own bank and credit card by now, as that industry is ripe for revolution; the credit card companies are still ripping off businesses by taking 3% on a transaction, when 0.3% would be more reasonable given modern technology.
The ARM chips, just like Intel chips, will be thermally limited in all these laptops. It's very hard to extract the heat without going to water cooling, which nobody has the nerve to do.