Patrick Moorhead is a stock analyst for the chip industry. In Wall Street terms, he is aligned with Intel and the others who are shaking in their boots right now. He's not going to say that the M1 is great until Intel/NVidia/etc. allow him to.
These pundits know which side their bread is buttered on. Moorhead knows that his clients were caught off-guard by the M1 announcement. He needs to give his clients enough time to sell their Intel stock before it becomes conventional wisdom that the legacy chip makers have a business model that doesn't work.
The history of the tech industry is dotted with pundits nay-saying anything new from companies they aren't in bed with. Eventually they can't deny reality any longer. Through the magic of the media's short-term memory they change their song and deny they've ever said anything else.
I remember in the 1980s when the top pundit of the industry was John C. Dvorak. Everyone read his column. EVERYONE. The Amiga was a potential threat to John, who was aligned with the MS-DOS world. He wrote many columns about how multitasking was stupid because "your desk can't fit more than one keyboard and mouse". Yeah, that was his reason. Of course, once MS-Windows arrived, suddenly his column was about how multitasking is this new thing, the best thing, the thing everyone should have. If you don't have it, you're an ignorant loser. I remember telling my friends that I wished I had saved his old anti-Amiga/anti-multitasking columns, because I wanted to show up at one of his public appearances and ask how MS-Windows allows you to fit so many keyboards on his desk.
Was it really that shocking for Apple to announce they were shipping their own silicon? That struck me as a pretty open (non-)secret and I’m not even in that industry.
But how can that be a surprise to anyone who has been following the development of the iPhone/iPad CPUs? The M1 is exactly what you would expect it to be by extrapolating from the performance of recent iOS chips.
This is more about trying to soften the blow on Wall Street. I'm sure Intel had a good idea Apple's chip was going to blow the doors off their offerings.
But I don't really see how this will affect Intel financially in any meaningful way in the short term.
They won't provide chips for Apple anymore, but that must have been a relatively small number of units, so no great financial loss to them.
And Apple are never going to sell these chips generally, so there's no competition there.
In the long term though it may actually benefit Intel (and AMD/NVidia) who might work out exactly what Apple have done and possibly replicate it in their own CPUs.
That assumes Apple won't be growing its PC market share.
Given the M1's benefits, the ability of Big Sur to run iOS apps, and Apple's marketing prowess, I'd be surprised if they don't double their market share in the next 3 to 4 years.
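Just to put the "double in 3 to 4 years" claim in perspective, here's a back-of-the-envelope sketch (my own arithmetic, not from the comment) of the annual growth in market share that doubling over that window implies:

```python
# Doubling market share in N years requires share to grow by a factor
# of 2**(1/N) each year, relative to the overall market.
for years in (3, 4):
    rate = 2 ** (1 / years) - 1
    print(f"double in {years} years -> ~{rate:.0%}/yr relative growth")
```

Roughly 19-26% a year relative to the market, which is aggressive but not unheard of for a hit product cycle.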
Then there's the (highly lucrative) server market.
> Then there's the (highly lucrative) server market.
Apple has flirted with the server market on several occasions. I could see a well priced Apple server with an M series CPU doing quite well if Apple did it right. That said, Apple has never really shined here and it's not really their strong point.
I'm sure MacStadium would love if Apple launched a blade server platform where you could just slot Mac mini logic boards in.
I could see Apple potentially licensing the design (minus IP critical to their macOS workstation offering, like the GPU or neural engine) to a big cloud provider. If the cloud provider does the heavy lifting on the Linux port (something Apple isn’t publicly talking about), this could become a big deal overnight.
While Apple's volumes are low, their parts are mostly the higher end chips. I think just about every i9 laptop mention I've ever seen has been in a MBP 16". So while it's 10-15% of Intel's volumes, I've heard numbers as high as 40% of the profit.
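Taking the comment's numbers at face value (10-15% of units, ~40% of profit — unverified hearsay, as the comment says), here's what they would imply about per-unit profitability:

```python
# Hypothetical figures from the comment above, not verified:
# Apple as ~12% of Intel's unit volume but ~40% of the profit.
apple_units, apple_profit = 0.12, 0.40
other_units, other_profit = 1 - apple_units, 1 - apple_profit

# Ratio of per-unit profit on Apple-bound chips vs. everything else.
per_unit_ratio = (apple_profit / apple_units) / (other_profit / other_units)
print(f"Apple chips earn ~{per_unit_ratio:.1f}x the per-unit profit")  # ~4.9x
```

If those shares are anywhere near right, each Apple-bound chip was several times more profitable than the average chip sold to other OEMs, which is why losing the account hurts more than the unit count suggests.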
I have no idea how much of Intel's profit can be attributed to Apple, but Apple definitely buys more higher end laptop CPUs from Intel than anyone else. That's going to hurt their bottom line.
I also expect Apple will be picking up some market share. How much remains to be seen. If price/performance was your primary reason for avoiding Apple, it's gotten a lot harder to resist switching.
I doubt much of the market cares all that much about price/perf. But at $700, the M1 machines do seem like an impressive value, able to do a wide range of things not normally considered at this price point. Things like editing 4K video.
> the legacy chip makers have a business model that doesn't work
It seems like this implicitly assumes that everyone switching to Apple silicon is a possibility. But it doesn't seem likely that all the android users are going to switch to iphones or that all the windows/linux users are going to switch to macs.
This is hugely embarrassing for Intel and Qualcomm in particular, and stocks will fall. But the business model outside the walled garden isn't necessarily going to change.
> It seems like this implicitly assumes that everyone switching to Apple silicon is a possibility.
Apple has flipped the tables. They went from being way behind the pack in terms of price/performance in the 90s and 00s to the point where Apple's base MacBook Air is now competitive in terms of performance with laptops that cost twice as much.
I suspect they will absolutely pick up share. I also expect AMD will gain some share here as OEMs look to compete with Apple's offerings and Intel has come up short.
Also, by losing Apple, Intel is losing one of its most profitable customers. While Apple isn't Intel's biggest customer in terms of units shipped, Apple tended to buy Intel's higher end CPUs, which other OEMs avoided.
Everyone isn't switching to Apple, but the competitive scene is changing a lot and it doesn't favor Intel.
I'm not sure where "Over Here" might be, but I know in some places Macs are ridiculously expensive due to tariffs and whatnot. It's hard to account for regional differences.
Though I expect they will pick up share everywhere, just might go from 0.5% to 0.9%. I suspect everyone will benefit from the secondary effects.
It proves the viability of ARM for desktops, laptops, and probably eventually servers too. Having been humiliated by this, Intel will have a looser grip on OEMs, who will now be under pressure in the inevitable rush to produce appealing, performance-oriented ARM laptops. The current market has basically been Chromebooks and some half-hearted Windows ARM tablets, all based on chips adapted from phones.
And of course there will also be a rush on the fab side to produce a non-Apple chip that's actually suitable for these machines, which will in turn have performance characteristics far more suited to server use than any previous ARM chip, enabling that market and feeding back demand for larger and more performant server ARM chips and mainboards. And who knows, maybe even some kind of socket standard so they aren't soldered down like they are today.
I think there’s a qualitative difference here between viable (which is where we’re currently at) and utterly dominant (which is where I suspect we’ll be in five or so years).
I agree, but even still how does that affect Intel’s business model? Are they incapable of building ARM chips and selling them the same way as their x86 chips? Seems like the business model is fine but the product will need to change, no?
But for the last twenty years, surely pretty much every Linux-capable ARM part produced in quantities over a million units has fallen into one of two buckets: either a) a device that runs on batteries, or b) an embedded, specialized processor, such as for a router, vehicle infotainment, or a home appliance.
Not sure. There are clearly some problems with the x86-64 ISA. If the world switches to ARM in the coming decade, that leaves Intel in a perilous position: no advantage in IP and no advantage in fabrication.
Right, the problem for Intel & AMD is not that everybody will switch over to Apple chips (they can't, because Apple likely won't sell it to them), but rather that Apple has proven beyond any reasonable doubt that ARM chips _can_ be this good when scaled up to laptop-level power envelopes.
That will probably prompt other ARM vendors and certain PC manufacturers (Microsoft who already has the ARM based SQ chips, for instance) to further invest in similar products that chip away at Intel/AMD's lead and likely eventually overtake them.
The world may want to switch to ARM but right now there aren't really any options to switch _to_ right? Beyond Apple I can't think of any chip manufacturers offering desktop ARM cpus...
I think it has less to do with everyone switching to Apple silicon and more to do with the clarification of viable business models: you either become an IP company and outsource production (AMD) or you go completely vertical and design and produce custom-designed SoCs to be used exclusively in your consumer products (Apple). The Intel model of designing and producing all-purpose commodity CPUs seems to have hit a wall due to both lack of demand and unanticipated manufacturing problems.
The (apparent) lack of demand for Intel chips is because their manufacturing problems have disrupted their roadmaps. Because their 10 nm process is still not really working, and they were working on multiple releases assuming the process would be there when the design was, they're in a bad place.
From public statements, it seems like they've thought 10 nm was going to work soon for the whole time since 2018 that it's been in limited production. And they haven't really done substantial design work on the 14 nm processors (there have been some releases, but it's mostly cramming more cores on there, with few architectural changes; AFAIK, Coffee Lake and Comet Lake don't have much in the way of IPC gains vs Kaby Lake). Also, they clearly need a new naming person.
It doesn't help that they killed their Atom-for-phones program when they did. Right around the time Microsoft announced Continuum for Windows Mobile 10, which would have been much cooler if it was running on an x86 chip (and if Microsoft hadn't done a terrible job with Windows Mobile 10 in general).
There was an Asus Zenfone that shipped with an Atom, and even in Java apps or running the few Android NDK apps that shipped x86 APKs, it wasn't fast.
Fun fact: Android used libhoudini (basically a reverse Rosetta 2) to let ARM NDK apps run on x86 (but they were pretty rough).
No, it’s not true. Most people buy their computers because of the price, style, and easily observable features (not necessarily performance, certainly not the type of difference that only shows up in benchmarks). Hacker News commenters are far from your average PC buyer. I’m sure the M1 and future desktop Apple silicon will have an impact, but it’s far from a killing stroke. If people only cared about maximum performance, budget machines wouldn’t sell at all. In reality, they sell in much higher volumes than high end devices.
And not all HN readers are won over either. I for one am not willing to switch to Apple's closed ecosystem for a chip that is a little faster. Or even a lot faster.
Up until two days ago, I was using a 6 year old laptop with a Celeron processor as my main home computer.
I think the whole Apple Silicon thing seems really interesting, but honestly, the only thing I do that is remotely taxing on decade-old hardware is load Facebook.
There are plenty of other laptops out there that don't get hot or loud and have all day battery life, though the M1 has multi-day battery life so fair point there.
You know, I would have sworn that Dvorak used to write a Mac column for an American MacUser or Macworld when I was a student in the 1990s, waxing lyrical about the Mac.
He was sometimes in the back page of one of those (I think MacUser) but just as ultimately wrong about everything as ever — for example, online shopping.
He did write for a while in one of those publications, but he hardly waxed lyrical. It was mostly vaguely negative “insights” about Apples business strategy and how awesome they could be if they just listened to him.
Is Intel really so blind that they didn't see this coming or even have detailed inside info about it?
I would wager that they knew, but execs chose to protect their personal short term interests over those of the company and its long term shareholders. This is typical modern US publicly traded company behavior.
There have been comments here and elsewhere for a while saying that Intel's remuneration policy (i.e. salaries and benefits) has been uncompetitive, and between that and the culture, some of the best people have left (a number of them to Apple), so the answer's pretty obvious.
Throw money at it - find the best and brightest people and stuff their mouths with gold to have them solve whatever problems Intel's still having with 10nm and beyond as quickly as possible and with honest timelines to when that'll happen (not "oh it'll totally work this quarter").
Intel blowing the lead they had on semiconductor process technology is an existential threat, especially if it's blocking them getting out improvements in IPC (i.e. see what AMD are doing) also, so it's impossible to overstate how important this should be to Intel's senior leadership (by which I mean if it's not solved real soon, the shareholders ought to be removing them from post).
But unfortunately that's not how publicly traded companies operate. The highest paid execs just hang around for a few years, and almost regardless of how the company performs, they exit with a nice package and move into another company with a similar or better agreement.
Maybe you should read Andrew Grove's book. Intel used to be this way and I suspect they just intentionally made the easy choice, not the socially/business good choice.
This is nearly right - I don't think he's a stock analyst but he clearly gets some work from Intel, Dell etc - so hardly an independent reviewer (not sure Gruber would claim to be either fwiw) although he's also written this on Arm over the last few days.
In any event he could have led with "Why you might want to wait" rather than "Why you might want to pass" given that most of the criticisms are likely to be fixed soon.
Is that the same John C Dvorak who is the older and crotchetier of the "two crotchety old men" podcast who spend half of each show hyping their "Amway meets PBS meets LARPing" monetization scheme? That podcast would actually be listenable if they just ran some ads... Still, he has landed on his feet, even if he no longer claims any particular tech expertise.