The M-series MacBooks are amazing machines. I don't want to take anything away from them, but I recently realized I can get the same power as a $3k MacBook Pro for about $1k by building my own PC (excluding the screen), with full upgradeability on top, making for much better bang for the buck and illustrating how ridiculous MacBook prices are. I mean, they charge $400 for an extra terabyte of storage, when I can get around 6TB for the same price.
But unfortunately you can't build your own notebook that stays compact and decent, so now I just use a cheap laptop as a thin client that connects remotely to my PC.
What you pay for is the form factor, the silence, the nice cool temperatures, the battery life, and the ability to actually use the damn thing without being plugged in. An added bonus as a dev and a creator is the unified memory, hardware decoding for modern codecs like H.265 (something that is weirdly missing from Nvidia GPUs), and of course the beautiful screen. If this stuff doesn't matter to you, then I agree it's a waste of money.
I recently bought my first MacBook ever (M3 Max) and all I can say is that this machine is liberating. I've been jumping from PC to PC every couple of years, from a Dell XPS to a Razer Blade thermonuclear piece of shit to a customized Clevo 10kg brick that nearly took off from the table running too many tabs in Chrome. All I can say is that these new M-chip machines are on another level, truly amazing.
But at least it runs fast enough, and it's portable too.
Running the same LLM on other laptops is basically impossible without an external graphics card, or you have to give up the laptop for a desktop. And even that sounds like an airplane taking off.
But that's an absurd argument considering the price. Yes, it can do that, but you can also do that much better with a cheaper combo that gets you more actual hardware...
Which is exactly what the OP is saying. And I agree, laptops are generally poor value no matter the brand or side. But you can mitigate that by buying a good-enough, not-too-expensive notebook and pairing it with a "real" desktop computer.
At current Apple prices, the laptop convenience factor for an almost-equivalent workstation is not really worth it for the vast majority of people (at least those who have to pay with their own money...).
Fair point, my Zephyrus G14 4090 64+16GB has the same issues: it can run smaller LLaMA 2 models much faster than a 128GB M3 Max (3x), but the 70B one is much slower (10x) because the CPU has to handle half the layers. So in the end I ended up with both.
My Clevo’s cooling fan is also super loud, and it ramps up far too frequently, whether from many open tabs, playing video, or lightweight games. Unfortunately I forgot to look into the noise levels when buying this most recent laptop.
Oddly, I haven't yet needed to work offline in my 12-year career as a software dev. But then again, it's not common for the internet to go out where I live.
Indeed, the only time I'd use my laptop is when I'm offline or out and about. There's no need to use a laptop if I'm at home, since, like you, I have a desktop.
It just makes sense. You can build a desktop plus a cheap laptop for half the price. I don't have to worry about how what I'm doing affects battery life, because the laptop is just a thin client. It's cheap to replace if it gets lost/stolen/broken. It's cheaper to upgrade as needed, and I don't have to replace the whole machine when I do.
I get all the portability of a laptop and all the power of a desktop, and money to spare.
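For anyone curious what the thin-client setup looks like in practice, here's a minimal sketch of the "wake the desktop before connecting" step in Python; the MAC and broadcast addresses are placeholders, and you'd pair this with whatever SSH or remote-desktop tool you prefer:

    # wake_desktop.py -- minimal Wake-on-LAN sender for the thin-client
    # workflow described above. MAC and broadcast address are placeholders.
    import socket

    def send_magic_packet(mac, broadcast="255.255.255.255", port=9):
        # Magic packet: 6 bytes of 0xFF followed by the MAC repeated 16 times.
        mac_bytes = bytes.fromhex(mac.replace(":", ""))
        packet = b"\xff" * 6 + mac_bytes * 16
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(packet, (broadcast, port))

    if __name__ == "__main__":
        send_magic_packet("aa:bb:cc:dd:ee:ff")  # hypothetical desktop MAC

After that it's just SSH or a remote-desktop client; the laptop never does the heavy lifting.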
For people wanting to do LLM work at home, a fully loaded MacBook has more GPU-accessible memory than anything else a consumer can buy (compared to Nvidia GPUs), all in a laptop form factor.
A small market, sure, but one they have locked down.
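To put rough numbers on why that memory matters, here's a back-of-the-envelope estimate of LLM weight sizes. The bytes-per-parameter figures are rules of thumb, not exact, and real usage adds KV cache, activations, and OS overhead:

    # Rough weight-memory estimates; treat these as lower bounds.
    def weight_gb(params_billion, bits_per_weight):
        return params_billion * 1e9 * bits_per_weight / 8 / 1e9

    for b in (7, 13, 70):
        print(f"{b}B: ~{weight_gb(b, 16):.0f} GB fp16, "
              f"~{weight_gb(b, 4.5):.0f} GB at ~4.5-bit quantization")
    # 70B: ~140 GB fp16, ~39 GB quantized -- far beyond any 24 GB consumer
    # GPU, but comfortable on a 128 GB unified-memory Mac.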
Not many people use compute 24/7, so if your use cases can be ephemeral, then spot instances or savings plans can be a real saving, and using the cloud allows for opex rather than capex, which matters for accounting controls and taxes.
It will be interesting to see how much further it will be possible for Apple to push their platform.
I think the Jobs/Ive legacy of prioritizing thermals over raw compute has given them a massive head start in the LLM age, for both notebooks (obviously) and desktops (less obviously).
Given the size of a normal 4090 heat sink, the ATX platform is showing its age. The biggest element of any system is a series of huge heat sinks. The graphics card ends up massive, with support struts propping it up. Giant power supplies. Bespoke connectors. Paying for RAM twice.
Apple have a form factor people are comfortable with and haven’t set the bar too high with raw performance. They aren’t hindered by existing designs. The thermals are good.
The integrated GPUs Intel and AMD are shipping don't really have enough performance to be worth the trouble, and the configs that support 100+ GB of RAM and have integrated GPUs don't have much memory bandwidth (running at speeds like 6400MT/s is a lot harder with multiple ranks of memory).
Intel's Arc A770M is a discrete GPU with only 16GB of DRAM. It is not to be confused with the "UHD Graphics 770" found in their top desktop processors. The discrete graphics is on the order of 16x larger and faster than the desktop integrated graphics, and the latter is the only one that can plausibly be said to have access to 100+ GB of RAM.
TFLOPS isn't a reliable way to measure GPU performance anymore. For example, Nvidia's TFLOPS numbers for their new GPUs are 4-5x those of the 2080 Ti, but in actual performance it's only about 2x.
Normal DRAM (i.e. not GDDR) is far too slow to facilitate this, you're basically limited by the CPU's cache if you want to reach a level of performance even remotely comparable to a GPU that has GDDR or HBM.
GDDR isn't as special as you think, and it's special in exactly the opposite way that HBM is special. HBM achieves high bandwidth by using extremely large bus widths running at fairly slow speeds per pin. GDDR achieves high bandwidth by running at very high speeds per pin and using medium to large bus widths.
Desktop CPUs using DDR5 have low memory bandwidth because they have narrow bus widths (128-bit total) at low speeds. Server CPUs use the same memory at the same per-pin speeds but with much larger bus widths, achieving total bandwidth comparable to a mid-range GPU while supporting orders of magnitude more memory capacity. Apple's high-end chips likewise use significantly larger bus widths than desktop CPUs, combined with slightly faster per-pin speeds (LPDDR rather than DDR), to also reach GPU-like total bandwidth.
I don't think it's accurate to single out any one of the above technologies as being the "normal" DRAM. GDDR, LPDDR, and DDR are all fairly different from each other and all quite mainstream.
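To put rough numbers on the bus-width arithmetic above (these configurations are illustrative, typical-spec examples, not exact product figures):

    # Peak bandwidth = bus width (bits) / 8 * transfer rate (GT/s).
    def peak_gb_s(bus_bits, mt_per_s):
        return bus_bits / 8 * mt_per_s / 1000

    configs = {
        "desktop DDR5-5600, 128-bit":        (128, 5600),   # ~90 GB/s
        "server DDR5-4800, 12ch (768-bit)":  (768, 4800),   # ~461 GB/s
        "M3 Max LPDDR5-6400, 512-bit":       (512, 6400),   # ~410 GB/s
        "RTX 4090 GDDR6X ~21 GT/s, 384-bit": (384, 21000),  # ~1008 GB/s
    }
    for name, (bus, mts) in configs.items():
        print(f"{name}: ~{peak_gb_s(bus, mts):.0f} GB/s")

Wide-and-slow (server DDR5, HBM) and narrow-and-fast (GDDR) land in the same ballpark, which is exactly the point.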
GDDR is "special" because it's far closer to the GPU die. System memory isn't that close to the CPU, so any integrated GPU won't be able to make as good use of it as a dedicated GPU with co-located VRAM.
> GDDR is "special" because it's far closer to the GPU die.
LPDDR is always at least as close to the processor, and is often in-package so significantly closer. HBM is always in-package.
But all of that is a red herring, because the distance is only loosely correlated with performance and is a completely irrelevant metric if you've already accounted for bus width and clock rate.
It would be pretty weird to ignore the existence of smartphones when discussing LPDDR, since they probably account for most of the volume of LPDDR sold. Lots of smartphone SoCs, all of Apple's laptop SoCs, and that one Intel custom job for Asus last year have LPDDR in-package, while the rest of the x86 systems with LPDDR have it on the motherboard at similar distances from the processor as GDDR.
Doesn't fit in a backpack. Can't take it to a coffee shop. Heck it involves building an entire custom PC around it.
Sure, the M3 Max with 128GB of RAM costs $7k, which is an absurd number, but it will run high-quality LLMs that can hold sci-fi-level intelligent conversations.
Not a bad price for a laptop capable of doing things that were literally "haha, that'll never happen" levels of unimaginable five years ago.
Agreed. Estimates put the H100 BOM at around $3k. If they do offer that much memory, they're going to implement some way to make it unattractive, to hold onto that sweet 800% margin [1].
I’d want a better source than someone on Twitter summarizing an unspecified report by a financial analyst before repeating that margin claim. That sounds like the people who claim an iPhone has 400% markup because they added up the bulk prices for the major components and then assumed R&D, assembly, testing, distribution, software development, and support all cost $0.
> and then assumed R&D, assembly, testing, distribution, software development, and support all cost $0.
That's literally what BOM means: bill of materials. It's not supposed to include any of those other costs. BOM is not always the right kind of cost to be talking about, but in a case like this where you're talking about just swapping out the memory on an existing product, BOM accounts for all the costs that change significantly.
Yes, now can I direct your attention to the term “margin” used in the comment I was replying to? It is common for people who are looking for the sweet upvotes from a “this company is ripping everyone off” spin to present this as if “retail sales price - BOM = UNJUST PROFIT!!!!!!”. The source is a blog post based on a tweet by one person who does not work in the industry who is repeating another unverifiable claim by someone who does not work in the industry. The term “BOM” is not mentioned anywhere in that, and none of this has any details which could be reviewed for accuracy.
The blog post really highlights how effective those headlines are at getting clicks but not informing readers: it leads with "nearly 1,000%" because that sounds even more rage-bait-y than the already high "823%", and that's clearly the impression people take away from the story, even though starting with the second paragraph it makes clear that this is a very fuzzy claim which likely significantly understates the true cost for Nvidia to make the hardware as sold, never mind the other factors I mentioned.
Finally, “swapping out the memory on an existing product” can mean “we replaced part A with larger part B” or it can mean “we upgraded a controller, added additional channels, and redesigned the physical form factor to provide better mechanical and thermal capacity for the extra chips”. That's why those extra details are so important since it's otherwise easy for someone to make a tweet which sounds like they know what they're talking about but is in fact leaving out a lot of important context.
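For what it's worth, the margin/markup distinction is easy to make concrete. Using the thread's rough numbers (the ~$3k BOM estimate and an assumed street price that yields the quoted 823%; both figures are unverified):

    bom = 3_000       # the unverified BOM estimate from the thread
    price = 27_700    # assumed street price implied by the 823% figure

    markup = (price - bom) / bom    # profit relative to cost
    margin = (price - bom) / price  # profit relative to selling price

    print(f"markup: {markup:.0%}")  # ~823% -- the headline number
    print(f"margin: {margin:.0%}")  # ~89% -- what "margin" usually means
    # Neither figure accounts for R&D, software, test, or support costs.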
Definitely a nice GPU, but considering airflow, power use, heat generated and the like it's more like a GPU with a PC attached, not so much a PC with a GPU attached.
This gets brought up every time someone talks about macOS and MacBooks.
Another thing people don’t take into account is the time and thus money saved not dealing with this crap.
Building PCs from raw parts is annoying for heaps of people. Warranties are annoying, making sure all the parts fit together is annoying. You’ll spend the extra cost in short order just making sure you have the right parts.
Also, the resale value of MacBooks is very, very good. When it's time to upgrade, you typically lose only 20-30% of the purchase price at around the 2-3 year mark, whereas PCs can be hard to give away.
Everyone has their own needs, but Apple's machines for me are a work tool. I need a laptop, and nobody comes even close to the quality or value per dollar spent in this form factor.
I'm mostly interested in upgradeability and the cost of parts. With Macs I'd have to buy a totally new Mac every time I upgrade, but with a PC I can swap out RAM, SSD, GPU, etc. however I want, for a fraction of what it costs to upgrade a Mac.
The thing is, though, you can actually resell your old Mac when buying a new one, unlike a PC, which is basically worthless the moment you take it out of the store. The upgrading thing is kind of a myth, since new hardware will often be incompatible with the old hardware you have...
It's most definitely not a myth. I upgrade a component or two every year around Black Friday. This same machine has been evolving for a decade and many spare parts have been handed down to family members.
Some stuff swaps out a little at a time, but something like the massive AM5 swap has you changing three of the four most expensive parts of your machine (CPU, RAM, and motherboard) at the same time.
Yes I just made that swap. But it's been almost a decade since the last time I had to swap all 3. I did however get to keep my graphics card, case, power supply, cooling system (liquid), and SSDs (3 x 4TB).
People don't trust potentially overclocked parts which in turn means they aren't worth very much on the used market. This is one area where an efuse indicating if a chip were overclocked could be useful and not anti-consumer.
Then don't overclock or buy overclockable parts; you're the only one talking about overclocking in this thread. I've found great success buying and selling on online forums like r/hardwareswap.
I think you're misunderstanding why Mac products are more resellable than Windows ones. It has nothing to do with being "abused" or overclocked (that's such a niche thing, even among gamers, that it bears almost no relation to resale value). It's because macOS itself has value to consumers, especially since only Apple hardware can run macOS (legally speaking, at least, and increasingly in practice, since Hackintoshes can't run on ARM hardware, so they will eventually be phased out as Apple removes features from its Intel builds over time, as it has already started doing).
It’s a fraction of the price if you don’t need to track the latest specs. Most of the people I know who do this end up needing to upgrade motherboards, RAM, etc. at the same time which cuts into those savings somewhat notably. There’s nothing wrong with it if you enjoy the hobby but most people don’t really save money that way because chasing top performance isn’t a poor man’s hobby.
While true, desktops are often difficult to carry around and use while mobile. They also don't have the benefits of macOS. I can get a Linux desktop to be decent, but a Mac feels and looks like a finished Linux desktop OS.
Apple charges big prices for MacBook Pros because they know other corporations will foot the bill; they're work equipment, like a tractor. The significantly less expensive MacBook Air is the choice if that isn't the case.
I still love building PCs, but honestly their main use now is either server work or graphically heavy gaming.
Anyway, mostly agree otherwise. Actually, macOS looks hideous and confusing to me, but the level of integration with the hardware is something I envy from Linux-laptop-land (imo it works fine on the desktop, but Linux on laptops could be better), the build quality seems quite nice, and the ecosystem seems neat.
"Always use the correct Apple product names with the correct capitalization as shown on the Apple Trademark List. Always use Apple product names in singular form. Modifiers such as model, device, or collection can be plural or possessive. Never typeset Apple product names using all uppercase letters."
("MacBook Pro" is one of the trademarks in the trademark l... oops, my apologies, Trademark List.)
So I suppose the correct plural is probably "MacBook Pro devices", but Apple's branding department has no power over us as individuals posting here so we can say what we like.
macOS is quite literally Unix (it's UNIX-certified), and it's extremely polished compared to Linux desktop environments out of the box. (Yes, I know some people have no issues with their Linux desktops, but I've used Linux for over 15 years and it's NOT nearly as seamless as macOS.)
They're also extremely well sandboxed and have become increasingly more focused on security the last few years (in some ways annoyingly so).
> Also what do changes to chrome have to do with apple/osx?
Maybe the parent meant chrome as in "macOS is focused on continuously updating its shiny UI"?
I've been using an M2 MacBook and an M1 Mini for a while now, and I can't confirm those statements about polish.
- The dock's icon-scaling animation under the mouse cursor freezes in place at least once every few days.
- To this day, Stage Manager has a bug where pulling a minimized window to the front can throw it across the screen, with most of the window landing outside the visible screen area.
- For a few months I could reliably freeze the screenshot app by clicking 2-3 of the UI's buttons in an order that occurred during normal use.
- If I open more than ~10-15 images at once in the preinstalled Preview app, it often opens two or three windows instead of one and randomly distributes the pictures across those. More than ~50 image files can cause an error message and a partial failure to open the files.
- In the first month after getting my MacBook, the "Open Anyway" button in the settings always crashed the Settings app the first time after trying to start a new program.
- Turning off the second connected screen left windows on that screen inaccessible; I couldn't disable/remove the screen in the settings, yet I could still move the cursor to the second screen while it was turned off.
- Clicking an 8GB video file in the file-selection dialog spawned a ThumbnailExtension process that allocated 25GB of memory within seconds; if not killed, the process kept running for a minute after the file chooser was already closed... on a system with 8GB RAM, so it swapped 20GB.
- The file-saving dialog is frozen as long as any connected HDD hasn't yet spun up, no matter whether I want to save the file on that drive or not.
Those are the things I can remember right now from a bit over one year of use. No, I don't think macOS's desktop is more polished than the best current Linux distros, though the OS has stayed stable through upgrades, which I can't say about Windows 10 and some Linux systems. Imho the M1-M3 MacBooks are still the best laptops currently available. But the desktop UI is clearly not as stable as iOS has been for me.
Well, for one, the keybindings have been updated since the IBM PC. Doing that on Linux isn't impossible, but it would require an enormous effort across many codebases.
I have tried for many years to match the productivity of macOS on Linux, and I can't. The Mac (and I suspect the iPhone) is just too predictable and reliable to abandon.
Number of times my MBPs have crashed on wake from sleep: 0
The number of times my last PC crashed on wake from sleep: nearly every time it went to sleep.
I ain’t got time to troubleshoot stuff like this. I just need my stuff to work. I also need something portable. I’ve also been using modern macOS for its entire existence and find it vastly superior to Windows (which I have been using since 3.1), and Linux (again, I ain’t got time to troubleshoot hardware).
You can make your pc sleep? Bro I haven’t gotten my windows 10 machine to successfully sleep in years. It just wakes itself up (or maybe never goes to sleep) and becomes a toaster oven in my bag.
and the battery, and the better speakers, and the quiet cooling, and USB-C/Thunderbolt 4 that just works (compared to most PC implementations) and of course, macOS
Sure, sure. Some pathetic 1080p monitor, or a monitor with ridiculous pixel density and an awful TN panel. Some oven-like CPU and a "turn off your heater" GPU. And, because you are talking about laptops (surely you are not comparing laptops to desktop computers), it will be a 3-5kg monster. But cheaper, hurray!
Yeah, if you ignore many of the benefits the MacBook provides, it is "overpriced" compared to a home-built PC.
On the other hand, if I consider the benefits of the MacBook that you excluded, your $1000 PC is literally worth $0 to me. And by that measure, the MacBook is worth infinitely more than your PC.
16GB isn’t enough for Windows. macOS is surprisingly more efficient - it’s not just swapping to their faster storage, either: lots of attention to detail everywhere means that you see less memory pressure across the board. Even using VSC, Teams, and Podman heavily, 16GB has plenty of headroom.
The reason you can’t update the RAM on Apple silicon is the engineering tradeoff they made to have it be faster. That required work on the software side – e.g. they added a CPU extension to compress memory pages and updated the OS to use it effectively – but they have the advantage of not needing to coordinate multiple internal feuding factions and outside vendors the way that Microsoft has to.
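As a toy illustration of the page-compression idea (macOS does this in the kernel with hardware acceleration; zlib in userspace Python is just a stand-in for the concept):

    import os, zlib

    PAGE = 16 * 1024  # Apple silicon uses 16 KB pages

    pages = {
        "mostly-zero page": bytes(PAGE),  # common in practice, compresses well
        "random page": os.urandom(PAGE),  # worst case, incompressible
    }
    for name, page in pages.items():
        c = zlib.compress(page)
        print(f"{name}: {PAGE} -> {len(c)} bytes ({len(c) / PAGE:.1%})")
    # Compressing cold pages instead of swapping them to disk is part of why
    # a 16 GB Mac can feel like it has more headroom than the number suggests.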
I have a 16GB MacBook Pro (M2 Pro chip), and my Linux PC (I don't use Windows) runs a lot smoother than my MacBook for programming tasks such as running IntelliJ, Android Studio, and a bunch of Docker VMs.
Gamers like me have known this for ages. The reason I'm in IT at all is that I saved money by building my own desktop from parts. Even now it's waaaaaay better in price-to-performance, and you can freely use any peripherals you want, any x86 OS, etc...
We're very close to $2200 if we go with solid, midrange stuff and don't include a monitor.
The cost to Apple self-repair customers for a 16" mini-LED display is $670, with Apple offering around $100 back if they turn in the old screen.
If we add in $600 for the monitor cost, we're up to $2800 while the 16" MBP we compared to is actually $4000. The savings of a desktop are there, but they aren't what you make them out to be.
If we look at a comparable laptop with good build quality, like a Lenovo ThinkPad, we're going to pay as much or more. For the same price, we'll get terrible battery life, and performance when not plugged in will drop off a cliff.
You need to compare likes as much as is reasonable. I literally priced everything out on Newegg.
1. You certainly can, but that's the Corsair with the crappy components rather than one with good guts. If I were being completely fair, I'd be spending well over $300 to buy a GaN PSU.
2. Maybe you're different, but I'm not spending almost $500 on a high-TDP CPU only to cripple it with a crappy cooler.
3. I went with a midrange case. In truth, you can't even buy a case on par with the MacBook Pro's. Even trying to get close would put you in the $400+ boutique range.
4. I never mentioned what work was going to be done. If I were going to compare for LLM work and be competitive with Apple, I'd have to go WAY more expensive on the GPU.
B650E and X670E are for overclocking, and you'd pay WAY more to get them over their non-E counterparts. X670 offers quite a bit more IO. Maybe that's more IO than Apple offers, but the cost difference for a decent B650 isn't much (maybe 3-4% of the build cost), certainly less than upgrading the other things you mentioned.
I see nothing wrong with the CX750M PSU [0]. What do you consider good guts? Compared to the MacBook charger, which gets fairly hot under load, I see it as more capable.
2. Cooler cost has very little to do with performance. Thermalright units compete with liquid coolers for a fraction of the cost[1].
3. What does a macbook case offer? Holes for ports and the keyboard, hinge for the LCD. The components themselves do the work. What does a ~$50 case [2] not do? Plenty of cooling, mounts all your components.
4. For llama.cpp text generation, an M3 Pro does ~31 tokens per second with the Q4_0 profile [3]. A 3070 does ~34 tokens per second [4] (a sketch for measuring this yourself follows below).
WRT the motherboard, I'm not sure what I suggested upgrading? I recommended cheaper parts than the original post did.
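If you want to reproduce those token-per-second numbers yourself, here's a minimal sketch using the llama-cpp-python bindings; the model path is a placeholder, and throughput varies with hardware, quantization, and build flags:

    import time
    from llama_cpp import Llama

    llm = Llama(model_path="./llama-2-7b.Q4_0.gguf",  # placeholder path
                n_gpu_layers=-1)                      # offload all layers

    start = time.perf_counter()
    out = llm("Explain memory bandwidth in one paragraph.", max_tokens=256)
    elapsed = time.perf_counter() - start

    n = out["usage"]["completion_tokens"]
    print(f"{n} tokens in {elapsed:.1f}s -> {n / elapsed:.1f} tok/s")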
But this build is not only equivalent to an M3 MacBook - it’s probably 2+x as powerful when it comes to CPU and GPU. There are cheaper ways to build an equivalent machine.
Laptop performance is often thermally limited due to less airflow. Desktop PCs have big fans and coolers and can handle sustained load better. Bursty loads are less of a disparity, though.
But of course, the tradeoff is that it's not portable like a laptop.
Very common on PC laptops, doubly so with discrete GPUs. Mac laptops, however, seem pretty good about not thermally limiting, and are especially good at delivering the same performance on battery as when plugged in.
PC Laptops with discrete GPUs often show huge differences when on battery vs wall power.
Macs are notorious for thermal throttling, partly due to inadequate cooling and partly due to a policy of only turning on the fan once it's already throttling. This hasn't changed with Apple Silicon.
Every review I've seen for the apple silicon macs show they do quite well, especially compared to similar PC laptops. So not zero impact, but MUCH better than the competition. I'm not basing this on impressions, but the published performance metrics show performance over time as well as performance on wall power vs performance on battery.
The first part of your comment is correct. This part is not true for any benchmark I’ve seen - it’s possible to detect thermal throttling on the Air models but it requires uncommonly heavy workloads and takes much longer to kick in than it did with the Intel processors. In practice, a software developer who isn’t running LLMs locally will likely never hear the fan on their MacBook Pro or experience throttling on an Air.
I chose the 12-core because it's very close to the M3 Max in overall performance.
There are cheaper ways to build most things. A high-end Thinkpad is $3000-4000 and you can buy a terrible gaming laptop with the same components for $1000-1500. The Thinkpad is going to be more reliable and last longer than the gaming laptop because the build quality is much higher.
Apple uses good components that cost more, so comparing to bottom-of-the-barrel components isn't an apples-to-apples comparison.
I agree and that's why I added the comment about a Thinkpad at the end. My point was that the difference is closer to 25% rather than the 300% that was claimed.
I own an M3 Max laptop, so suffice it to say that I think the 25% markup is quite reasonable for being able to stuff a very nice desktop into my bag.