
By M series and AMD Strix Halo. You don't actually need a GPU: if the manufacturer knows the use case will be running transformer models, a more specialized NPU coupled with the higher memory bandwidth of on-package RAM is enough.

This will not result in locally running SOTA-sized models, but it could result in a percentage of people running 100B-200B models, which are large enough to do some useful things.
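To put rough numbers on that (an illustrative Python sketch; the bandwidth figures and the 4-bit quantization are assumptions, not benchmarks), decode throughput on a bandwidth-bound machine is roughly memory bandwidth divided by the bytes of weights streamed per generated token:

    # Back-of-the-envelope: decode speed ~= memory bandwidth / bytes of weights
    # read per generated token. All figures below are assumptions, not benchmarks.

    def tokens_per_sec(params_b: float, bytes_per_param: float, bandwidth_gb_s: float) -> float:
        bytes_per_token = params_b * 1e9 * bytes_per_param  # full weight read per token (dense model)
        return bandwidth_gb_s * 1e9 / bytes_per_token

    for params_b in (100, 200):            # dense 100B and 200B parameter models
        for bw in (256, 546):              # assumed GB/s: roughly Strix Halo-class vs M Max-class
            print(f"{params_b}B @ 4-bit, {bw} GB/s: ~{tokens_per_sec(params_b, 0.5, bw):.1f} tok/s")

Under those assumptions you land in the low-to-mid single digits of tokens per second for a dense 100B-200B model at 4-bit: slow, but usable for some tasks.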





Those also contain powerful GPUs. Maybe I oversimplified, but I did consider them.

More importantly, it costs a lot of money to get that high bus width before you even add the memory. There is no way things like the M Pro and Strix Halo take over the mainstream in the next few years.
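For context on why the bus width is the expensive part (a hedged sketch; LPDDR5X-8000 is an assumed typical speed, and the class labels are approximate), peak bandwidth scales linearly with bus width:

    # Peak DRAM bandwidth (GB/s) ~= bus width in bits / 8 * transfer rate in GT/s.
    # LPDDR5X-8000 (8 GT/s) is an assumed typical speed; exact parts vary.

    def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_gt_s: float) -> float:
        return bus_width_bits / 8 * transfer_rate_gt_s

    for label, width in (("mainstream 128-bit", 128),
                         ("Strix Halo-class 256-bit", 256),
                         ("M Max-class 512-bit", 512)):
        print(f"{label}: ~{peak_bandwidth_gb_s(width, 8.0):.0f} GB/s")

Going from a mainstream 128-bit bus to a 512-bit one quadruples bandwidth, but it also means roughly four times the memory controllers and package pins, which is a big part of the cost.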



