
Can you explain what that means for someone who missed part of the video today?


The Apple Intelligence cloud system uses Apple's own M-series chips, not Nvidia.


Because they will be running inference with much smaller models than GPT-4.


At least they are honest about it in the specs that they have published - there's a graph there that clearly shows their server-side model underperforming GPT-4. A refreshing change from the usual "we trained a 7B model and it's almost as good as GPT-4 in tests" hype train.

(see "Apple Foundation Model Human Evaluation" here: https://machinelearning.apple.com/research/introducing-apple...)


Yeah, their models are more targeted. You can't ask Apple Intelligence/Siri about random celebrities or cocktail recipes.

But you CAN ask it to show you all pictures you took of your kids during your vacation to Cabo in 2023 and it'll find them for you.

The model "underperforms", but not in the ways that matter. This is why they partnered with OpenAI, to get the generic stuff included when people need it.



