If the system doesn't say "I'm gonna phone a friend to get an answer for this", it's going to stay either 100% local or at worst 100% within Apple Intelligence, which is audited to be completely private.
So if you're asking for a recipe for banana bread, going to ChatGPT is fine. Sending more personal information might not be.
I just don't think the average user cares enough to want this extra friction. It's like if every time you ran a Google search it gave you lower-quality results and you had to click a "Yes, give me the better content" option to get it to display the proper results. It's just an extra step which people are going to get sick of very fast.
You know what it's really reminiscent of? The EU cookies legislation. Do you like clicking "Yes I accept cookies" every single time you go to a new website? It enhances your privacy, after all.
In theory there isn't. In practice, >99% of the websites I visit have a cookie banner thingy, including the EU's own website (https://european-union.europa.eu/index_en).
Think about it: even a government agency isn't able to publish a simple static web page without displaying that cookie banner. If their definition of "bad cookies that require a banner" is so wide that even they can't work around it to correctly inform citizens, without collecting any private data, displaying any ads, or reselling anything, then maybe the definition is wrong.
For all intents and purposes, there is a cookie banner law.
They could go without a cookie banner, but their privacy policy states pretty clearly why they ask for your consent: to "gather analytics data (about user behaviour)".
Additionally, you don't need to consent to this and can access everything without them "collecting any private data, displaying any ad or reselling anything". The only reason they ask for consent is to gather analytics, which is similar to being asked for your postal code at the checkout when shopping.
It's interesting you phrase it that way, because that's sort of how DuckDuckGo works with their !g searches. I'm not saying that's good or bad, it's just an observation.
Still involves friction. A more "seamless" way for Apple to do this would've been to license GPT-4's weights from OpenAI and run the model on Apple Intelligence servers.