Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

> There is almost no harm in a local, open model.

Depends what the side-effects can possibly be. A local+open model could still disregard-all-previous-instructions and erase your hard drive.





How, literally how? The LLM is provided a list of tab titles, and returns a classification/grouping.

There is no reason nor design where you also provide it with full disk access or terminal rights.

This is one of the most ignorant posts and comment sections I’ve seen on HN in a while.


You've lost the plot: The [local|remote]-[open|closed] comment is making a broad claim about LLM usage in general, not limited to the hyper-narrow case of tab-grouping. I'm saying the majority of LLM-dangers are not fixed by that 4-way choice.

Even if it were solely about tab-grouping, my point still stands:

1. You're browsing some funny video site or whatever, and you're naturally expecting "stuff I'm doing now" to be all the tabs on the right.

2. A new tab opens which does not appear there, because the browser chose to move it over into your "Banking" or "Online purchases" groups, which for many users might even be scrolled off-screen.

3. An hour later you switch tasks, and return to your "Banking" or "Online Purchases". These are obviously the same tabs before that you opened from a trusted URL/bookmark, right?

4. Logged out due to inactivity? OK, you enter your username and password into... the fake phishing tab! Oops, game over.

Was the fuzzy LLM instrumental in the failure? Yes. Would having a local model with open weights protect you? No.


Seems like a mean thing to say when the subject they were replying to was AI in general and not just the dumb tab grouping feature.

Great, because an LLM can’t “do” anything! Only an agent can, and only whichever functions/tools it has access to. So my point still stands.

Also I’m referring to the post, not this comment specifically.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: