
It doesn't have any logic; it's just prediction based on statistics. There are plenty of examples already floating around showing it has no logic, but I'll give you a really simple one from my own experiments:

I told it to:

> curl somedomain.ext

It replied with a curl error saying that the hostname doesn't exist.

Then I told it to:

> curl https://somedomain.ext

And it replied with a random HTTP response showing that the hostname exists.

There is no logic here.
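
For comparison, a real curl would fail the same way in both cases, since the DNS lookup fails regardless of whether a scheme is given (somedomain.ext standing in for the actual nonexistent hostname):

  $ curl somedomain.ext
  curl: (6) Could not resolve host: somedomain.ext

  $ curl https://somedomain.ext
  curl: (6) Could not resolve host: somedomain.ext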



> And it replied with a random HTTP response showing that the hostname exists.

And that's not logical? ChatGPT doesn't know what is actually there, so it answers logically based on what should happen there. Obviously giving two different answers makes it less logical, sure, but I've seen plenty of people make logic errors in real life too.

It's crazy to me that for an AGI to count as one, it needs to be infallible in logic...


If it were an AGI, it would have told you it doesn't have internet access.



