
It had a context window of 8192 tokens (2x as long as GPT-3).

It is possible they are using GPT itself to summarize the past transcript and pass that in as context. So your “remember X” would probably get included in that summary.
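Here is a minimal sketch of that summarize-and-reinject idea. The `complete()` wrapper and the `chat_turn()` helper are hypothetical names for illustration, not anything confirmed about how ChatGPT actually works:

```python
def complete(prompt: str) -> str:
    # Hypothetical wrapper around whatever completion endpoint is in use.
    raise NotImplementedError("call your completion API here")

def chat_turn(summary: str, user_message: str) -> tuple[str, str]:
    """Answer one message, then fold the exchange into a rolling summary."""
    prompt = (
        f"Summary of the conversation so far:\n{summary}\n\n"
        f"User: {user_message}\nAssistant:"
    )
    reply = complete(prompt)

    # Ask the model to compress the exchange so explicit facts
    # ("remember X") survive after the raw transcript is dropped.
    new_summary = complete(
        "Summarize this conversation, keeping any facts the user asked "
        f"you to remember:\n{summary}\n"
        f"User: {user_message}\nAssistant: {reply}\nSummary:"
    )
    return reply, new_summary
```

The point is only that the raw transcript never has to fit in the window; a running summary carries the state forward instead.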

That said, I have not tried telling it to remember X and then exceeding the 8192-token context window.

Source: https://twitter.com/goodside/status/1598882343586238464



Btw the GPT-3 Codex model `code-davinci-002` has an 8k token limit, too.



