It had a context window of 8192 tokens (2x as long as GPT-3's).
It is possible they are using GPT itself to summarize the past transcript and pass that in as context. So your “remember X” would probably get included in that summary.
That said, I have not tried telling it to remember X and then exceeding the 8192-token context window.
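A minimal sketch of what that rolling-summary approach could look like. This is speculation about their setup, not their actual implementation: `summarize()` is a hypothetical stand-in for an LLM summarization call (e.g. to GPT itself), here replaced by a trivial first-sentence extractor so the example runs offline, and token counts are approximated by word counts.

```python
def summarize(messages):
    """Hypothetical stand-in for an LLM summarization request.

    A real system would ask the model to compress these turns;
    here we just keep the first sentence of each message.
    """
    return " ".join(m.split(".")[0] + "." for m in messages)


def build_context(transcript, new_message, budget=8192):
    """Fit a conversation into a fixed token budget by keeping the
    most recent turns verbatim and summarizing everything older.

    Tokens are approximated as whitespace-separated words for
    illustration only.
    """
    def tokens(text):
        return len(text.split())

    recent = []
    used = tokens(new_message)
    # Walk backwards, keeping recent turns verbatim until roughly
    # half the budget is consumed (the rest is left for the reply).
    for msg in reversed(transcript):
        if used + tokens(msg) > budget // 2:
            break
        recent.insert(0, msg)
        used += tokens(msg)

    # Everything older than the verbatim window gets summarized,
    # so a "remember X" from long ago can survive in the summary.
    older = transcript[: len(transcript) - len(recent)]
    summary = summarize(older) if older else ""
    parts = [f"Summary of earlier conversation: {summary}"] if summary else []
    return "\n".join(parts + recent + [new_message])
```

Under this scheme, whether "remember X" survives depends entirely on the summarizer choosing to keep it, which is why exceeding the window is the interesting test case.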
Source: https://twitter.com/goodside/status/1598882343586238464