Using an embedding space, solve the context length problem?
#16 · opened by win10
If an embedding space for storing long-term dialogue memory (past user inputs and model outputs) were added to the model architecture, could it solve the context-length problem?
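One common way to realize this idea (outside the model architecture itself) is retrieval-based memory: embed each past turn, then, for a new query, retrieve only the most similar turns and feed that short slice back into the context window. The sketch below is a hypothetical illustration, not an answer from this thread; the `embed` function is a toy bag-of-words stand-in for a real sentence encoder, and all names are invented for the example.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words embedding (hypothetical stand-in for a real encoder).
    counts = Counter(text.lower().split())
    norm = math.sqrt(sum(v * v for v in counts.values()))
    return {w: v / norm for w, v in counts.items()}

def cosine(a, b):
    # Cosine similarity between two sparse unit vectors.
    return sum(v * b.get(w, 0.0) for w, v in a.items())

class DialogueMemory:
    """Stores past turns as embeddings; retrieves the most relevant ones
    so only a short, relevant slice re-enters the context window."""
    def __init__(self):
        self.turns = []  # list of (text, embedding) pairs

    def add(self, text):
        self.turns.append((text, embed(text)))

    def retrieve(self, query, k=2):
        # Rank stored turns by similarity to the query embedding.
        q = embed(query)
        ranked = sorted(self.turns, key=lambda t: cosine(q, t[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

mem = DialogueMemory()
mem.add("User: my favorite language is Python")
mem.add("Assistant: noted, Python it is")
mem.add("User: the weather is nice today")
hits = mem.retrieve("Which programming language do I like?", k=1)
print(hits)  # → ['User: my favorite language is Python']
```

Note this sidesteps rather than solves the fixed context length: the model still sees only a bounded window, but the memory decides which past turns fill it.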
win10 changed the discussion title from "embedding space, can the problem of context length be solved?" to "Using an embedding space, solve the context length problem?"
sam-mosaic changed the discussion status to closed.