Clear both chat history and conversation memory when `/clear` is applied #842
Conversation
(1) Updated `clear.py` to clear chat history and display the help command again. (2) Updated `base.py` to remove conversation memory when chat history is cleared.
@srdas Great work on this, thank you! I agree that this solution works in the default case, since `DefaultChatHandler.create_llm_chain()` always creates a new instance of `ConversationBufferWindowMemory` on every call.

However, I've noticed that the LangChain chat history is actually split between the `default` and `/ask` chat handlers. Here is an excerpt from `/ask` that shows it using its own separate memory, which will not be cleared in general:
```python
class AskChatHandler(BaseChatHandler):
    ...

    def create_llm_chain(
        self, provider: Type[BaseProvider], provider_params: Dict[str, str]
    ):
        unified_parameters = {
            **provider_params,
            **(self.get_model_parameters(provider, provider_params)),
        }
        self.llm = provider(**unified_parameters)
        memory = ConversationBufferWindowMemory(
            memory_key="chat_history", return_messages=True, k=2
        )
```
Right now, `default` and `/ask` track two separate chat histories (one with all messages & replies to/from `default`, the other to/from `/ask`).

Ideally there should only be one instance of `ConversationBufferWindowMemory` shared between all chat handlers. I think the best way to do this is to define the memory as a class attribute on `BaseChatHandler`. This effectively shares global state between all chat handlers.
Under this proposal, `BaseChatHandler` should define three new methods (a quick sketch follows the list):

- `get_memory()`: returns `BaseChatHandler.memory` (not `self.memory`)
- `reset_memory()`: sets `BaseChatHandler.memory = ConversationBufferWindowMemory(return_messages=self.llm.is_chat_provider, k=2)`
- `clear_memory()`: calls `BaseChatHandler.memory.clear()`
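To make that concrete, here is a minimal sketch of the shared memory; the class body is trimmed and the initial value of the class attribute is illustrative, not the actual `base.py`:

```python
from langchain.memory import ConversationBufferWindowMemory


class BaseChatHandler:
    # One memory object shared by every chat handler: a class attribute, so
    # all subclasses see the same instance. Initial value is illustrative.
    memory = ConversationBufferWindowMemory(return_messages=True, k=2)

    def get_memory(self):
        # Always hand back the class-level memory, never an instance copy.
        return BaseChatHandler.memory

    def reset_memory(self):
        # Replace the shared memory, e.g. right before re-creating the LLM
        # chain when the provider changes.
        BaseChatHandler.memory = ConversationBufferWindowMemory(
            return_messages=self.llm.is_chat_provider, k=2
        )

    def clear_memory(self):
        # Drop all stored messages; the memory object itself is kept.
        BaseChatHandler.memory.clear()
```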
Furthermore:

- `self.get_memory()` should be called by `create_llm_chain()` in `default` and `/ask` (see the sketch after this list)
- `self.reset_memory()` should be called by `BaseChatHandler.get_llm_chain()` right before calling `self.create_llm_chain()` when switching LLMs
- `self.clear_memory()` should be called by `/clear`
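For example, the `/ask` excerpt above would change roughly like this under the proposal; only the memory line differs, and the imports and the rest of the chain setup are omitted just as in the excerpt:

```python
class AskChatHandler(BaseChatHandler):
    ...

    def create_llm_chain(
        self, provider: Type[BaseProvider], provider_params: Dict[str, str]
    ):
        unified_parameters = {
            **provider_params,
            **(self.get_model_parameters(provider, provider_params)),
        }
        self.llm = provider(**unified_parameters)
        # Reuse the shared memory instead of building a handler-local one.
        memory = self.get_memory()
```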
Also note:

- The `PROMPT_TEMPLATE` in `ask.py` should change `{chat_history}` to `{history}` under this proposal (illustrated below). This is because the memory object in `/ask` defines `memory_key='chat_history'` instead of leaving it unset and defaulting to `memory_key='history'`.
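Just to illustrate the rename (the prompt text here is a hypothetical stand-in, not the actual template in `ask.py`):

```python
# Hypothetical stand-in for ask.py's PROMPT_TEMPLATE; the only point is that
# the input variable is now {history}, matching the default memory_key.
PROMPT_TEMPLATE = """Given the following conversation and a follow up question, answer the question.

{history}

Follow up question: {question}
"""
```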
Can you explore this implementation?
I am closing this PR: the code base has changed since it was opened, so this one is now outdated and the work needs a fresh PR.
Addresses #616

- Updated `clear.py` to clear chat history and display the help command again.
- Updated `base.py` to remove conversation memory when chat history is cleared.

To verify:

1. Use chat to ask questions and verify that the logs show it is using conversation memory to the default depth of `k=2`.
2. Then also check that `/export` contains the entire question history.
3. Next, use `/clear` and see that the next question does not have any conversation memory. Look for the log message `Clear conversation memory, re-initializing the llm chain.`
4. Then check that `/export` only shows the new entries in chat.

This ensures that chat history and conversation memory are consistent.