OpenAI API with Memory
Memobase supports OpenAI API integration. This allows you to "patch" Memobase's memory capabilities onto OpenAI chat completion (or any LLM provider compatible with the OpenAI SDK) without changing your original code.
Setup
- Make sure you have installed both the Memobase Python SDK and the OpenAI Python SDK
- Initialize OpenAI and MemoBaseClient
Make sure you have your Memobase endpoint and token ready, check here
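A minimal initialization sketch for the two clients. The `project_url` and `api_key` parameter names for `MemoBaseClient` are assumptions here; check the Memobase SDK reference for the exact signature, and replace the placeholder credentials with your own:

```python
from openai import OpenAI            # OpenAI Python SDK
from memobase import MemoBaseClient  # Memobase Python SDK

# OpenAI() reads OPENAI_API_KEY from the environment by default.
client = OpenAI()

# Parameter names are assumed; use your Memobase endpoint and token.
mb_client = MemoBaseClient(
    project_url="YOUR_MEMOBASE_ENDPOINT",
    api_key="YOUR_MEMOBASE_TOKEN",
)
```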
Patch Memory
You’re all set!
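To make the patching idea concrete, here is a self-contained sketch of what a memory patch conceptually does. `FakeChatClient`, `SimpleMemory`, and `with_memory` are illustrative stand-ins written for this example; they are not Memobase or OpenAI classes:

```python
from functools import wraps

class FakeChatClient:
    """Stand-in for an OpenAI-style client with a create() method."""
    def create(self, messages, **kwargs):
        return {"role": "assistant", "content": "hello!"}

class SimpleMemory:
    """Stand-in for a Memobase-backed store, keyed by user id."""
    def __init__(self):
        self.store = {}
    def insert(self, user_id, message):
        self.store.setdefault(user_id, []).append(message)
    def recall(self, user_id):
        return self.store.get(user_id, [])

def with_memory(client, memory):
    """Patch create() so an optional user_id triggers memory hooks."""
    original = client.create

    @wraps(original)
    def patched(messages, user_id=None, **kwargs):
        if user_id is None:
            # No user_id: behave exactly like the original client.
            return original(messages, **kwargs)
        # Recall before the completion, remember after it.
        context = memory.recall(user_id)
        response = original(context + messages, **kwargs)
        memory.insert(user_id, messages[-1])  # latest user query
        memory.insert(user_id, response)      # assistant response
        return response

    client.create = patched
    return client
```

Usage mirrors the real flow: patch once, then call `create()` with or without `user_id`.

```python
memory = SimpleMemory()
chat = with_memory(FakeChatClient(), memory)
chat.create([{"role": "user", "content": "Hi"}], user_id="u1")
```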
How to use OpenAI with Memory?
- You can use the OpenAI API as you normally would, but simply add `user_id` to the request.
- If no `user_id` is passed, the client will act just like the original OpenAI client.
- If a `user_id` is passed, the client will use Memobase automatically.
- The memory processing for the user won't be triggered immediately; there is a buffer zone that collects the recent messages. However, you can trigger processing manually by flushing the buffer.
Make sure the memory is retained
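The buffering behavior described above can be sketched as follows. The class and method names here are illustrative assumptions for this example (in Memobase the buffer lives on the server side), but the shape matches the docs: messages accumulate in a buffer, processing happens when the buffer fills or when you flush manually:

```python
class BufferedMemory:
    """Illustrative buffer: collects messages, processes on flush."""
    def __init__(self, buffer_size=4):
        self.buffer = []
        self.processed = []
        self.buffer_size = buffer_size

    def insert(self, message):
        self.buffer.append(message)
        if len(self.buffer) >= self.buffer_size:
            # Processing is triggered automatically once enough
            # recent messages have accumulated.
            self.flush()

    def flush(self):
        """Manually trigger processing of the buffered messages."""
        self.processed.extend(self.buffer)
        self.buffer.clear()
```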
How does it work?
- The `openai_memory` function patches the OpenAI client to call the Memobase SDK before and after each chat completion.
- Only the latest user query and assistant response will be inserted into the memory.
- If your messages contain earlier turns plus a new user query, and the response is "You're Gus!", then only that latest query and response will be inserted, equivalent to inserting just the final user/assistant pair.
- So you don't have to change the way you currently use the OpenAI API: you can still pass the recent messages when you call the chat completion API, and Memobase won't repeatedly insert the same messages into the memory.
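The insertion rule above can be sketched in a few lines. The message contents here are hypothetical examples made up for illustration; `to_insert` is not a Memobase function, just a demonstration of which pair gets stored:

```python
def to_insert(messages, response_content):
    """Return only the final user query and the new assistant reply."""
    latest_user = [m for m in messages if m["role"] == "user"][-1]
    return [latest_user, {"role": "assistant", "content": response_content}]

# Hypothetical conversation: earlier turns plus a new user query.
messages = [
    {"role": "user", "content": "I'm Gus"},
    {"role": "assistant", "content": "Nice to meet you, Gus!"},
    {"role": "user", "content": "Who am I?"},
]

# Only the latest query and the new response are kept for memory.
pair = to_insert(messages, "You're Gus!")
```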
The full script is here.